Announcing the first Global Majority Research Committee (GMRC) research showcase of the year on April 21, 8:00-9:30 AM PT!

Featured panelists include: Aliya Bhatia, Hesam Nourooz Pour, Fahim Aslam, Tamer Farag, and Diyi Liu.

The GMRC, one of our three T&S Research Coalition committees, aims to foster a more equitable and effective trust and safety ecosystem by elevating the expertise and inclusion of Majority World voices in product design, policy development, and organizational practices. GMRC members Aliya, Fahim, and Hesam and special guests Tamer and Diyi will all share recent work related to trust and safety and the Majority World.

Register here: https://lnkd.in/gbapeMVA
Trust and Safety Foundation
Non-profit Organizations
San Francisco, CA 4,329 followers
Building Trust. Sharing Knowledge. Advancing Research.
About us
The Trust and Safety Foundation (TSF) is a 501(c)(3) non-partisan, charitable organization with the mission to convene stakeholders, share knowledge, and engage in activities that facilitate meaningful advancements in trust and safety.
- Website: https://trustandsafetyfoundation.org
- Industry: Non-profit Organizations
- Company size: 2-10 employees
- Headquarters: San Francisco, CA
- Type: Nonprofit
- Founded: 2020
Locations
- Primary: San Francisco, CA, US
Updates
Check it out! T&S Research Coalition member Marlyn Thomas Savio, PhD, CPsychol will share more about our Psychological Health Research Committee (PHRC) and the work they've done with our fantastic academic collaborators Deepa Manjanatha, MPH, MS and Cinnamon Bloss, PhD.
We are thrilled to announce our latest series starting in April: TSPA Group Series: Building Future-Ready Wellbeing Metrics for Moderators! This three-part, discussion-based roundtable series for cross-functional Trust & Safety stakeholders focuses on improving moderator wellbeing measurement across APAC, EMEA, and Global Majority contexts. The series explores what to measure, how to measure it well, and how to use wellbeing data responsibly, with shared learning documented in an evolving Moderator Wellbeing Measurement & Action Resource.

The first session in the series, “The Moderator Mindset: What Do We Really Need to Measure,” will be held on April 28 from 8:30-10:00 PM PT | 8:30 AM IST | 11:30 AM SGT. Presented by Dr. Aparna Samuel Balasundaram (she/her/hers), Bindiya Lakshmi Raghunath, Jayshree Sarda, and Srihari B N Swamy, with guest spotlight Marlyn Thomas Savio, PhD, CPsychol, this webinar will explore how to ground wellbeing measurement in the real experiences of T&S moderators. Without clarity on what we’re measuring, it’s hard to know what we’re actually trying to improve, or whether it’s working. This session focuses on the full picture of moderator wellbeing, from content exposure to work design, cognitive load, moral distress, and the evolving challenges of GenAI moderation. Through a structured roundtable, participants will dig into key questions around why measurement matters, who it should focus on, what constructs are most meaningful, and what should be prioritized now.

Read more about the series and this session on the registration page: https://lnkd.in/gn97DNHh. Please note that we will prioritize TSPA members; non-members will be placed on a waiting list by default.
T&S Research Coalition Q&A: Get to Know the Researchers Collaborating with TSF

We're continuing our series introducing researchers from our T&S Research Coalition committees. Each post features a conversation with someone whose work is helping tackle some of the hardest problems in trust and safety. Next up, from our Tooling Research Committee (TRC): Sally Lait 👇

Sally Lait is a fractional technology leader, consultant, and advisor with two decades of experience across start-ups, scale-ups, and global organisations. Most recently Engineering Director for Trust & Safety at Bumble Inc., she has also held senior leadership roles at Monzo and Farewill. She now provides fractional leadership, advisory support, and coaching for both general technology and T&S-focused organisations. You can check out her website: https://sallylait.com

📘 What drew you to trust and safety research, and what keeps you motivated in this field?
In my previous role, staying in touch with wider trends, best practices, and collaboration opportunities was key. After setting out to work independently, I was keen to use some of my capacity to give back to the communities I had benefited so much from, and Tooling stood out as a natural fit between my T&S experience and my broader technology expertise. What keeps me motivated: playing even a small role in helping the industry prevent both online and real-world harms more effectively is a cause that I deeply believe in, and will never stop being motivated by.

📘 In one sentence, how would you describe your current research focus to someone outside the field?
Developing a comprehensive taxonomy of T&S tooling: collecting, categorising, and ultimately helping people better find, understand, and select the right tools, while highlighting under-served areas.

📘 What's a common misconception about your area of T&S research that you'd like to correct?
T&S tooling is often pretty ‘behind the scenes’ for companies, kept secret out of a desire not to share too much, or simply because it’s sometimes not seen as being as novel or shiny as other product development. In my experience, it’s been quite the opposite: there’s a huge amount of interest in this under-sung area of technology, and a load of fantastic innovation being done.

📘 If you could get people in the trust and safety field to read one paper, what would it be and why?
Any papers written by my colleagues in the TRC ;) For practitioners, there is a huge wealth of very accessible material out there, such as the TSPA's (https://www.tspa.org/) T&S Resources, and events such as TrustCon.

📘 Where do you see the biggest opportunities for T&S research to have real-world impact in the next 3-5 years?
Helping organisations see the real benefits to business, and promoting cross-industry collaboration and sharing of efforts. We have the most impact when we work together, and research can be a powerful platform for making that happen.
T&S Research Coalition Q&A: Get to Know the Researchers Collaborating with TSF

We're continuing our series introducing the researchers behind TSF's work: members of our T&S Research Coalition whose research is helping tackle some of the hardest problems in trust and safety. Next up: Fahim Aslam 👇

Fahim is a strategic researcher and development professional with over seven years of experience across international development, policy advisory, and cross-sector partnerships in South Asia. He is currently leading research and partnerships at the Marga Institute while consulting for Fulbright Sri Lanka, and continues to expand regional networks across South and Southeast Asia.

📘 What drew you to trust and safety research, and what keeps you motivated?
Working across peacebuilding and social cohesion programs in South Asia, I saw firsthand how online spaces amplified division, spread harmful narratives, and deepened real-world conflict. The communities I worked with weren't just affected by offline tensions: they were being shaped, and often harmed, by what was happening digitally. What keeps me motivated is the realization that online safety isn't a technical problem with a technical solution. It's a deeply human, political, and policy challenge. Data tells us where the harm is. Policy determines whether anything gets done about it.

📘 In one sentence: what's your research focus?
I research how communities in Sri Lanka and South Asia define and experience online harm, because their reality rarely matches the policies written about them.

📘 What's a misconception you'd like to correct?
That communities in South Asia are simply victims waiting to be protected by policies designed elsewhere. People in Sri Lanka are actively navigating online harm every day, in Sinhala and in Tamil, in ways English-language moderation systems were never built to understand. You cannot separate what happens online from what happens on the ground. In contexts shaped by ethnic conflict, they're the same story told on different screens.

📘 One paper everyone in T&S should read?
"Understanding International Perceptions of the Severity of Harmful Content Online" exposes a fundamental flaw: perceptions of harm vary dramatically across cultures, yet platforms keep applying frameworks built around Western norms. Until the field accepts that harm is culturally situated, not universally defined, we'll keep building policies that protect some users while leaving others invisible.

📘 Where's the biggest opportunity for T&S research in the next 3–5 years?
Three areas: localized policy frameworks as South Asian countries draft their own online safety legislation; AI-generated harm in non-English contexts, where Sinhala and Tamil speakers are largely invisible to current moderation tools; and community-led safety, shifting from treating people as victims to treating them as actors.
T&S Research Coalition Q&A: Get to Know the Researchers Collaborating with TSF

We're kicking off a new series introducing the researchers behind TSF's initiatives. Each post will feature a conversation with a member of our T&S Research Coalition. We hope these spotlights spark connection, curiosity, and collaboration across our field! 🙌 First up, from our Tooling Research Committee: Cyndie D. 👇

Cyndie is a Doctoral Researcher at the University of Edinburgh's School of Informatics, part of the CDT in Responsible NLP, and a Global Data Fellow at Childlight. With a background in data science and NLP, her work spans technology and regulatory organisations across France, Sweden, the USA, and the UK, and centers on using AI to address illegal activity online and develop child safety solutions.

📘 What drew you to trust and safety research, and what keeps you motivated?
Trust and safety felt like a place where research isn’t just descriptive, but consequential: insights can directly inform systems that protect people at scale. The work is never static. [...] I’m especially motivated by research that centers affected users and translates their experiences into practical, ethical decisions. Knowing that careful, rigorous work can reduce harm, even incrementally, keeps the field both challenging and deeply meaningful to me.

📘 In one sentence: what's your research focus?
My research focuses on the potential opportunities and emerging risks of generative AI models for illegal harms, especially those toward children, their applications for law enforcement practitioners, and the development of preventive technological frameworks for industry.

📘 What's a misconception you'd like to correct?
A common misconception is that trust and safety and ethics are mostly theoretical and far from real-world impact. In reality, this work directly informs and shapes how we prevent harm and protect people.

📘 One paper everyone in T&S should read?
“Generative ML and CSAM: Implications and Mitigations” by David Thiel, Melissa Stroebel, and Rebecca Portnoff. It explains how advances in generative machine learning have impacted our field and why that matters for child safety and online harm.

📘 Where's the biggest opportunity for T&S research in the next 3–5 years?
I see major opportunities in developing robust prevention and mitigation methods, especially by building close, sustained collaborations with domain experts and frontline practitioners such as trust & safety operations teams, investigators, NGOs, and regulators. Partnering this way helps ensure research is grounded, evaluated against practical constraints, and translated into interventions that actually work in deployment.

You can find more about Cyndie and her work on her website: https://lnkd.in/gduHDPvJ
TSF will be co-hosting the roundtable “Towards Collaboration: Trust & Safety and Human Rights in the African Context” at RightsCon with the Trust and Safety Africa Academy! 📣 Trust and Safety and Human Rights share a common goal, but the two fields aren't collaborating effectively enough. The roundtable's main aim is to address the gap between these fields in Africa, recognizing the region's unique context. We'll be joined by the Global Network Initiative, the Digital Trust and Safety Partnership, BSR, and the Digital Agenda for Tanzania Initiative for a mix of presentations and small group discussions. Date and time to be confirmed in late March. More details coming soon!
Trust and Safety Foundation reposted this
We are excited to announce that volunteer applications are now available for the 2026 EMEA Member Summit! Volunteers are an integral part of the success of our events, and we hope that you will consider joining us in Dublin, Ireland on 18 May! Because this is a members-only summit, volunteers must be members.

Benefits to volunteering include:
☑️ A great way to meet the EMEA T&S community
☑️ Directly engaging with and supporting TSPA
☑️ Bonus: comped registration to the event and the T&S After Hours Reception 🎟️

As a reminder, the members-only Summit is designed to create a more connected, collaborative, and tailored experience for our TSPA community. Instead of broad panels and presentations, the Summit will center on closed-door discussions, hands-on problem-solving, and knowledge sharing among members. Registration is opening soon!

Interested in volunteering? Fill out the form here: https://lnkd.in/dgFjMrjK
Want to stay up to date on our work? In the coming months, TSF will be launching a new newsletter featuring insights, resources, research highlights, collaborative projects, upcoming events, and opportunities to get involved across the Trust and Safety ecosystem. If you’d like to receive it once it launches, you can sign up now by submitting your email on our website: 👉 https://lnkd.in/gHVbPwJD More updates coming soon—stay tuned!
New matchmaking opportunity! 📣 An academic researcher at the University of Glasgow who works with consumer-facing organisations (e.g. sports clubs, fashion brands, retailers, public services) and is an expert on measuring the commercial and social impact of addressing harmful comments on social media (e.g. misinformation, discrimination, micro-aggressions, sustainability backlash) is looking for organisations interested in running short pilot projects on how to leverage their values in social media communications to tackle harmful content, depolarise controversial interactions, and cause positive social change. Interested in connecting? Email matchmaking@trustandsafetyfoundation.org. 📧
Trust and Safety Foundation reposted this
📢 Trust and Safety Research Conference (TSRC) is back & we're now accepting presentation applications. Mark your calendars — TSRC returns on October 1–2, 2026, bringing together 500+ professionals from academia, industry, civil society, and government to tackle the most pressing questions in trust and safety research. This is your chance to share your work with a community dedicated to making the internet safer for everyone. 👉 Apply here: https://lnkd.in/gFXuWmWw