We want to take a moment to acknowledge the fraught times we are living in right now. Some people are awakening to the brutal reality of systemic racism and police violence that others have long experienced. People are on the streets, protesting and demanding social justice. All of this is happening at the same time that people around the world are struggling with and dying from COVID-19, with many more facing extreme economic or mental health hardships.
While we are excited for our announcement below, which has been two years in the making, we want to be clear that these new organizations exist only against the backdrop of many others that are working on critically important issues in our world. It is our hope that we are not a distraction or retreat from those efforts, but can add to them instead.
By Adelin Cai, Alexander Macgillivray, Clara Tsao, Denelle Dixon, and Eric Goldman
Board of Directors, Trust & Safety Professional Association (TSPA) and Trust & Safety Foundation Project (TSF)
Internet communities and online services have been around for decades. As they’ve grown, they’ve needed to figure out what online behavior is fair game, and what’s not. Over time, they’ve formed teams to handle this responsibility. Oftentimes, these teams fall under the rubric of “trust and safety.” While content policy and moderation are a big part of trust and safety, and the areas that get the most public attention these days, trust and safety also includes the people who tackle financial risk and fraud, those who process law enforcement requests, engineers who work on automating product policies, and more. Today, this growing class of professionals is a diverse group. They work for large online services, small startups, and as volunteers; they sit anywhere from Silicon Valley and Dublin to Austin and Jakarta.
Their jobs are critically important — and difficult. They roll up their sleeves and wrestle with thorny issues of online trust and safety, day in and day out. The challenges they face are enormous: balancing nuanced considerations on complex policy issues; navigating problems like election interference, extremism, and harassment; and tackling spam, account takeovers, and fraud. These issues are front and center during these fraught times, as trust and safety professionals work to combat myriad forms of online abuse related to systemic racism, police violence, and COVID-19 — such as hate speech, misinformation, price gouging, and phishing — while preserving a safe space for connecting people with vital, authoritative information, and with each other. Online trust and safety is critical to healthy societal interactions, and it’s imperative to support the professionals who are doing this important work.
Today, we’re pleased to announce the Trust & Safety Professional Association (TSPA) and the Trust & Safety Foundation Project (TSF).* TSPA is a new, nonprofit, membership-based organization that will support the global community of professionals who develop and enforce principles and policies that define acceptable behavior online. TSF will focus on improving society’s understanding of trust and safety, including the operational practices used in content moderation, through educational programs and multidisciplinary research. Neither TSPA nor TSF is a lobbying organization, and neither will advocate for public policy positions on behalf of corporate supporters or anyone else. Instead, we will support the community of people doing the work, and society’s understanding of it.
While the members of TSPA will be the people doing trust and safety work, we are grateful to have a broad set of founding supporters backing our launch. TSPA’s founding supporters are: Airbnb, Automattic (including WordPress.com and Tumblr), Cloudflare, Facebook Inc. (including Instagram and WhatsApp), Google (including YouTube), Match Group (including Tinder, Hinge, Match, and OkCupid), Omidyar Network, Pinterest, Postmates, Slack, Twitter, and the Wikimedia Foundation. Once we open up membership, we expect to invite the trust and safety employees of these companies, their vendor partners, and many more, to join.
We first started discussing the need for TSPA in the months following the first Content Moderation & Removal (COMO) at Scale conference, held in 2018 at the Santa Clara University School of Law. That event, and the three subsequent COMO at Scale conferences held over the course of 2018 and 2019, brought together trust and safety professionals from multiple companies for public conversations about the work they do. There was a palpable feeling that this community needed more connections.
Through TSPA, we hope to build and strengthen those connections. TSPA will be a forum for professionals to connect with a network of peers, find resources for career development, and exchange best practices for navigating challenges unique to the profession. For example, we plan to develop workshops on employee mental health and resilience, resources for policy formulation, and guides to the strengths and weaknesses of technical tools. We also will facilitate professional development, such as career advancement bootcamps, a resource library, and a job board. TSPA is member-driven, so we look forward to delivering what trust and safety professionals find most helpful.
In parallel, we are setting up TSF with a $2 million founding contribution from Cognizant, as well as contributions from individual donors. While TSPA’s focus is on the professional development of those who work in online trust and safety, TSF will focus on research relating to the work done by trust and safety professionals, and improving society’s understanding of the field of online trust and safety.
Two of TSF’s initial projects are: publishing a series of case studies from the Copia Institute that illustrate the difficult choices trust and safety professionals face, and hosting a new podcast called Flagged for Review that will interview past and present trust and safety professionals, as well as others with relevant expertise.
Please sign up here so we can share more with you about TSPA and TSF in the coming months as we open our membership and develop our offerings. Follow us on Twitter, too. We would also love to hear from organizations and people who want to help out, or whose work is complementary to our own. We’re excited to work with you to further develop and support the community of online trust and safety professionals.