How to Actually Reinvent Content Moderation

A Generated Image of a Kenyan Content Moderator

Anyone who was around in the early days of social media remembers the horrible content that circulated. Our feeds are better now, in no small part because of the millions of dollars that large corporate social networks like Facebook and Twitter have spent on content moderation.

But who are the people, the actual individuals, moderating the content and protecting billions of people from the trauma of graphic atrocities and abuse? And are they being fairly compensated and taken care of if they develop PTSD, depression and a litany of other mental health conditions related to content moderation?

Expendable Workers, Valuable Work

The sad answer to these questions is that much of the work has been outsourced to moderation firms like Sama and Majorel in Kenya or Atrium here in the USA. These third-party vendors hire the people who moderate content for social networks like Meta’s Facebook and TikTok. The moderators endure horrible working conditions with none of the generous benefits and compensation that workers at the Big Tech companies enjoy, despite being critical to the experience of those platforms’ users.

And the abusive nature of this relationship has been recognized by the courts. In California, a 2021 settlement forced Meta to pay $85M, largely to fund content moderators’ ongoing mental health treatment.

As the fediverse and Mastodon grow, who our content moderators are and how they are treated should matter more, not less. We can’t dwell only on policies and tools. Yet we see big players like Mozilla and Mastodon moving quickly to replicate the same broken models, with the former already “engaging with moderation firms” as it prepares to launch its own mega Mastodon instance, Mozilla.social.

“Won’t Somebody Please Think of the Content Moderators?”

Mozilla’s apparent decision to consider outsourcing content moderation for its Mastodon instance is particularly frustrating. Mozilla is a nonprofit with $75 million in assets as of 2021, headquartered in San Francisco, a city where poverty has exploded in the past few years. Mozilla.social is an opportunity for Mozilla to hire content moderators in-house and extend them its famously world-class benefits package, including the health care and support these moderators will almost certainly need as they care for the instance’s users. Doing so would also develop the workforce of Mozilla’s home city.

My experience moderating social media content over the past 25 years has taught me that content moderation must be approached not only from a policy and technical perspective, but also from a moral and ethical one. The people doing content moderation are performing “care work” for society. Ignoring the moderators now will make them an afterthought later and simply replicate the abusive corporate models we already have.

I urge the Mozilla Foundation, Mastodon gGmbH, and any other well-funded, incorporated entity to center the people who will be moderating content on their behalf as they expand their fediverse footprint.