Hearing What the Moderators Actually Do

[Image: Confused computer user]

There has been much discussion recently about what social media companies and online platforms are doing to moderate content. For example, Facebook has moved to moderate anti-vaxxer content on its platform. Apple News is, of course, curated by editors. We often hear from the heads of companies about moderation, but not from the people who actually do it. Medium's Head of Trust and Safety spoke to people who have been on the front line of this work at a variety of tech companies. The conversation sheds light on how decisions about content get made.

This is where the trust and safety team comes in. Most companies operating an online platform have one. It sometimes goes by other names — “content policy” or “moderation” — and comes in other flavors, like “community operations.” Whatever the name, this is the team that encourages social norms. They make platform rules and enforce them. They are at once the judges and janitors of the internet. This is not the job of a few dozen techie randos, but of tens of thousands of workers, both full-time employees and contractors.

Check It Out: Hearing What the Moderators Actually Do
