Facebook has public guidelines, but the detailed advice on which content moderators base their decisions is a closely guarded secret. The Guardian, however, has got hold of a copy of the 300-page document. It goes into minute detail, even specifying which emojis count as “praise” and which as “condemnation.”
A particular area of contention surrounds what are defined as dangerous individuals and organisations. In the leaked documents, dating from December 2020, moderators for Facebook and Instagram are instructed how to define “support” for terrorist groups and other “dangerous individuals”, how to distinguish between “explaining” and “justifying” the actions of terrorists, and even in what contexts it is acceptable to call for the use of “gas chambers”.

Facebook’s community guidelines – once almost entirely hidden from the view of users – have been public since 2018, when the company first laid out in a 27-page document what it does and does not allow on its site. These newly leaked documents are different: they constitute much more detailed guidance on what the published rules mean in practice. Facebook has long argued that publishing the full documents would be counterproductive, since it would let malicious users engage in deliberately borderline behaviour while avoiding a ban.