Arguing That Platforms Can’t Moderate Content is a Cop Out


Mike Masnick writes about Elizabeth Warren’s feud with Facebook over its advertising policy, which leaves room for fake information. He also says it’s “impossible” to moderate content at scale. I disagree. Facebook and the rest of Big Tech have billions of dollars. They absolutely can moderate content. They either choose not to, or put in place petty measures that don’t do anything. Perhaps the new motto for corporations should be, “If you can’t do it ethically, don’t do it at all.” Online platforms should follow the same or similar rules that broadcasters do.

And this is the point that lots of us have been trying to make regarding Facebook and content moderation. If you’re screaming about all the wrong choices you think it makes to leave stuff up, recognize that you’re also going to be pretty pissed off when the company also decides to take stuff down that you think should be left up.

Check It Out: Arguing That Platforms Can’t Moderate Content is a Cop Out

3 thoughts on “Arguing That Platforms Can’t Moderate Content is a Cop Out”

  • @geoduck: I think you’re holding back, man. Tell us how you really feel.

    Andrew:

    We live in interesting times.

    Amongst the characteristics of these times are inconsistency, contradiction and, shockingly, hypocrisy.

    On the one hand, there is the tendency to give the benefit of the doubt and extend licence to serial offenders, like FB, to wreak havoc on societies planet-wide by failing to prevent their own platform from becoming an asset to hostile state actors bent on destabilising liberal democracies. Exercising that control would be an act of regulation within FB’s sphere of influence and direct control as a private company, and would not conflict with any laws in an open society, so long as the terms are clear and the process is published, transparent, and open to dispute and resolution.

    On the other, we hold companies like Apple to be liable, complicit and hypocritical when they acquiesce to state laws in countries whose systems we abhor, despite those laws lying beyond the sphere of influence and domain of the company, and despite such compliance being a requisite not simply of conducting business in said countries, but of protecting the nationals in those countries who work for Apple from reprisal and the legal repercussions of engaging in criminal conduct, as defined by said laws in their native countries. I have little doubt that, were Apple to defy Chinese law, for example, and refuse to take down apps which Apple’s own legal department concurs conflict with established Chinese law, and Apple’s Chinese national employees were to be arrested, imprisoned and sent to ‘re-education camps’ (sort of like summer camp, only year-round, and instead of nature walks and story-telling round the campfire, it’s indoctrination, rock-breaking and beatings round the clock, but I digress), many of the same voices calling for Apple to defy local law would decry Apple for taking actions that put their local employees in harm’s way.

    Meanwhile, FB gets the apologists’ pass because, why ever should entitled, reckless, uninformed, unprincipled but monied change agents be held accountable for the harm that they inflict and the damage that they do? Perish the thought! Yes, we need these self-appointed wrecking balls to shake up ‘the system’; and was not FB’s original motto ‘Move fast and break things’ (https://mindmatters.ai/2018/10/facebooks-old-motto-was-move-fast-and-break-things/)? All kinds of things, like rules, laws, promises, user trust…social stability? And why stop there, when economies and democracies still stand? O, the possibilities!

    Mike Masnick makes several good points, and I concur with many of the points that @JohnKheit raises. However, @geoduck is right; FB could regulate their platform. Would it be a perfect system? No; name the extant system that is perfect. If it exists, it wasn’t human-made. That the system would be imperfect, and might have to struggle to improve over time, getting things wrong along the way, should never be an excuse not to try. In FB’s universe, JFK declared, ‘This nation should commit itself to never landing a man on the moon…because it’s hard!’

    Here’s a thought, an admittedly non-Zuckerbergian thought. FB could work with legal experts and legislators to create a protocol, one that might necessitate new legislation, detailing a standard methodology for how it would police its site whilst still permitting free speech. The standard need not be that speech is free from error or outright lies, only that it is not incendiary content designed to provoke outrage, hatred, or otherwise lead to disorder and violence (eg Myanmar, India, Western European countries on the receiving end of migrants, the US and its youthful extremists of various types – all places where people have actually been killed following extremist FB posts). FB already has sufficient content to conduct an AI-powered analysis of which content has led to or been associated with violence, and what its most reliable indicators are. This part is not hard (a minimal sketch of such an analysis appears after the comments).

    Imperfection is not a barrier. It’s a challenge, a call to a higher standard and way of life. And those who rise to that call, not in sullen isolation but in a spirit of cooperation and solidarity with the like-minded, have seldom been condemned for trying, and will earn the benefit of the doubt, even when they fail.

  • This last weekend it struck me how disingenuous Mark Zuckerberg was being. In his testimony before Congress, and in public statements from him and Facebook, he acts like they are just a bunch of kids who had no idea this sort of thing might happen. He acts like they are shocked that people would use his platform for lies, political propaganda, hate speech, and violent or abusive material. He maintains that editing material never crossed their minds, and that it would be bad for democracy, bad for society, if they tried to be the gatekeeper.
    Sorry but I’m not buying this line of BS.
    They have the resources, they have the technology, they have the money. If YouTube can flag one of my videos because it used a song written by the Beatles but played on the violin by a friend of mine, then Facebook can run algorithms that catch keywords and pick out particular imagery from videos (see the keyword-flagging sketch after the comments). If Tumblr can flag images of porn (and after a shaky start, their filters are working rather well now), then Facebook surely can. If Instagram (owned by Facebook) can block images of self-harm, then Facebook surely can.
    The question is why don’t they?
    Money.
    Zuckerberg is a greedy SOB who personifies the very worst of the Internet Generation. The same holier-than-thou, I’m-smarter-than-you-so-your-rules-don’t-apply-to-me mindset that has caused so much misery. The same libertarian, laissez-faire mindset that led to the Great Recession, Enron, Napster, the Savings and Loan collapse, and on and on.
    Zuckerberg wants to pretend that he’s just running a simple internet company, not the biggest spying, data-manipulation, and data-brokering operation in the history of the world.
    Not content with selling access to anyone with money, be they hate groups, terrorist fronts, or foreign actors trying to manipulate the vote throughout the West, he wants to branch out. He wants to create his own currency to rival, and hopefully replace, national currencies.
    He is truly a megalomaniac worthy of a Bond film. He IS Mom from Mom’s Old Fashioned Robot Oil.
    So yes, platforms, and Facebook especially, can and should moderate content. If a local mall can say this store can open on its property but not that one, social media certainly can do something analogous.
    But they won’t unless they are made to put the good of the country and society over profit.

  • We can agree to disagree. First, Apple has billions and cannot manage an autocorrect that isn’t purely idiotic. Second, the more they try to vet content, the more they are an editor, and the more liable they are for things that are on their network. AT&T has billions; that doesn’t mean a) it’s practical or even doable, or b) they should be monitoring and editorializing everything on their network. That’s why AT&T is exempt from liability for what is said on its networks. That is why Facebook and Google have been exempt from liability on their networks… but the more they editorialize, the more likely they are to lose that liability defense and go bankrupt from all the lawsuits by millions of people over bad things said on a network.

    That said, the companies that have not reached monopoly status are free to editorialize all they want. They should be up front about the fact that they do so and about what their slant is; then it’s a choice for users to make whether they want to engage with a corporate big-brother entity.
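
The first comment argues that FB could analyze which of its content has been associated with violence and extract the most reliable indicators. As a purely illustrative sketch of that kind of analysis, and not anything Facebook is known to run, a basic supervised approach fits a linear classifier over text features and reads off the highest-weighted terms. The sketch assumes scikit-learn; the posts, labels, and terms are invented for the example.

```python
# Illustrative only: a toy "indicator analysis" over labelled posts.
# The posts and labels are invented; a real system would use far more
# data, richer features, and human review of every result.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

posts = [
    "they are vermin and must be driven out",     # flagged in review
    "burn their homes before they replace us",    # flagged in review
    "lovely weather for the harvest festival",    # benign
    "anyone have a good recipe for lentil soup",  # benign
]
labels = [1, 1, 0, 0]  # 1 = previously linked to offline harm, 0 = not

# Turn the text into TF-IDF features over unigrams and bigrams.
vec = TfidfVectorizer(ngram_range=(1, 2))
X = vec.fit_transform(posts)

# Fit a linear classifier; its positive weights rank candidate indicators.
clf = LogisticRegression().fit(X, labels)

terms = vec.get_feature_names_out()
weights = clf.coef_[0]
top = sorted(zip(weights, terms), reverse=True)[:5]
for w, t in top:
    print(f"{t:20s} {w:+.3f}")
```

Even a toy like this makes the comment’s point and its limit visible at once: ranking candidate indicators is the straightforward part; deciding what to do about borderline posts is the hard, human part.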
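
The second comment asserts that if YouTube can flag a violin cover and Tumblr can filter porn, Facebook can at least catch obvious keywords. A keyword pass is indeed the easy part; the minimal, standard-library-only sketch below (the term list and example posts are made up) also shows why keywords alone over-flag and still need human review.

```python
# Illustrative only: naive keyword flagging with word-boundary matching.
# The term list and posts are invented for the example.
import re

FLAGGED_TERMS = {"exterminate", "lynch", "gas them"}

def flag(post: str) -> list[str]:
    """Return the flagged terms found in a post (case-insensitive)."""
    hits = []
    for term in FLAGGED_TERMS:
        if re.search(rf"\b{re.escape(term)}\b", post, flags=re.IGNORECASE):
            hits.append(term)
    return hits

posts = [
    "We should exterminate the invasive beetles in the orchard.",  # false positive
    "Time to lynch the traitors on election day.",                 # true positive
    "Family picnic photos from Sunday!",                           # clean
]

for p in posts:
    print(flag(p) or "ok", "-", p)
```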
