Following reports of genocide in Myanmar, Facebook banned the country's top general and other military leaders who had used the platform to foment hate. The company also bans Hezbollah from its platform because of its status as a US-designated foreign terrorist organization, despite the fact that the party holds seats in Lebanon's parliament. And it bans leaders in countries under US sanctions.
At the same time, both Facebook and Twitter have stuck to the principle that content posted by elected officials deserves more protection than material from ordinary individuals, thus giving politicians' speech more power than that of the people. This position flies in the face of plenty of evidence that hateful speech from public figures has a greater impact than similar speech from ordinary users.
Clearly, however, these policies are not applied uniformly around the world. After all, Trump is far from the only world leader using these platforms to stir up unrest. Just look to the BJP, the party of Indian Prime Minister Narendra Modi, for more examples.
While there are certainly short-term benefits – and plenty of satisfaction – to be had from banning Trump, the decision (and those that came before it) raises more foundational questions about speech. Who should have the right to decide what we can and cannot say? What does it mean when a company can censor a government official?
Facebook's policy staff, and Mark Zuckerberg in particular, have for years shown themselves to be poor judges of what is or isn't appropriate expression. From the platform's ban on breastfeeding photos to its tendency to suspend users for speaking back against hate speech, to its total failure to remove calls for violence in Myanmar, India and elsewhere, there is simply no reason to trust Zuckerberg and other tech leaders to get these big decisions right.
Repealing 230 is not the solution
To address these concerns, some are calling for more regulation. In recent months, demands have abounded on both sides of the aisle to repeal or amend Section 230 – the law that shields companies from liability for the decisions they make about the content they host – despite serious misrepresentations about how the law actually works from politicians who should know better.
The thing is, repealing Section 230 probably wouldn't have forced Facebook or Twitter to remove Trump's tweets, nor would it prevent companies from removing content they find disagreeable, whether that content is pornography or Trump's unhinged rants. It is companies' First Amendment rights that enable them to run their platforms as they see fit.
Instead, repealing Section 230 would hinder competitors to Facebook and the other tech giants, and would place a greater risk of liability on platforms for what they choose to host. For instance, without Section 230, Facebook's lawyers could decide that hosting anti-fascist content is too risky in light of the Trump administration's attacks on antifa.
This is not a far-fetched scenario: platforms already restrict most content that could be even loosely connected to foreign terrorist organizations, for fear that material-support statutes could make them liable. Evidence of war crimes in Syria and vital counter-speech against terrorist organizations abroad have been removed as a result. Likewise, platforms have come under fire for blocking any content seemingly connected to countries subject to US sanctions. In one particularly absurd example, Etsy banned a handmade doll, made in America, because the listing contained the word "Persian."
It's not hard to see how increased liability for platforms could lead to even more vital speech being taken down by companies whose sole interest is not in "connecting the world" but in profiting from it.
Platforms don’t have to be neutral, but they should play fair
Despite what Senator Ted Cruz keeps repeating, there is no requirement that these platforms be neutral, nor should there be. If Facebook wants to boot Trump – or photos of breastfeeding mothers – that's the company's prerogative. The problem is not that Facebook has the right to do so, but that – owing to its acquisitions and unchecked growth – its users have virtually nowhere else to go and are stuck dealing with increasingly problematic rules and automated content moderation.
The answer is not to repeal Section 230 (which, again, would hinder competition) but to create the conditions for more competition. That is where the Biden administration should focus its attention in the coming months. And those efforts must include reaching out to content moderation experts from advocacy and academia to understand the range of problems faced by users around the world, rather than focusing solely on the debate inside the United States.