In letters sent to Parler explaining their decisions, Amazon, Apple and Google all cited the company's lack of a functioning moderation system to keep violent content off its platform. “The processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient,” Apple wrote. “Specifically, we have continued to find direct threats of violence and calls to incite lawless action.”
You can see why these companies wouldn’t want to expose their app store customers to a social media platform whose moderation system has failed to prevent the spread of harmful content. But then you might wonder what’s stopping them from banning Facebook, Twitter and YouTube. The past several years of social media history have been a relentless cycle of platforms falling short of their own claims about how they police themselves. Facebook was used to facilitate ethnic cleansing in Myanmar, and with its much larger user base it was almost certainly a bigger vehicle for “Stop the Steal” disinformation than Parler. Journalists and academics have credibly accused YouTube of driving right-wing radicalization. Twitter was long notorious for permitting rampant sexist and racist abuse.
All three companies have, to varying degrees, imposed stricter policies over the past year in response to the coronavirus pandemic and the election. But it’s still easy to find content that seems to violate the letter of their rules. Even in the days before the Capitol attack, journalists found groups on Facebook and Twitter calling for revolution. Amazon’s letter to Parler notes that the company reported 98 examples of “posts that clearly encourage and incite violence.” It’s hard to imagine that Facebook, with its far larger user base, doesn’t eclipse that number.
All of this makes the decision to ban Parler seem somewhat capricious.
“I think the public perception is that all of these scary people who gathered at the Capitol met, and continue to meet, on Parler, while Facebook and Twitter are doing something about it,” said Danielle Citron, a law professor at the University of Virginia and an expert in online harms. “And so Parler is the lowest-hanging fruit.”
To be clear, there are big differences between Parler, whose whole raison d’être is to provide an almost completely uninhibited space for expression, and the mainstream platforms, which now boast of their efforts to combat certain types of misinformation and of their sophisticated AI moderation tools. Parler had a few minimal rules, including prohibitions on fraud, doxing, and threats of violence. But the company’s stated mission was to create an online platform where content is governed by the principles of the First Amendment. “Parler has no hate speech policy,” Parler COO Jeffrey Wernick told me last week, before the Capitol riot. “Hate speech has no definition, okay? It is not a legal term.”
Wernick is right. The First Amendment — which, I feel compelled to remind you, applies to the government, not private companies — protects a lot of material that most people don’t want to see on social media. It allows pornography. It allows the glorification of violence. It allows explicit racism. And so did Parler.
In hewing to the First Amendment, however, Parler adopted policies that were on their face inconsistent with those of Apple, Google, and Amazon, even apart from the question of violent content. Google and Apple, for example, both explicitly prohibit apps in their stores from allowing hate speech.
Perhaps the biggest problem with Parler was that it offered far more leeway for the kind of material that major platforms define as threatening violence. Under First Amendment doctrine, the government can criminalize only very narrow categories of speech, such as so-called “true threats” — roughly speaking, language explicitly intended to make an individual or a group fear for their life or safety. Saying that people should take up arms, or that a politician or celebrity ought to be shot, would not generally meet the criteria for incitement or a true threat. Believe it or not, this kind of speech is legally protected. (It can still get you a knock on the door from a curious Secret Service agent. I don’t recommend it.) Parler’s community guidelines mirrored this standard.