You may have seen this coming almost ten years ago when #YourSlipIsShowing revealed how racist Twitter users were posing as black women on the internet. Maybe, for you, it was during Gamergate, the online abuse campaign targeting women in the industry. Or maybe it was the mass shooting in Christchurch, when an armed man steeped in the culture of 8chan was broadcast live murdering dozens of people.
Maybe it was when you, or your friend, or your community became the target of an extremist mob online and watched that online rage spill over into real-world danger and harm.
Or maybe what happened on Wednesday, when a crowd of internet-fueled Trump supporters stormed the Capitol, was a surprise.
For weeks they’d been planning their action in plain sight on the internet – but they’ve been showing you who they are for years. How much shock you feel right now about the power and danger of online extremism depends on whether you’ve been paying attention.
The consequences of inaction
The mob that tried to block Congress from confirming Joe Biden’s presidential victory showed how the absurdity and danger of the far-right internet can erupt into the real world – this time at the very center of the US government. Neo-Nazi streamers weren’t just inside the Capitol; they were putting on a show for an audience of tens of thousands who egged them on in the chats. The crowd was having fun making memes in the halls of American democracy even as a woman – a Trump supporter whose social media history shows her devotion to QAnon – was killed while trying to break into congressional offices.
The past year, especially since the start of the pandemic, has been one giant demonstration of the consequences of inaction: the consequences of ignoring the many people who begged social media companies to take seriously the extremist memes and conspiracy theories that thrived on their platforms.
Facebook and Twitter acted to slow QAnon’s spread over the summer, but only after the pro-Trump conspiracy theory had grown relatively unchecked there for three years. Account bans and algorithm tweaks have long been too little, too late when it comes to racists, extremists and conspiracy theorists, and they rarely address the fact that these powerful systems are working exactly as designed.
For a story in October, I spoke with a handful of the people who could have told you this was going to happen. Researchers, technologists and activists told me that the big social media companies have, throughout their history, chosen to do nothing – or to act only after their platforms had enabled abuse and harm.
Ariel Waldman tried to get Twitter to meaningfully address abuse as far back as 2008. Researchers like Shafiqah Hudson, I’Nasah Crockett and Shireen Mitchell have spent years tracking exactly how harassment works – and finds an audience – on these platforms. Whitney Phillips has spoken about how haunted she is by the laughter – not just other people’s, but her own – from early in her research on online culture and trolling, when mostly white researchers and media personalities treated the extremists in their midst as edgy curiosities.
Ellen Pao, who briefly served as CEO of Reddit and resigned after introducing the platform’s first anti-harassment policy, was amazed that Reddit didn’t ban r/The_Donald until June 2020, after years of accumulated evidence that the popular pro-Trump message board served as an organizing space for extremists and a conduit for mob abuse. By the time of the ban, of course, many of its users had already migrated from Reddit to TheDonald.win, an independent forum created by the same people who ran the original. Its pages were filled with dozens of calls for violence ahead of Wednesday’s rally-turned-coup attempt.
Banning Trump doesn’t solve the problem
Facebook, Twitter and YouTube didn’t create conspiratorial thinking or extremist ideologies, of course. Nor did they invent the dangerous cult of personality. But these platforms have – by design – given such groups the mechanisms to reach much larger audiences much faster, and to recruit and radicalize new converts, even at the expense of the people and communities those ideologies target for abuse. Most importantly, even when it was clear what was happening, the companies opted for minimal change – or chose not to intervene at all.
In the wake of the attempted coup at the Capitol, people are once again looking to the major social media companies to see how they will respond. Much of the focus is on Trump’s personal accounts, which he used to encourage his fans to descend on DC, then to praise them once they did. Will he be banned from Twitter? There are compelling arguments for why he should be.
But as momentous and consequential as that would be, it is also, in another way, beside the point. Even without Trump retweeting and encouraging them, abuse, harassment, conspiratorial thinking and racism can still flourish on the platforms of social media companies that remain content to act only when it is too late.
Facebook has banned Trump indefinitely and has also expanded moderation of its Groups, where much of the platform’s conspiracy-fueled activity lives. These changes are good, but again, not new: people have been telling Facebook about this for years; Facebook’s own employees have been telling Facebook about this for years. Groups were instrumental in organizing the Stop the Steal protests in the days after the election – and before that, anti-mask protests; and before that, the spread of fake news; and before that, they served as a central space for anti-vaccine misinformation. None of this is new.
There are only so many ways to say that more people should have listened. If you’re paying attention now, maybe you’ll finally start to hear what they’ve been saying.