Facebook may have finally run out of patience with Donald Trump’s posts – I’m predicting a permanent ban at some point – but the episode is just one data point in a larger crisis of toxic expression on social platforms. A lot of attention has been paid to Section 230 of the Communications Decency Act of 1996, which allows platforms to moderate content without taking legal responsibility for what users post. Many people in Washington want to change or repeal that law. But the bigger question for Facebook and Twitter is what kind of service they want to be: one where civility reigns, or one where divisive corners poison society? Saying they want to be all hearts and flowers means nothing. The question is what they actually choose to make succeed.
A November 2020 New York Times article reported instances where Facebook cobbled together ways to curb misinformation and generally horrible content. One effort, meant to reduce conspiracy madness right after the election, assigned what the company called NEQ (news ecosystem quality) scores to articles, with reliable journalism ranking higher than lies and fantasy. It made for a “better news feed.” But after a few weeks, the company rolled the ranking change back. In another experiment, Facebook trained a machine-learning algorithm to identify posts that were “bad for the world,” then demoted them in people’s News Feeds. Sure enough, there were fewer toxic posts. But people also logged in to Facebook a little less, and less time spent on Facebook is Mark Zuckerberg’s nightmare. The Times saw an internal document in which Facebook concluded: “The results were good except it led to a decrease in the number of sessions, which motivated us to try a different approach.”
I find this decision short-sighted. Maybe in the short term, people wouldn’t log on to Facebook as much. But that shortfall could prompt the company to devise healthier features that would bring people back – and leave them feeling less angry when they use the service. Everyone would feel better, and fewer employees would threaten to quit because they feel they are working for Satan.
When Facebook and Twitter first started, none of the founders suspected that their creations would be used to sway public opinion, and certainly not to poison the body politic the way Donald Trump has. The vision was to enrich people’s lives by letting them know what their friends were doing. But as their platforms grew, so did their ambitions. Zuckerberg decided to make Facebook the ultimate personalized newspaper. Twitter positioned itself as “the pulse of the planet.”
In recent years, however, it has been hard to look away from the consequences. The choice the platforms face has little to do with what’s legal, and everything to do with what’s right. Time and again, to explain why someone terrible stays on the platform, Zuckerberg invokes company policies. But Facebook has things backwards when it invokes its own rules, as if citing tablets handed down by some wacky Moses. The company should examine the outcomes of its policies more methodically, because in many cases those outcomes ring false. Typically, Facebook will defend a given outcome until enough people are disgusted by what it allows to happen on its platform. Then it makes a change. That is what happened with anti-vaxxers, Holocaust denial, and now Donald Trump’s attempts to destroy democracy.
For now, of course, Zuckerberg is right when he says, “The priority for the whole country must now be to ensure that the remaining 13 days and the days after the inauguration pass peacefully and in accordance with established democratic norms.” But after that, Mark Zuckerberg and Jack Dorsey both have – to put it mildly – “a lot of work to do.”