Facebook oversight: The Oversight Board published its first five decisions Thursday. The decisions are well reasoned and show that the board members, who are responsible for reviewing Facebook's content-removal decisions and making recommendations on Facebook's policies, take their jobs seriously. More than anything, however, they show the futility of moderating content on networks with more than 3 billion users, nearly half of the world's population.
The cases involve posts in five languages and, often, subtleties of meaning and interpretation. Two touch on deep-rooted global conflicts: China's oppression of Uyghur Muslims and the recent border war between Armenia and Azerbaijan. We have long known that the vast majority of Facebook users, almost 90% now, are outside the United States, but the range of these cases underscores the scale of Facebook's challenge.
Facebook has touted automation as a solution to this challenge, but these cases also highlight the shortcomings of algorithms. In one, Facebook's automated systems deleted an Instagram post in Portuguese from a Brazilian user showing bare breasts and nipples. But the post was an effort to raise awareness of breast cancer, an exception to Facebook's general no-nudity policy and an issue that has bedeviled Facebook for a decade. To its credit, Facebook reinstated the post before the Oversight Board heard the case; even so, the episode highlights the problems of letting algorithms do the work. In another case, involving a quote attributed to Nazi propaganda chief Joseph Goebbels, Facebook's own Memories feature had actually prompted the user to re-share a post from two years earlier. The old post had presumably been allowed to remain, raising questions about the consistency of Facebook's content-review standards.
Facebook announced the creation of the Oversight Board in 2018, after years of criticism of its role in fomenting ethnic hatred, political disinformation, and other ills. It took nearly two years to assemble the 20 members, whose decisions on specific pieces of Facebook content are said to be binding.
In a statement Thursday, Facebook's vice president for content policy, Monika Bickert, said the company will follow the board's decisions to restore four items, including the Brazilian user's Instagram post. The board also recommended changes to Facebook's policies, to which the company is expected to respond within 30 days. Bickert said the recommendations "will have a lasting impact on how we structure our policies."
On one point, however, she hedged. The board recommended that Facebook notify users when their content is removed by an algorithm and allow them to appeal. Bickert said the company expects to take more than 30 days to respond to that recommendation.
Thursday's cases may have been relatively easy. Coming soon: the politically fraught decision on whether to restore Donald Trump's account, which is sure to anger plenty of Facebook users (and employees) no matter which way it goes. Facebook referred that decision to the board last week.
Taken together, the cases decided Thursday reveal the enormity of Facebook's challenge. Social media management company Social Report estimated in 2018 that Facebook users post 55 million status updates and 350 million photos every day; they send 9 million messages per hour and share 3 million links.
A decision on any one of these posts can be extremely complex. In October, a user in Myanmar, writing in Burmese, posted photographs of a Syrian Kurdish child who drowned while trying to reach Europe in 2015, and contrasted the reaction to the photo with what the user called a lack of response from Muslims in general to the treatment of Uyghur Muslims in China.