
Everyone on Facebook’s supervisory board should quit


Facebook’s supervisory board is set to decide whether Donald Trump should be allowed to return to a platform he used to incite racist violence.

While the board ostensibly has the power to make that decision, Facebook itself gets the final say. Since the board’s creation in 2018, we have noted that its power is illusory. It provides cover for Facebook, a veneer of accountability, even as the company allows and encourages hate and misinformation.

The board is dysfunctional by design, which is why it has done nothing over the past year even as Facebook amplified Trump’s lies about the Covid-19 pandemic. The board’s helplessness became even more evident when Facebook allowed Trump to repeat allegations of voter fraud, which set the stage for the deadly white supremacist insurrection at the U.S. Capitol on Jan. 6. Only after the world witnessed Trump’s incitement of that violent raid did the platform giant suspend his Facebook and Instagram accounts.

Facebook’s business model has benefited from the promotion of hate, and that hate goes far beyond what Trump disseminates. No decision by the board will change that. If board members are serious about making an impact, they must all step down.

It’s no surprise that the impressive roster of legal and human rights experts on the board has failed to contain Facebook’s toxicity. The company designed the board to be ineffective. This supposedly independent entity projects an air of autonomy and authority, but it is powerless to make any structural changes to Facebook’s deeply flawed content moderation process.

Facebook has claimed the board’s decisions will be binding, but its actions don’t inspire much confidence. It narrowed the initial scope of the board’s review to content removals and only recently extended it to content that was left up (and only Facebook may ask the board to review other issues). Either way, Facebook controls the entire content review and appeal process, and a user must exhaust all of their options with Facebook before appealing to the board. And the company is opaque about how it determines which content can and cannot be appealed.

The supervisory board takes up only a small fraction of the appeals routed through Facebook, with a maximum 90-day arbitration process. It can consult experts, commission reports, and solicit comments from the public. Ironically, while the board touts transparency as one of its main pillars, the commenting process leaves a lot to be desired: it takes a scavenger hunt to find its open cases on its website.

Finally, the board can also make policy recommendations as part of its decisions, but these are just suggestions that Facebook can take or leave.

It’s clear that a Facebook-funded board will never be allowed to tackle the hate and misinformation at the core of the company’s engagement-driven business model. The board cannot hold Facebook accountable for its lackluster enforcement of its rules regarding world leaders. It cannot hold Facebook to account for its role in actively amplifying Trump’s rhetoric and undermining the health and safety of its users, especially people of color, women, and others who are most often targeted by the hate groups and disinformation merchants that have taken up residence on Facebook.

Facebook controls every step of the board’s operations, creating a clever trap: reinstating banned content implies Facebook got it “wrong,” allowing the board to claim its independence. But keeping controversial content off the network could suggest that Facebook was “right” and that the board is beholden to Facebook. Ultimately, the board is unable to effect the kind of change that would protect users and strengthen our democracy.

This is why it is time for the members of the supervisory board to preserve their professional integrity and resign. By rejecting the premise that Facebook can be governed by a quasi-official entity, its members could retain some credibility in their respective fields and, more importantly, remove a barrier to doing important work.

Facebook should focus its efforts on what we know works: creating clearer standards, improving transparency, and stepping up efforts to enforce its rules fairly. But we all know that is not enough. We also need public policy solutions that will disrupt this for-profit hate business model. Facebook alone cannot and will not fix itself.


WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinion pieces here, and see our submission guidelines here. Submit an op-ed to opinion@wired.com.

