Facebook Oversight Board Says Company Should Be More Transparent in Its Reporting


More than a year after its founding, the Facebook Oversight Board argued in the first of its planned annual reports that the social media company should be much more transparent about how it decides which posts and accounts to leave up and which to delete.

The Oversight Board, an international panel of human rights advocates, politicians and academics that oversees Facebook’s thorniest content moderation decisions, said the company has made progress in implementing the board’s policy recommendations but needed to share more information about its content removal systems. The group took aim at the opaque nature of the company’s strikes system, which gives users who violate the platform’s content guidelines a specific number of passes under a tiered penalty system before their accounts are suspended.

Facebook, which last year renamed its parent company Meta, did not immediately respond to a request for comment.

“The board is encouraged by first-year trends in its engagement with Meta, but the company urgently needs to improve its transparency,” the group said in the report. “The Board continues to have significant concerns, including regarding Meta’s transparency and disclosure of certain cases and policy recommendations.”

Facebook designed the Oversight Board as an experiment, as regulators around the world tried to craft uniform rules governing social media platforms. The company argued that the board could set content policy direction and be a model for other companies’ governance structures.

But critics have questioned whether a board without formal authority, serving at the company’s whim, has enough power to force Facebook to follow its recommendations on issues plaguing its platform, including misinformation and hate speech. Although the board has offered independent oversight of the company, it depends on Facebook to provide it with information, funding and the power to effect change.

The annual report highlights some of the challenges the group faces as it makes critical decisions about how the company should support free expression for its users while mitigating the harms of problematic speech. In public comments and case decisions, the board has repeatedly faulted Facebook for not giving it and users enough information to assess the company’s content moderation systems.


Since its inception, the board has ruled on a series of cases, including deciding that an Instagram user’s breast cancer awareness post did not violate the company’s rules against nudity, and weighing in on whether Facebook should have suspended then-President Donald Trump’s account for his role in the January 6 attack on the United States Capitol.

The board said Wednesday that of the 20 cases the company and users had brought to it in 2021, it had overturned Facebook 14 times and upheld six of its decisions.

In Trump’s case, the board upheld Facebook’s decision to suspend the former president, but told the company it needed to clarify its policies on penalties for politicians who break the rules and to make the final decision itself on whether Trump could return to the platform. Facebook ultimately decided to suspend Trump for two years, opening the door for him to return to the site ahead of the 2024 presidential election.

Under the rules, Facebook and its users are allowed to appeal to the Oversight Board cases in which the company has removed posts for violating its Community Standards – the rules it enforces against hate speech, harassment and other types of problematic content. The board’s decisions on these cases are considered binding.

Separately, the Oversight Board may issue policy recommendations for changes to the company’s content moderation systems, but these are not considered binding.

Overall, Meta has pledged to at least partially implement two-thirds of the board’s 86 policy recommendations, according to the report. For the rest, Meta said it was already doing the suggested work, would not act on the recommendation, or would assess the feasibility of implementing it.

Among its most common recommendations, the board urged Facebook to give users more information about the rules they are breaking when their content is removed.

“Our recommendations have repeatedly urged Meta to follow certain central principles of transparency,” the board said in the report. “Make your rules easily accessible in the languages of your users; tell people as clearly as possible how you make and carry out your decisions; and, where people break your rules, tell them exactly what they did wrong.”

Facebook now notifies English-speaking users when their content is removed for hate speech and is testing this policy for content in Arabic, Spanish and Portuguese, as well as posts removed for bullying and harassment, according to the report.

The oversight board said it has also launched an implementation committee to assess whether the company is actually making the policy changes it has promised in response to the board’s recommendations.


Tension between the oversight board and Facebook erupted last fall when the board chastised the company for its lack of transparency about a program that exempted famous people from penalties for posts that violated the company’s content rules. At the time, the Wall Street Journal, citing internal Facebook documents, had reported that while the company told the board the program affected only a “small number of decisions,” it actually covered at least 5.8 million users in 2020. The board pushed back, arguing that the company had not been “fully open.”

Barbara M. Stokes