Meta watchdog overturns 70% of Facebook decisions
The independent Oversight Board appointed by Meta has released its inaugural Annual Report, with revealing findings
The Oversight Board, an independent panel of experts from around the world that holds Meta accountable for content decisions on Facebook and Instagram, says it has overturned Meta in 70% of its case decisions, overturning the company's content moderation decisions 14 times and upholding them six times.
The Board’s inaugural Annual Report chronicles the evolution and impact of the first-of-its-kind organisation and outlines its mechanisms for holding Meta accountable. It details the Board’s progress in ensuring Meta treats users around the world fairly and offers a critical review of its efforts to bring transparency and accountability to the company. The report covers the period from October 2020, when the Board began accepting appeals, to December 2021.
“The Oversight Board is proud to publish our first Annual Report and share publicly for the first time a more fulsome view into our impact on Meta in just over a year of operating,” says Oversight Board director, Thomas Hughes. “Through this Report, we’re able to demonstrate the significant impact the Board has had on pushing Meta to become more transparent in its content policies and fairer in its content decisions.
“As we move forward, it’s important to ensure the commitments Meta has made, some of which will take time, are not kicked into the long grass and forgotten about. Therefore, this Report outlines for the first time our new, data-driven approach to track how the company is implementing each of our recommendations.
“The Board will continue to play a key role in the collective effort by companies, governments, academia and civil society to shape a brighter, safer digital future that will benefit people everywhere.”
Key highlights of the Report include:
- 1.1m+ cases submitted to the Board by users and Meta.
- More than 8 in 10 user appeals to restore content concerned Meta’s rules on bullying, hate speech, or violence & incitement.
- Published 20 decisions on topics ranging from Covid-19 misinformation to hate speech.
- Over half of the decisions related to countries in the Global South.
- Overturned Meta in 70% of case decisions, overturning its content moderation decisions 14 times and upholding them six times.
- Upheld Meta’s decision to suspend former President Trump.
- Nearly 10,000 public comments from people and organisations around the world shaped the decisions.
- Made 86 recommendations to Meta.
- Some of the Board’s recommendations that Meta committed to:
- Being more specific with users when removing hate speech posts,
- Rolling out new messaging in certain locations telling users whether automation or human review resulted in their content being removed,
- Translating its rules into languages spoken by 400+ million people,
- Providing more reporting on government requests,
- Adopting a new Crisis Policy Protocol to govern its response to crisis situations.
- For two-thirds of the 86 recommendations, Meta either demonstrated implementation or reported progress, with recommendations on transparency most likely to fall into these categories.
The Board is in discussions with Meta about expanding its scope, including reviewing user appeals of Meta's decisions in areas such as groups and accounts. The Board is also expanding its stakeholder outreach in Asia, Latin America, the Middle East, and Africa.
The Board says it is encouraged by first-year trends in its engagement with Meta, but the company must urgently improve its transparency. As outlined in the Annual Report, the company has fully met its commitments on case decisions, has agreed to implement more than half of the Board’s policy recommendations and is increasingly answering the Board’s questions. However, the Board continues to have significant concerns, including around Meta’s transparency and provision of information related to certain cases and policy recommendations.
The Annual Report’s analysis is independent of Meta’s own reporting and, in some cases, the Board’s assessment of whether Meta has implemented a recommendation may differ from the company’s own account.
Read the full Annual Report here.