The Oversight Board, an independent body established to weigh in on Meta’s content moderation decisions, released its response on Tuesday to the hate speech policy changes Meta announced in January. According to the Board, the new policies were rolled out in a way that deviated from Meta’s standard procedures, prompting a call for the company to further clarify its rules. The Board also asked Meta to assess the policies’ impact on vulnerable groups, report those findings publicly, and update the Board every six months.
The Board is also in talks with Meta about shaping its fact-checking policies outside the United States. Shortly before President Donald Trump took office, Meta CEO Mark Zuckerberg began revising the company’s content moderation guidelines to promote “more speech” across Facebook, Instagram, and Threads. That initiative included relaxing hate speech rules that had previously protected immigrants and LGBTQIA+ users.
In response to the updated policies, the Board issued 17 recommendations to Meta, including measuring the effectiveness of its new community notes system, clarifying its stance on hateful ideologies, and improving enforcement against harassment violations. The Board also urged Meta to honor its 2021 commitment to the UN Guiding Principles on Business and Human Rights by engaging the stakeholders affected by the policy changes, a step the Board believes should have been taken from the outset.
Although the Oversight Board has little direct power over Meta’s broader policies, the company is bound by its own rules to comply with the Board’s rulings on individual posts. If Meta refers a request for a policy advisory opinion, as it has done in the past, the Board could gain an avenue to reshape the company’s content moderation practices.
In resolutions concerning 11 cases across Meta’s platforms related to anti-migrant speech, hate speech against those with disabilities, and the suppression of LGBTQIA+ voices, the Board appeared critical of several new content policies introduced by Zuckerberg earlier in the year. However, the Board noted that these policy changes did not impact the outcomes of those decisions.
In two U.S. cases regarding videos of transgender women on Facebook and Instagram, the Board upheld Meta’s choice to retain the content despite user reports, while recommending the removal of the term “transgenderism” from its Hateful Conduct policy. The Board also overturned Meta’s decision to leave up three Facebook posts about anti-immigration riots in the U.K. during the summer of 2024, determining that Meta was too slow in removing content violating violence and incitement policies related to anti-Muslim and anti-immigration sentiments.