The Oversight Board, an independent group created to help Meta with content moderation decisions, announced on Tuesday its response to the social media company’s new hate speech policy, which was unveiled in January.
The board said Meta’s new policy was “announced hastily, in a departure from regular procedure,” and called on the company to provide more information about the rules. The board also asked Meta to assess the impact of the new policies on vulnerable user groups, publicly report its findings, and update the board every six months.
The board says it is in discussions with Meta to shape the company’s approach to fact-checking outside the US.
Just weeks before President Donald Trump took office, Meta CEO Mark Zuckerberg embarked on an overhaul of the company’s content moderation policies to allow “more speech” on Facebook, Instagram, and Threads. As part of that push, Meta rolled back hate speech rules that protected immigrants and LGBTQIA+ users across its platforms.
In response to Meta’s new policy, the board says it has issued 17 recommendations. Among them, it asks Meta to measure the effectiveness of the new community notes system, clarify its revised stance on hateful ideologies, and improve how it enforces violations of its harassment policies. The board also asked Meta to uphold its 2021 commitment to the UN Guiding Principles on Business and Human Rights by engaging with stakeholders affected by the new policies, something the board says Meta should have done in the first place.
The Oversight Board has limited ability to steer Meta’s broader policies. However, Meta must abide by the board’s rulings on individual posts, under the company’s own rules.
If Meta were to request a policy advisory opinion from the board (which it has done a few times in the past), the group could have an avenue toward reshaping Meta’s content moderation.
In decisions released across 11 cases covering issues on Meta’s platforms, including anti-immigrant speech, hate speech targeting people with disabilities, and the suppression of LGBTQIA+ voices, the board appeared to criticize some of the new content policies Meta announced earlier this year. Meta’s January policy changes did not affect the outcome of these decisions, the board said.
In two US cases involving videos of transgender women on Facebook and Instagram, the board upheld Meta’s decision to leave the content up despite user reports. However, the board recommended that Meta remove the term “transgenderism” from its Hateful Conduct policy.
The board also overturned Meta’s decision to leave up three Facebook posts about the anti-immigrant riots that broke out in the UK in the summer of 2024. The board found that Meta acted too slowly to remove anti-Muslim and anti-immigrant content that violated the company’s violence and incitement policy.