In 2025, Meta, the world’s largest social media company, adopted a new tone of defiance about the extent to which it accepts responsibility for the real-world harm its platforms enable.
This is widely understood as a gambit by Meta’s CEO and founder Mark Zuckerberg to curry favour with President Donald Trump’s administration: in a video on January 7, he announced the end of third-party fact-checking.
“We’re going to work with President Trump to push back on governments around the world that are going after American companies and pushing to censor more,” Zuckerberg said, giving his product decisions a clear geopolitical flavour.
Zuckerberg and Meta appealed to the US constitutional protection of the right to freedom of expression to justify the company’s decision to eliminate fact-checking and reduce content moderation on its platforms. Fortunately, those of us who live in countries Meta has vowed to “push back” against also have constitutions.
In Kenya, for example, where I represent a group of former Meta content moderators in a class action lawsuit against the company, the post-independence constitution differs from US and Western European constitutions in its explicit prioritisation of fundamental human rights and freedoms. The constitutions of many countries with colonial histories share this feature.
We are now beginning to see how these constitutions can hold their own against the global technology industry. In a landmark decision last September, Kenya’s Court of Appeal ruled that content moderators can sue Meta for human rights violations in the national labour court.
Few people in the West understand the importance of this ruling. Meta certainly does, which is why it fought tooth and nail in court, and why it continues to use every diplomatic tool at its disposal to resist the content moderators’ demands for redress. Meta intends to take this decision to the Supreme Court.
Meta and other major US companies maintain complex corporate architectures to avoid exposure to taxes and regulations in the dozens of countries where they operate. They routinely claim not to operate in countries where they count millions of users and employ hundreds of people to refine their products. Until now, these claims have rarely been challenged in court.
The content moderators in the case were hired by a business process outsourcing (BPO) company called Sama and worked solely as content moderators for Facebook, Instagram, WhatsApp and Messenger from 2019 to 2023, when much of the moderation of African content on these platforms was run out of Nairobi. Meta denies that these workers were its employees, claiming they were employed solely by Sama, a question currently before the Kenyan courts.
These workers know that Meta has never taken content moderation seriously, as laid out in their complaint to the court. The company does not do enough to stop the civil and ethnic conflicts, political violence, and mob attacks on marginalised communities that thrive on its platforms. Nor is it serious enough to pay fair wages to those tasked with making sure it doesn’t. The harm travels both ways: toxic content fuels real-world fears, and those fears produce more toxic content that saturates the platform.
Content moderators are Meta’s digital cannon fodder in a war against harmful content that the company never truly committed to fighting. The case brought by the Nairobi content moderators describes how they accepted jobs they believed would involve call centre and translation work. Instead, they ended up at Meta’s content moderation hub in Nairobi, where they spent their days exposed to endless torrents of streamed violence and abuse.
Many of them were made to watch atrocities committed in their homelands in order to protect Meta’s users from the harm of seeing these images and videos. They absorbed the trauma so that others in their communities would not have to, and many found this a noble calling.
But the job took a toll on their mental health. More than 140 former content moderators have been diagnosed with PTSD, depression or anxiety arising from their time in the role. Another case deals with how their efforts to unionise and advocate for better mental healthcare were thwarted, followed by mass layoffs and the relocation of Facebook content moderation elsewhere.
This left hundreds of people traumatised and without a path to redress for human rights abuses. Meta claims it does not employ Facebook’s content moderators and is not responsible for them. The lawsuit is ongoing, and the moderators are now relying on the courts to untangle these employment arrangements.
While fighting the lawsuit in court, in March 2024 the company sent a delegation led by its then President of Global Affairs, former British Deputy Prime Minister Nick Clegg, who met with Kenya’s President William Ruto and legislators to discuss, among other topics, a vision for a partnership with the government that would bring about a “generative AI revolution”. At a town hall event in December, Ruto said, referring to Meta’s former content moderation partner Sama: “We’ve changed the law so no one can take you to court again” – a reference to a bill passed by Kenya’s parliament protecting big tech companies from cases like ours in the future.
All of this pushback occurred well before Trump was re-elected, and it appears aimed at avoiding accountability for the company’s labour practices and the impact of its products. But then something surprising happened: the court ruled that our case could proceed to trial. This opens the door for others around the world who do the tech industry’s work even as the industry denies they work for it at all.
Despite intense legal and political challenges, the fact that this case has progressed is a testament to the revolutionary nature of postcolonial constitutions, which prioritise human rights above all else.
As our case in Kenya continues, I hope it can inspire technology workers in other postcolonial countries. It should also remind Big Tech that while the right to freedom of expression is an important human right, the rights to dignity and freedom from exploitation are equally important.
The views expressed in this article are the author’s own and do not necessarily reflect Al Jazeera’s editorial stance.