African Nations Sue Meta Over Moderator Mental Health, Sparking Tech Ethics Debate

By Nana Karikari, Senior International Affairs and Political Analyst

Social media behemoth Meta, the parent company of Facebook, Instagram, and WhatsApp, faces a wave of legal challenges across Africa for the alleged mental health impact on its content moderators. The moderators are reportedly tasked with the distressing job of reviewing and removing graphic content, including violence, murder, and child sexual abuse.

These lawsuits, whose allegations have been corroborated by some of Meta’s former employees, have ignited fierce debate over the ethical obligations of global technology corporations toward their African workers and the underlying difficulty of ensuring platform safety.

These cases are not just about money; they’re a sobering reminder that keeping these platforms “safe” for consumers comes at a real human cost. It makes you wonder: what exactly do these tech giants, raking in billions, owe the people doing this destructive work on the continent?

Ghana Joins Ongoing Legal Actions

This fundamental question of responsibility has moved beyond private concern and is increasingly erupting in courtrooms across the continent. Highlighting the expanding scope of this legal battle, Ghana is now the latest African nation in which Meta’s content moderators are suing the technology giant. Like the now high-profile cases from Kenya and South Africa, the Ghanaian lawsuit alleges that Meta breached a duty of care owed to content moderators working in the country, who were allegedly subjected to harmful content without proper psychological protections.

The Ghanaian plaintiffs’ accounts detail the development of severe mental health conditions, which they allege was a direct consequence of their jobs. Their situation is indicative of the dire realities faced by content moderators across Africa, and several of the charges have been corroborated by a recent joint investigation by the UK independent newspaper the Guardian and the Bureau of Investigative Journalism.

This corroborative evidence has significantly bolstered the legal case against Meta, underscoring the alleged failure to protect workers from exposure to toxic material. Consequently, pan-continental concern is growing regarding the well-being and safety of those engaged in the critical, yet largely unseen, work of policing digital spaces.

Ghana’s involvement in this latest wave of litigation has added to the unified chorus demanding greater accountability and improved working conditions for these essential, yet often overlooked, gatekeepers of the digital ecosystem.

Legal Actions in South Africa and Kenya

Prior to the Ghanaian lawsuit, a similar legal action had already begun in South Africa. Like the case in Kenya, the South African lawsuit highlights the potential long-term psychological effects on children exposed to harmful content. The case goes even further, alleging that African moderators were held to a separate, inferior standard compared to their colleagues in other parts of the world. In South Africa, plaintiffs argue that Meta owes a fundamental duty of care to all its employees, regardless of their geographical location, and that the provisions currently in place are, at the very least, inadequate to meet this duty.

This first wave of lawsuits began in Kenya, where former and current moderators have filed a class action suit against their employers based on allegations of an unsafe work environment and negligent provision of psychological support.

Litigants in Kenya recount experiencing debilitating mental health conditions, including Post-Traumatic Stress Disorder (PTSD), anxiety, and depression, as a direct consequence of having to view highly graphic and violent content, frequently with little to no access to emotional debriefing or mental health services. Through their legal case, they are looking to hold Meta accountable for the harm done and to require the company to do far more to protect the health of its content moderation staff.

Separately, Kenyan courts ruled that Meta could indeed be held accountable in the East African nation for its role in the proliferation of hate speech that fanned the flames of the conflict in Ethiopia. This decision represents another victory in securing Kenyan courts as the appropriate forum to hear the case, defeating, for now, all of Meta’s attempts to argue otherwise.

The Realities of Content Moderation

Content moderators, the majority of whom are located in the Global South, share their stories to shed light on the challenging nature of their jobs. These moderators are the first line of defense against a tidal wave of harmful content that would otherwise spread like wildfire across the internet, and they are allegedly exposed to a continuous flow of abusive, hateful, and violent material.

These conditions, the plaintiffs claim, caused traumatic symptoms, insomnia, and emotional distress, all of which were exacerbated by what appears to be a refusal to provide proper psychological support.

Meta’s Stated Position

Meta maintains that it prioritizes the safety and well-being of its global content moderation workforce. To support this, the company frequently points to its support provisions, which it says include on-site wellness coaches, immediate access to therapy, and resilience training programs. Meta has argued that these measures should be seen as part of the company’s ongoing effort to create a safer workplace and to properly train its moderation staff.

In response to the lawsuit, a representative from Meta’s law firm said that Meta takes the accusations very seriously and is dedicated to creating a safe and inclusive environment across its platforms.

Criticism and Potential Legal Arguments for Meta

Critics argue that the measures Meta has put in place are not sufficient to shield contractors from the severe psychological trauma the work inflicts, especially in nations where mental healthcare is not readily accessible.

Meta’s possible legal defense might include arguing that the company complies with the laws of every jurisdiction in which it operates. Meta might also argue that the content flagged for review reflects its global community standards and the legal variations among those jurisdictions. The tech giant might further highlight clauses in its Terms of Service and in its contractual agreements with moderation contractors, in which workers acknowledged the potential exposure to harmful content.

Also in its defense, Meta will probably point to its key investments in programs that support mental health and its ongoing efforts to improve these resources. It might also emphasize the complexity of moderating a global platform with billions of users and illustrate how challenging it is to enforce uniform international standards.

If plaintiffs argue that these measures are not enough, Meta’s legal strategy will have to prove that it exercised sufficient due diligence in trying to mitigate the harm that occurred and that what it provided or offered was within its reasonable reach.

Broader Implications for Tech Accountability in Africa

These high-profile legal actions in Ghana, South Africa, and Kenya are part of a larger movement to hold multinational technology corporations operating in Africa accountable to those they harm. They perform a vital task in interrogating the ethical obligations these tech companies owe their African workers and in amplifying advocates’ demands for stronger regulatory oversight.

As the digital space continues to expand across the African continent, ensuring equitable working conditions and adequate mental health support for those who take on the demanding job of protecting online communities becomes even more critical.

Dealing with Complex Legal and Ethical Issues

The ensuing litigation will certainly be filled with nuanced arguments about duty of care, the adequacy of Meta’s support infrastructure, and the long-term mental health impacts of content moderation work.

It will be up to Ghanaian, South African, and Kenyan courts to weigh the evidence provided by each side and to strike a balance between the operational needs of a global technology platform and the fundamental rights and welfare of its workforce. Building an inclusive workplace is also a moral imperative: tech companies must correct their shortcomings on mental health related to content moderation and commit to comprehensive, culturally competent, easily accessible support systems.

Now that Ghana has joined this legal fight, it is even more imperative that Meta take a consistent and far fairer approach to worker welfare across its operations throughout the continent.

A Changing Legal Landscape

These massive legal battles looming over Ghana, South Africa, and Kenya highlight a growing call for tech giants to take responsibility for the damage they are causing on the African continent. They shine a light on the corporate accountability these companies owe to the people they employ on the continent and add weight to demands for tougher regulation and oversight.

If these suits succeed, their outcomes could serve as a powerful precedent for raising labor standards across the multinational tech industry, to the benefit of its workers. The combined legal actions initiated by the three African countries may also prompt other jurisdictions to examine the labor practices and mental health provisions surrounding content moderators.

On the other hand, if these suits fail, the ramifications would be a setback for the pursuit of greater tech responsibility on the continent and worldwide. A negative ruling would suggest that current legal protections are inadequate to cover the unique situation of content moderators, and it may embolden multinational tech giants to continue business as usual in their labor practices and mental health care. A defeat would ease the pressure for improved standards and make it more difficult for workers facing the same conditions in other regions to seek legal remedy, possibly bringing long-overdue improvements in their own workplaces to a halt.

SOURCE: A joint investigation by the UK’s Guardian and the Bureau of Investigative Journalism.
