Ethiopian NGOs Say Facebook Ignored Warnings About Hate Speech
[music]
Melissa Harris-Perry: For the past several months, the Ethiopian government and opposition militias have been negotiating terms of a peace deal. The deadly civil war came to an official end in November of last year, but not before it claimed more than 600,000 lives and forced more than 5 million people to flee their homes.
Speaker 2: "Hospitals still need medicine. People still need food. They need water. Their needs remain extremely high."
Melissa Harris-Perry: In the wake of the violence and loss, many Ethiopians have called for accountability, from Facebook. They're saying that hate speech and calls for violence that circulated on the platform inflamed the conflict.
Abrham Amare: Now or never to act on that platform to reconsider its conduct, especially in that of the content moderators, even though it will not bring our happiness back, at least it will try to heal some of the damage that we're experiencing.
Melissa Harris-Perry: That's Abrham Amare, one of the people who filed a landmark lawsuit against Meta, Facebook's parent company, back in December. We're going to hear more from Abrham later in the show. But first, we're going to talk about a recent investigation by Insider that gives some insight into how Facebook's content moderation procedures failed to stop the spread of violent posts during the war. With us now is Tekendra Parmar, the Tech Features Editor at Business Insider, who led this investigation. Tekendra, thanks for joining us here on The Takeaway.
Tekendra Parmar: Thank you for having me.
Melissa Harris-Perry: All right, your investigation examined Facebook's Trusted Partner program. Can you explain what it is and how it's supposed to work?
Tekendra Parmar: Yes, totally. Facebook is pretty opaque about how the Trusted Partner program is supposed to work. What it does say publicly is that the program is key to its efforts to improve policies, enforcement processes, and products to help keep users safe on the platform. What this means in practice is that Facebook works with some 400 NGOs and civil society organizations across the world who help it monitor the spread of inciting content and emerging trends: how people are using the platform to incite violence or spread misinformation.
These local experts also have a special reporting channel through which they can flag harmful content. Facebook is reliant on their linguistic and cultural expertise to help it understand how hate speech, misinformation, and disinformation trends are developing.
Melissa Harris-Perry: Basically, because Facebook doesn't speak the language, it's relying on these trusted partners to flag the problematic content for them. Help me understand how that's different from, for example, how content moderation and policy expertise works in places like the US or in the EU, where presumably Facebook does speak the language.
Tekendra Parmar: Facebook does have content moderators in Ethiopia's main languages, but the main role of these trusted partners is to be the experts in the room who can also tell you how language is developing around an issue. For example, hate speech is rarely so transparent as, say, "go kill this person." It's usually encoded language, and you need someone with expertise in that region to explain to you how it might be inciting.
I think the one thing that I want to highlight is, when Facebook is accused of proliferating genocide, as it is in Ethiopia, it often promotes its relationships with these civil society organizations to signal to the press that it is, in fact, taking the issue seriously. My reporting shows that, internally, they are often ignoring the advice of these very experts.
Melissa Harris-Perry: Okay, why ignore the advice if these are the folks who you've brought in the room to be the experts?
Tekendra Parmar: Yes, that's a very good question. I think there are various ways in which we can answer that. On one level, the partners I have spoken to have gotten the impression that Facebook's Trusted Partner program is quite understaffed, despite being a very essential program. The other thing we need to think about is that hate speech moderation, as the Facebook Files revelations showed, is one of the biggest overhead costs for Meta.
Meta will deny this, but I think there is a reasonable argument these experts have made. Meta says it has an economic incentive to moderate content, because its advertisers, its primary customers, do not want their advertisements showing up alongside child pornography or videos of war crimes or what have you. I think that is true to a certain extent. It is definitely true in the United States or Western Europe, where the platform derives the majority of its advertising revenue. But when you look at countries like Ethiopia, or the Global South in general, it is referred to as rest of world, and rest of world countries may come in--
Melissa Harris-Perry: Wait a minute. Who refers to it as rest of world?
Tekendra Parmar: This is the accounting term that a lot of tech companies use to describe the Global South. If you ever look through their investor documents, you will usually see a breakdown by country or continent: North America, Europe, and then a large bucket called rest of world. Rest of world includes places like Ethiopia, India, Sri Lanka, all of these places that may not fall into the bucket of where these companies see their primary customers or audience really being.
If you look at the revenue that Meta generates from these places, it's just about 10% of the company's total revenue. What that means is that the economic incentives may just not be there for Meta to spend as much on moderating content in places like Ethiopia as it does in the United States.
Melissa Harris-Perry: Tekendra, you've given us a good sense of what the incentives or lack of incentive structure might look like for Facebook Meta. Help me to understand what the job of content moderation is like for the people doing the work.
Tekendra Parmar: One of the things to remind ourselves of is that this job is very hard. It's hard because human language develops over time, and so while these platforms would have you believe that their AI is so sophisticated that, the way it's trained, it's going to scoop up all this hate speech and misinformation on its own, that's just simply not true.
You need a human in the machine because those humans are the ones who are able to actually spot trends and advise on them. Now, we've known for a while that content moderation is a very difficult job in terms of mental health. This is true for the content moderators working for outsourcing firms like Genpact, Accenture, or Samasource, one of the major providers of outsourced moderators for Facebook in Africa. All of these moderators have reported some degree of post-traumatic stress after watching countless hours of graphic videos.
Now imagine that you were doing this job as an NGO worker. Some of these trusted partners that I spoke to haven't even taken Facebook's money to do this job. They believe that it's their civic duty to do it, and so they do it, basically for free. Imagine these people doing this job basically for free, advising Facebook as experts, and watching countless hours of their country embroiled in civil war.
It has the same mental health impact that we've been seeing in the actual labor force of content moderators. Some of the trusted partners that I spoke to talked about having symptoms of post-traumatic stress disorder. One of them described not being able to hang out with their friends or family afterwards because the content they were watching every day was too overwhelming. It was compounded by the fact that they knew Facebook wasn't doing anything about it.
Another one told me that they had to remove all the knives in their apartment because, after watching so many horrific videos of people using machetes to go after each other, they couldn't bear to have knives in their apartment. Those are all pretty serious symptoms of post-traumatic stress disorder.
Melissa Harris-Perry: The trusted partners were telling you that they were being ignored. What does that suggest to you about the kind of responsibility that Facebook has?
Tekendra Parmar: That's the overarching question behind this $1.6 billion hate speech lawsuit against Facebook in Kenya for its role in proliferating ethnic violence in Ethiopia. I think what my reporting shows is that Facebook has levers it can use in conflict zones like Ethiopia, but it's choosing to ignore them. At least in some instances, this has had deadly consequences.
One of the trusted partners that I spoke to said they were flagging content about Professor Meareg Amare, whose killing is at the center of this lawsuit, at least a month prior to his death. The professor was eventually gunned down outside of his home. His killers were chanting the same slurs and citing the same misinformation that was on Facebook while they were assassinating him. What role Facebook ultimately played, as a journalist, I'm not sure I can say. That's for the lawsuit to play out. What I can say is that Facebook had levers, and it ignored them.
Melissa Harris-Perry: Tekendra Parmar from Business Insider and former Takeaway intern. Thanks so much for talking this through with us, Tekendra.
Tekendra Parmar: Thank you.
Melissa Harris-Perry: We've got more on this story coming up. It's The Takeaway.
[music]
We've been discussing hate speech on Facebook during the Ethiopian Civil War. Since February, the Ethiopian government has blocked Facebook and other social media, but that doesn't mean the platform's effects are over.
Abrham Amare: Its impacts still are occupying the whole atmospheric condition in the city. Hello. My name is Abrham, second son of the late Professor Meareg Amare Abrha.
Melissa Harris-Perry: Abrham Amare is working towards his PhD in Peace and Development Studies. His father, Dr. Meareg Amare, was a prominent professor of analytical chemistry.
Abrham Amare: He devoted his time and his energy to his children for the betterment of us, and he had a special connection with his father. He was a civil, law-abiding person who was able to reconcile science and religion. He was a religious person at the same time, so he was an amazing father.
Melissa Harris-Perry: The professor was Tigrayan, a member of the ethnic group prominent in the Tigray region, where the conflict was centered. Human rights investigators documented campaigns of ethnic cleansing against Tigrayans and atrocities perpetrated by Tigrayan militia, as many factions capitalized on years of ethnic hostilities.
In late 2021, a post appeared on Facebook, accusing the professor of corruption and theft, and included his photograph and his address. The photos spread. Facebook users called for violence against the professor. Days later, a group of men attacked him outside his home and shot him.
Abrham Amare: Due to the incitement and the calls for violence against our father on this Facebook page, the militias targeted him more.
Melissa Harris-Perry: Abrham says he reported the posts but more than a year later, they were still up.
Abrham Amare: Once I saw those posts, I reported them to Facebook to remove them, but nothing happened. It shows how the company sees Africans, as if our concerns are nothing.
Melissa Harris-Perry: Abrham, his sister and his mother, have all been forced to relocate, each of them to a different continent. Abrham is now in the United States where he's applied for asylum and hopes to restart his scholarly studies, but his father's body remains somewhere in Ethiopia, in an unmarked grave.
Abrham Amare: There will not be any chance for me to return back, and I don't have anything there.
Melissa Harris-Perry: Despite this final attempt to dehumanize him, Professor Amare's legacy promises to endure for his family and for the country. He authored four widely used chemistry textbooks. He often appeared on local TV to encourage young students to pursue scientific study.
Abrham Amare: You can imagine how this tragedy brought a big impact upon the family and the country at large.
Melissa Harris-Perry: Last year, Abrham and others filed a class action lawsuit against Meta in Nairobi, Kenya. It alleges that Facebook's algorithms prioritize hateful and violent content, and that Meta has failed to devote the necessary resources to moderating this content across the continent of Africa. They're asking Facebook to reinforce its content moderation practices and staffing, and to create a $2 billion restitution fund for victims of violence.
Abrham Amare: Nothing will bring back our father. We just demand that Facebook be safe in Ethiopia. There should not be any other family to suffer the same as us.
Melissa Harris-Perry: In April, a Kenyan court granted the plaintiffs leave to serve Meta in California.
Abrham Amare: Now or never to act on that platform to reconsider its conduct. Especially in that of the content moderators, even though it will not bring our happiness back, at least it will try to heal some of the damage that we're experiencing.
Melissa Harris-Perry: Just to note that we reached out to Meta, Facebook's parent company, for comment. If we hear back, we'll put that comment on our website at thetakeaway.org.
Copyright © 2024 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.