The Unseen World of Content Moderation
BROOKE: In an average minute, 100 hours of video are uploaded to YouTube. 2,460,000 new Facebook posts appear. And 433,000 new tweets are tweeted. All of these services and others like them rely on a massive workforce of content moderators to flag the illegal or profoundly offensive. They are the line that stands between users and videos of beheadings, child pornography, animal torture, illegal solicitation, and all manner of the unimaginable and unspeakable. WIRED contributor Adrian Chen headed to the Philippines, where, it seems, much of the world's content moderation happens.
CHEN: Part of it is just because it's really cheap to hire these people there, and there's already this call center infrastructure. And if you talk to a lot of content moderation companies, they'll tell you that because the Philippines is a former US colony, people there have a better sense of what Americans find objectionable or not.
BROOKE: You mean they have a more direct line to the US psyche?
CHEN: Yeah, that's what they claim, and I have talked to some people who've mentioned that a company had tried to outsource to India, and it just didn't work, because the workers there weren't able to parse what is just a risqué photo and what is not allowed by these companies.
BROOKE: Do you have an example of that that can be used on the radio?
CHEN: The story somebody told me was, we would get videos and photos of people in bikinis, and if a bikini is a certain length, then it's not allowed and if it's another length, it is. And I guess the people in India were just nuking all bikini photos.
BROOKE: You said that it's a lot cheaper to hire people abroad, which seems obvious, but exactly how cheap is it?
CHEN: I talked to one guy whose lowest offer was $300 a month, and after working at another place for many months he ended up making around $500 a month.
BROOKE: Not a living wage here.
CHEN: No, definitely not, although it's actually a pretty good wage in the Philippines.
BROOKE: So you actually went to Manila. You mentioned a company called Whisper. Describe what Whisper does and then talk about the day-to-day work of a content moderator.
CHEN: Well, Whisper is an app that allows you to share secrets anonymously. One of the challenges with that is that if you don't moderate it very heavily, it's just going to devolve into all sorts of abuse and illegal things. And so they moderate every single thing that's posted on their service. Most services will only review content that users have flagged, but every single photo or post that's on Whisper will be seen by somebody in the Philippines. There's this big whiteboard where they have written down the different categories of things they're looking for, you know: porn, underage photos, violence, hate speech. They also get a regular report from their headquarters on what's going on in the US, in pop culture. Things they should be on the lookout for. So, if Justin Bieber does something, then, you know, they'll know that it's not just people bullying somebody named Justin or something.
BROOKE: You talked to a number of people who've done this, and it starts out seeming like a pretty good job: somebody got to watch Battlestar Galactica on one screen while casually casting his eye on the photos streaming on the other. Describe what some of those experiences were like and how they might've changed.
CHEN: It's important to note that some of this is being done in the US by recent college graduates who don't really have any other option for employment in the tech industry. It seems like an easy way to make a pretty good wage. It's, you know, not super labor-intensive; it's not working in food service. But one guy I talked to who did this, when he started out, was able to just kind of absorb all of these images. After a while, though, it ended up psychologically wearing him down, and this was compounded by the fact that he realized he wasn't going to get a job at Google through this. So it was kind of the combination of, oh, I'm just this disposable thing that's supposed to be soaking up all of this horrible stuff, and also the actual content that he's having to see.
BROOKE: What you're soaking up is the darkest side of the human id.
CHEN: I mean it's basically the worst images you can imagine. Child abuse, really extreme violence--
BROOKE: Committed against helpless people, or helpless animals. People who seem to be doing it for the sheer pleasure they derive from it.
CHEN: Yeah, the guy who moderated for YouTube said that for some reason the animal abuse was always the hardest for him. He saw a video of somebody setting a dog on fire to discipline it, and that's something that stuck with him.
BROOKE: One psychologist you spoke to in Manila likened the after-effects of too much of this work to PTSD.
CHEN: There are just such vivid and horrific images that you can't really help but dwell on them. I also met a quality assurance representative named Maria, who worked for a cloud storage service based in the US. Her job was basically to double-check the agents who were doing the screening, to make sure they didn't miss anything and were filling out the forms right. But she was more concerned about the people she was supervising, because she's seeing that some of them have become very paranoid, especially with regard to their children. They don't want to leave their kids with babysitters, because they've seen such horrible things in their job.
BROOKE: I think one of the things that struck me is that this work demands human beings clued into American mores and laws. This has to be done by brute force of eyes and clicking fingers. Is there no alternative to human moderation?
CHEN: Well, everyone I talked to said there was no way a robot could do all of this. They can come up with programs and algorithms that make it more effective and streamlined, but there's always going to be someone who has to look at it. And the kind of moderation that's going on is getting more and more nuanced and complicated, so I think you're always going to need people, and probably more and more people as time goes on.
BROOKE: So, even more than the estimated hundred thousand people who immerse themselves in human depravity to protect us from it. You don't see any way around it?
CHEN: I talked to a professor named Sarah Roberts, who's the only person I know of doing serious research on this, and she said it's kind of a question of human nature. And until we solve that problem, this problem's never going to go away.
BROOKE: Adrian, thank you very much.
CHEN: Thank you.
BROOKE: Adrian Chen is a contributor to WIRED magazine.