BOB GARFIELD: Historically, the media have struggled with self-restraint. In new media, the issues get more complicated, because platforms such as Google, Twitter and Facebook are global on a globe with many different standards of permissible expression. The most obvious conflicts occur in countries where authoritarian regimes suppress political speech. But the most universal challenge, at play everywhere, every day, is the policing of hate speech. Social media sites must navigate between an American free speech tradition that prizes openness, a European tradition more eager to restrict incendiary rhetoric and authoritarian governments baldly trying to stamp out dissent.
The power to eradicate speech is not only in the hands of commercial companies; it is wielded by small groups of men and women who together govern global expression on a grander scale than any one government ever has. Writing in the New Republic, George Washington University Law Professor Jeffrey Rosen says these people are called the deciders – a sly nod to President George W. Bush, who famously declared, "I'm the decider." Not Rosen's word choice. The term, he says, came from the folks at Google.
PROF. JEFFREY ROSEN: The Google people were struck by the fact that Nicole Wong was the one who was woken up in the middle of the night by requests from the Turkish government to remove offensive videos posted by Greek football fans, or from Thai officials who didn't like videos insulting the King of Thailand, and just the scope of the power she had as the ultimate decider convinced them to give her that rather grand name.
BOB GARFIELD: As I understand it, Google and Twitter and Facebook are acting based only on complaints. They're not crawling their content looking for stuff to stifle.
PROF. JEFFREY ROSEN: Most of their work is done through human first responders, the people charged with responding to user complaints. When I visited the YouTube headquarters a few years ago, I walked around and was told to identify the first responders, and I had trouble picking them out. Everyone looked the same: all these 22-year-old kids in flip-flops and t-shirts, hunched over their laptops. So I gave up. I said, I don't know who the first responders are.
And my guide eventually said, look at the porn flickering on their laptops. They're making the initial decisions, but if they've got a tricky case, it's escalated up to the deciders in Silicon Valley. Nicole Wong, who used to be at Google, is now the Twitter decider. At Facebook, one of the deciders-in-chief is Dave Willner. And those are the people who make the final decision.
BOB GARFIELD: And when they get home, their significant other says, how was your day at work, dear, and – ugh, what a day. I had three single nipples and an accusation of fascism.
PROF. JEFFREY ROSEN: Dave Willner at Facebook is married to another Facebook employee who leads the User Safety Team responsible for child protection and suicide prevention, and one imagines they have rather heady dinner chatter.
But Dave Willner is a fascinating story. He's not a lawyer. He joined Facebook right out of college, where he was an anthropology major, and his first job was working night shifts in the Help Center, answering emails from users about how to use the photo uploader. Within a year, he was promoted to head the User Content Team, and then he was charged with, basically, coming up with the standards by which Facebook would adjudicate free speech for the entire world.
Willner eventually hit on the idea that the standards should be as easy as possible for first responders to apply, and that's how he arrived at Facebook's most important decision, which was to define hate speech in a way that banned attacks on groups defined by religion, ethnicity, or sexual orientation, but not attacks on the leaders of those groups. So that's why Facebook decided not to take down the Innocence of Muslims video: they concluded it was an attack on the Prophet Mohammed, but not an attack on Muslims in general.
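In code terms, the group-versus-leader rule Rosen describes might look something like this minimal sketch. The function name, category labels, and example calls are hypothetical illustrations, not Facebook's actual system:

```python
# A hypothetical sketch of the group-versus-leader distinction Rosen
# describes. None of these names come from Facebook's real systems.

PROTECTED_ATTRIBUTES = {"religion", "ethnicity", "sexual orientation"}

def is_banned_hate_speech(target_kind: str, target_attribute: str) -> bool:
    """Ban attacks on a group defined by a protected attribute;
    leave up attacks aimed at an individual leader or figure."""
    return target_kind == "group" and target_attribute in PROTECTED_ATTRIBUTES

# Innocence of Muslims, as Rosen recounts it: judged an attack on the
# Prophet (a figure), not on Muslims as a group -- so it stays up.
assert is_banned_hate_speech("figure", "religion") is False
assert is_banned_hate_speech("group", "religion") is True
```

The appeal of such a rule, on Rosen's account, is precisely that a first responder can apply it mechanically: identify the target, check whether it is a group or an individual, and the decision follows.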
BOB GARFIELD: So where does that leave these companies, you know, let's say in the Middle East, where insults against the sitting government are often illegal, or Turkey, where it's a crime to insult Kemal Atatürk, who's been dead for 75 years? How do they operate there?
PROF. JEFFREY ROSEN: Turkey is a fascinating case study. YouTube was blocked in Turkey for several years because Greek football fans posted videos accusing Kemal Atatürk of being gay. Well, that's illegal in Turkey. It's an insult to Turkishness. So Google said, okay, we'll just take down those videos in Turkey; we'll block access to them for users with Turkish Internet protocol addresses. That wasn't enough for the Turkish prosecutor, who wanted the content removed for the entire world. Google refused to do that and, as a result, YouTube was blocked in Turkey for years.
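That kind of per-country blocking might look roughly like the following sketch: the video stays up globally, but requests whose source IP resolves to a restricting country are refused. The video ID, the blocklist, and the country-lookup stub are all hypothetical stand-ins for real geolocation infrastructure, not Google's implementation:

```python
# A minimal sketch of per-country blocking. Everything here is a
# hypothetical stand-in; country_of() would really be a GeoIP lookup.

BLOCKED_COUNTRIES = {
    "ataturk_video": {"TR"},  # hypothetical video ID, blocked only in Turkey
}

def country_of(client_ip: str) -> str:
    """Stand-in for real IP geolocation; a toy prefix check, nothing more."""
    return "TR" if client_ip.startswith("88.") else "US"

def may_serve(video_id: str, client_ip: str) -> bool:
    """Serve the video unless the client's country is on its blocklist."""
    return country_of(client_ip) not in BLOCKED_COUNTRIES.get(video_id, set())

assert may_serve("ataturk_video", "203.0.113.7") is True   # toy: resolves outside Turkey
assert may_serve("ataturk_video", "88.240.0.1") is False   # toy: resolves to Turkey
```

Which is also why the prosecutor's demand mattered: a per-IP block restricts only where the video is visible, not whether it exists, and anyone routing around the country lookup sees it anyway.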
But the punch line of the [LAUGHS] story is that the Turkish Prime Minister got up and said in public, well sure, YouTube is blocked, but it's not too hard [LAUGHS] to get around the firewall.
[BOB LAUGHS]
I’ve done it myself.
BOB GARFIELD: [LAUGHS] All of these social media powerhouses have adopted something much closer to the US standard than to that of other democracies around the world. Do you think it's because these are American-based companies and these deciders have American values, or just what?
PROF. JEFFREY ROSEN: Part of it is that they're Americans. Many of them went to US law schools or studied the US tradition in college. Another important factor is that, at the moment at least, that standard happens to coincide with their business model. If the Internet were Balkanized and they were required to be censors-in-chief for the entire world, sort of deciding on a case-by-case basis what stayed up and what came down, their entire business model would be challenged.
But that's not to say that these companies will continue to embrace these standards in the future. Just recently, for example, Facebook, under great pressure from women's groups who felt that speech offensive to women was being left up on the site, issued new policy guidance basically saying, we'll go back to the drawing board, we'll talk to legal experts, we'll require that hateful jokes about women be identifiable and not anonymous, and maybe we'll change our standards in the future.
So that just reminds us that it's not as if there's a tremendous political constituency for this beautiful American First Amendment standard. And if it turns out that it's commercially inconvenient to allow hate speech in the future, then these companies, which are ultimately responsible to their shareholders, not to the US Supreme Court, might end up changing their tune.
BOB GARFIELD: Well, Jeffrey, thank you very much for joining us.
PROF. JEFFREY ROSEN: A pleasure. It was good to be with you.
[MUSIC UP AND UNDER]
BOB GARFIELD: Jeffrey Rosen wrote, “The Deciders” for the New Republic.