DAVID FOLKENFLIK From WNYC in New York, this is On the Media. I'm NPR media correspondent David Folkenflik, sitting in for Brooke Gladstone. This week, we learned that two cases are heading to the Supreme Court that have the potential to upend the way the Internet works.
[CLIP]
NEWS REPORT It starts with 23-year-old Nohemi Gonzalez. She was one of 130 people who were killed in Paris during an ISIS terrorist attack way back in 2015. [END CLIP]
DAVID FOLKENFLIK The first is Gonzalez v. Google.
[CLIP]
NEWS REPORT Her family is arguing in court that YouTube helped to spread ISIS's violent message because its algorithms suggested extreme content to users based on the previous videos that they had watched. [END CLIP]
DAVID FOLKENFLIK The other one is Twitter v. Taamneh, which claims that platforms are aiding and abetting acts of terrorism by spreading ISIS content online.
[CLIP]
NEWS REPORT And it's a big test of an old law, Section 230. [END CLIP]
DAVID FOLKENFLIK That's the name of the quarter-century-old law that shields tech companies from liability for any content posted on their platforms.
[CLIP]
NEWS REPORT In a nutshell, it says, If I tweet something very slanderous about you, you can sue me, but you can't sue Twitter. This case that the Supreme Court just granted says, okay, but what if my tweet is so hilarious and gets so many likes that their algorithm puts it at the top of everyone's feed and shows it to everyone? Then could you sue Twitter for amplifying my slanderous tweet? [END CLIP]
DAVID FOLKENFLIK Section 230 treats platforms as publishers, and so it offers the kinds of protections that publishers get. But it's had criticism lobbed at it from both the left and the right.
[CLIP MONTAGE]
NEWS REPORT Section 230 is really one of the cornerstones of free speech online.
NEWS REPORT Critics of yours will say this is just a bunch of legal acrobatics that protect pimps and protect traffickers while kids are being raped.
NEWS REPORT Don't allow individuals to fund rhetoric and content in our country that's being used to incite our people to violence. [END CLIP]
DAPHNE KELLER In a world without 230, we would see people who believe they've been defamed by a MeToo post, for example, bringing lawsuits saying, hey, take this down because it is defamation and I believe you are liable for it.
DAVID FOLKENFLIK Daphne Keller directs the Program on Platform Regulation at the Stanford Cyber Policy Center. She says it's been known for some time that Justice Clarence Thomas has wanted to challenge 230 and has been practically begging for someone to make the case.
DAPHNE KELLER And the plaintiffs in these cases, I think, very cleverly framed their petitions to the Supreme Court to say, hey, Justice Thomas, this is the case you wanted. But I don't think it is the case he wanted. I think these two cases have complicated wrinkles that make them much harder to resolve than a classic 230 case.
DAVID FOLKENFLIK Why is this concept of legal liability, this immunity, so vital for the platforms?
DAPHNE KELLER Because they are processing such a vast amount of speech. I think the latest stats are hundreds of hours of video uploaded to YouTube every minute and their capacity to review it all and figure out if it's legal is nil. All of them are trying very hard to enforce policies that they have themselves against things like violent extremism. And we can see that they don't do a great job. And that's because it's hard. If we want to have a world where there are platforms where ordinary people can go and share their baby videos or their funny cat pictures or their political opinions immediately, you know, without some lawyer reviewing it first, then we do need the platforms to have some form of immunity.
DAVID FOLKENFLIK On the left, it seems like there's a lot of confusion. I feel like there are contradictory arguments that big tech has too much influence, too much power, but also a lot of pressure from the left that they should be throwing Alex Jones off, and similar arguments about Donald Trump, which, of course, got him bounced from Twitter and a couple of other places. Are there any signs of a coherent platform on the left on what reforms could look like or where things should land?
DAPHNE KELLER I don't think there is a coherent proposal that spans the left politically, and I think one of the biggest tensions is between, on the one hand, responding with the tools of competition and saying these platforms are too big, they have too much concentrated power, we want to reduce that power, and on the other hand saying, oh, they have all this power and we want them to use it to get rid of illegal speech or to get rid of vaccine disinformation. And you can't have both.
DAVID FOLKENFLIK How does the right think about these major social media platforms?
DAPHNE KELLER The reversal of positions on the right here has been head-spinning, because if you rewind a few decades and look at their positions about broadcast companies or cable companies, the Republican position was always that these are private property owners, they can decide what speech they want to carry, and the government shouldn't come along and tell them what to do. And indeed, if the government is coming along and telling them what to do, there's a real risk that that will introduce political bias into what we see on broadcast or on cable. So it was part of the GOP's official party position for decades to oppose the Fairness Doctrine, which was a doctrine about making broadcasters carry certain speech. And now Republicans are constantly calling for a Fairness Doctrine for the Internet. Now that there are big private companies with tremendous influence on speech, run by people they perceive as too liberal, they are very interested in taking away that power and in saying this private property needs to be shared with the public and the government can tell you what to do with it.
DAVID FOLKENFLIK There's another related case that you say is likely to come before the court sometime soon. Last month, the Fifth Circuit upheld a Texas law that claims to be making platforms act as common carriers. It says that if platforms moderate content, they can't treat speech differently based on its viewpoint.
DAPHNE KELLER I take that to mean and I think most people take that to mean you have to give identical treatment to content that says the Holocaust did happen and content that says the Holocaust did not happen. The sort of logical version of what common carriage would mean for platforms is carry all the speech unless it is actually illegal. Which again leaves you with the beheading videos and the pro-suicide content and all the other sort of terrible, lawful-but-awful material that's on the Internet.
DAVID FOLKENFLIK Lawful but awful. That should be translated into Latin and be used as the motto for our times.
DAPHNE KELLER I don't think the Texas lawmakers understood what floodgates they were opening. I think they were imagining that some speech that offends liberals would be kept online, and that made them happy. And I don't think they were imagining that their grandmothers would be encountering barely legal porn on Facebook or that their children would be encountering pro-anorexia videos on TikTok. There is some denial about just how extreme the consequences of a common carriage obligation would be.
DAVID FOLKENFLIK You're saying, look, if it's not totally broken, don't make major changes. Maybe you find ways to get tech companies to take different steps.
DAPHNE KELLER I think that's right. It's not that CDA 230 is a perfect law. I can imagine improvements to it, and we have a really interesting model in the European Union. They just finished a big piece of legislation called the Digital Services Act that offers a different model for how to go about this, through very careful, specific takedown obligations, opportunities for the affected user to appeal, and transparency, so you can tell whether there's disparate impact and what platforms are taking down. You know, there are other models out there. That's not what Congress has been proposing. The proposals to amend CDA 230 that we see in Congress tend to be very ham-fisted and really not to have thought through the likely consequences of the rules that they're creating.
DAVID FOLKENFLIK There's also a bill making its way through Congress called the Journalism Competition and Preservation Act. I don't think it's very likely to pass anytime soon, but it seems like another attempt to acknowledge and regulate the role of big tech by forcing it to kick money to the news outlets it gets content from. Tech journalist Charlie Warzel speculated in his newsletter last week that we're looking at the beginning of the end of the Internet as we've known it. Is that right? Is the Internet on the brink of fundamental change?
DAPHNE KELLER It might be. We are seeing more Balkanization than we ever have before, meaning that different countries are imposing different laws and sometimes erecting barriers so that you can't see content from another country. We are seeing moves toward much tighter regulation in the EU. This new law they've passed will set up regulators that get annual risk assessments and risk mitigation plans from the biggest platforms, and then the regulators can come back and tell them how to mitigate risk better. We are absolutely seeing a change in how the biggest platforms will be regulated. I hope that leaves the smaller creatures in the undergrowth still relatively free, that there will be smaller websites and smaller places people can go to exchange information that aren't coming under such a tight regulatory grip, in ways that vary from country to country, for better or for worse. But I don't know. It's very hard to know what the future of the Internet looks like.
DAVID FOLKENFLIK Daphne Keller, thanks so much for joining On the Media.
DAPHNE KELLER Thanks for having me.
DAVID FOLKENFLIK Daphne Keller directs the Program on Platform Regulation at Stanford's Cyber Policy Center. Coming up, the Murdochs go to court. This is On the Media.
Copyright © 2022 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.