The Hate Debate
Marshall: The Honorable, the Chief Justice and the Associate Justices of the Supreme Court of the United States. Oyez. Oyez. Oyez. All persons having business before the Honorable, the Supreme Court of the United States are admonished to draw near and give their attention, for the Court is now sitting. God save the United States and this Honorable Court.
[music]
Jad Abumrad: This is More Perfect, I'm Jad Abumrad. Today, a little bit of a departure-- maybe it's more of an addendum. We just released an episode about Citizens United, which is one of the most argued-about cases in recent history. This was a case really about the First Amendment and whether we should limit the First Amendment in any way.
Now, in that case, you had this tension, you had Justice Kennedy who is saying, "Don't touch it, don't touch it, don't touch it. I don't care if all this money comes into our political system, don't touch the First Amendment," but you had Justice Souter saying, "I don't know, maybe we should touch it because sometimes your freedom impinges on my equality. Shouldn't those two things be in balance?"
It's a fascinating and important conversation and it was one that we were having internally as a team while we were making that episode. At a certain point, we decided let's actually take that conversation and make a live event around it. This is something that we did a few times in the run up to season two. We would have these public debates in the WNYC Greene Space. I want to play you some excerpts from that debate that we had about free speech.
We did a classic debate format; one person took one side, one person took another. Our questions were; should the government do more in this day and age to limit free speech? Should Twitter do more? Should Facebook do more? Should all of us collectively be doing more to stop dangerous hateful speech? Is this precisely the moment when we have to stand up louder and prouder and protect that speech, even if we hate it?
Before we got into it, I took a poll of the audience. Everybody who thinks that your right to free speech, especially online-- people say some bad things, fine, but your right to free speech should remain pretty much unlimited. Those of you who feel that way, make some noise.
[applause]
Jad: All right you guys are thunderous over here. Let's just do it one more time, so I can just get more accurate. Those of you who don't think it should be limited, go.
[applause]
Jad: Now, those of you on the flip side, those of you think there should be some clear, hard limits-
Audience member: Boo.
Jad: Easy, easy. Those of you think there should be some clear limits on what you can say online, make some noise.
[applause]
Jad: All right. That gives us a good sense of where we're starting. Let me introduce our debaters for round one. The question is, should the government limit online free speech? Taking one side of that question is Mr. Elie Mystal, our legal editor at More Perfect and editor of Above The Law, a site for legal news. He is on one side of the stage and of the question. On the other, Mr. Ken White, a First Amendment litigator and criminal defense attorney at Brown, White & Osborn in Los Angeles.
He has joined us here from the left coast. He's a former federal prosecutor. He runs the free speech and criminal justice blog, popehat.com. Give it up for Ken.
[applause]
Jad: All right, let us begin. We'll start with you, Elie. Is there something wrong with the First Amendment, would you say?
Elie Mystal: No, I don't have a problem with the First Amendment. It was a beautiful thing written for white people who wanted to overthrow the government. It's fine. I have a problem with absolutists who want to elevate threats, harassment, and calls for genocide to the level of a sacred right. I do not think that the First Amendment prohibits us from preventing a Nazi from getting a permit to rally any more than I think that the Second Amendment prevents us from stopping a sociopath from getting a gun permit.
Absolutism is absolutely wrong on this issue.
[applause]
Jad: Okay. Elie Mystal, strong beginning. Ken, what do you think?
Ken: Well, I don't know what absolutist Elie is talking about. The last one I know of is Hugo Black, and he died in 1971. We have well-established, narrow exceptions to the First Amendment. They are narrow for a reason: we got them narrowed on the backs of the powerless being suppressed by the powerful.
All of the types of restrictions that Elie would like are ones that have historically been used against communists, against labor protesters, against war protesters, against minorities, and everyone else. The Nazis aren't the ones in danger from the types of restrictions that Elie is suggesting he'd like.
Jad: There are the two basic positions.
[applause]
Jad: Let's get the debate started. Elie, start us off. Explain why you think that hateful speech, fake news shouldn't be protected by the First Amendment.
Elie: Ken just admitted, just agreed that we already regulate speech at some level. Really, all we're debating about tonight, the only thing that's even up for debate is where we want to draw that line. Ken will draw that line so it protects Nazis. I would draw that line so it protects us from the Nazis. Let's start with a pretty simple example. Fire. Just kidding. There's no actual fire.
I'm sure you've all heard that the thing you can't say is that you can't shout fire in a crowded theater, but actually, under our current laws, I probably can, because our current standard is that what is unprotected are things that lead to direct incitement of imminent lawless action. That's a very high bar. I can probably say, "Fire." What I probably can't say is, "Fire, kill who you must to survive."
[laughter]
Elie: That would probably get me in trouble. The fire analogy comes from an older standard, older than the one that I just quoted. It comes from Oliver Wendell Holmes, who some of you might have heard of, and his standard when he used the "you can't falsely shout fire in a crowded theater" analogy, his standard was false and dangerous. Speech that is false and dangerous is not protected by the Constitution. I think that's where the line is. I think that's an eminently reasonable line.
I think that we had 150 years of a free republic with that line. I want the line where dangerous lies are not protected by the Constitution.
Ken: I don't want the government deciding what's a lie and what's true. May I remind you, we are currently led by a president who thinks that global warming is a Chinese hoax to corner the tungsten market. That's why I don't want the government deciding what to suppress based on its decision about what is true or not.
[applause]
Ken: Now, Elie refers to the fire in the crowded theater, Justice Holmes' famous quote. Let's remember what he was talking about. He was using that quote, "You can't shout fire in a crowded theater," to justify jailing a man who was protesting World War I by handing out flyers suggesting that people resist the draft. That was the clear danger that the government saw.
Now, if you don't think that it's plausible that the government would be suppressing the same type of speech now, if you gave it the power, if you handed it to them out of fear of Nazis, then just look at what happened after the protests this last year. The alt-right and neo-Nazis rose, there were massive protests in response, and our largely Republican-dominated state legislatures leaped into action.
In 17 places, they proposed heavily punitive anti-protest bills, including, for a charming example, making it easier for you to get off if you run over a protester in your car. That's what the government does with the power to suppress speech when you let the government decide what's true.
Elie: I think you just proved that our current First Amendment standard doesn't do bull to actually protect protesters. All it does is protect Nazis. You want to talk about the Oliver Wendell Holmes case. Let's talk about where our current standard comes from. It's relatively recent: 1969, Brandenburg v. Ohio. Now, what was that case? I said 1969. You probably thought, "Oh, it's probably, like, a civil rights case." No, it was for a Klansman.
Brandenburg was a Klansman. He was out making Klan statements. Somebody arrested his ass for being a Klansman. He got convicted for inciting violence, and the court said, "He's just a Klansman." We really needed a new standard that protects the right of Klansmen to threaten Black people in 1969.
Ken: You see, Elie, you know that that's not the right case. That's the one that's best for your argument. The right case is 12 years-
[laughter]
Ken: The right case is 12 years earlier, Yates v. United States: people convicted for becoming members of the Communist Party, under the theory that some ideas can be punished as a clear and present danger even when there is no imminent advocacy of wrongdoing. Yates built the wall that Brandenburg eventually completed. Brandenburg's the outlier. Yates is the one that shows how the power is consistently used by the government.
Elie: Can you explain to me a standard that allows me to stop Klansmen? That's what I want.
Ken: Yes.
Elie: If you can explain to me how I can make Klansmen not [unintelligible 00:10:17] in the field, then I think we're going to agree more than we disagree.
Ken: Absolutely.
Elie: It's a misnomer to suggest that the First Amendment is here to protect minorities. Are you kidding me? The Constitution didn't even think about Black people until the Thirteenth Amendment, as I think we all know.
Jad: You're saying that you would like to change the standard so that-- Well, help me understand.
Elie: I want to [unintelligible 00:10:37] the standards [unintelligible 00:10:38] less scary for people.
Jad: What would the standard be?
Elie: I can give you an example. The president is a Kenyan. That's false, but it's not particularly dangerous, so we can let that slide. Hillary Clinton is running a pedophile ring out of a pizza shop? Do not pass go, do not collect $200. That is both false and demonstrably dangerous.
Jad: Those are two very clear examples, but the idea of falseness and danger can get pretty squishy.
Elie: Sure, it can.
Jad: Can I call up an example, if you guys don't mind? The Daily Stormer, which is a very popular neo-Nazi site-- there was a situation where they basically took a Jewish woman, a real estate agent-- that's the image right there, you can see it on the screens, and they superimposed it on an image of Auschwitz. They published her name. They published her kids. They said hateful things like, "We will drive you to suicide." They called [unintelligible 00:11:32] troll off on her. Does that qualify for you, Ken? Would you limit that kind of speech?
Ken: I think a lot of the comments sent to her were true threats. A reasonable person would see them as statements of actual intent to do her harm. I think some of the speech about her meets the incitement standard, that it is intended to and likely to cause imminent lawless action against her. But ideas, however hateful, can't be true or false. It's not for the government to regulate whether ideas or opinions are true or false, no matter how unspeakable they are.
Elie: No, that is how we got here. Ideas can be true or false. Climate change is real: true idea. Climate change is not real: false idea. We can make these distinctions. I don't think that we need to-- what your standard requires. I have, unfortunately, because I am Black on the internet, had to deal with some true threats, some not-true threats.
When I go to the cops to try to ask for protection, I'm trying to wrestle with this issue of what's actually protected speech and what's actually not protected speech. My problem with the current standard is that it basically waits until they start shooting at me before it stops them. I want to stop them before they start shooting. I want to stop them before they start driving their cars into crowds of protesters, because by then it's too late.
Ken: I want them to stop too, but here's the problem: with the history of America being what it is, with how this power has been used in the past, what possesses you to think that if you give this broader power to attack speech to the government, it's going to be used the way you want it to be?
Elie: I'd rather have this debate in 2020.
[laughter]
Ken: It's a date.
Jad: Now, you've heard Elie and Ken's points. The question is, did you change your mind? Who thinks the government should limit what we say online? Let's hear some noise.
[applause]
Jad: Those of you who actually leaned farther in that direction, over the course of this argument, let me hear from you guys.
[applause]
Elie: [unintelligible 00:13:48] claps.
Jad: You got a few. Those of you who do not think there should be limits placed by the government on us online, let's hear it.
[applause]
Jad: I think we may have a winner for the first round. I'm going to declare that you, Ken White, are the winner for the first round. Give it up for Ken White, First Amendment attorney, former federal prosecutor and founder of popehat.com. Thank you for joining us, Ken.
[music]
Jad: Coming up, we're going to shift the question a little bit. Instead of asking what should the government do about free speech, should it limit it or not, we're going to ask what should Twitter do? What should Facebook do? With all the fake news that's happening, all the hate speech that's coalescing online, should they limit free speech more than they are? That is coming up after the break. This is More Perfect. I'm Jad Abumrad, stay with us.
[music]
Jad: This is More Perfect. I'm Jad Abumrad. Let's get back to our debate, our free speech debate at the WNYC Greene Space. Round two, we're going to take that same basic question that we asked in round one, but now we're going to transpose it. Whatever we think about the First Amendment, it does place limits on the government but not so much on Twitter or Facebook. The question is; should Twitter and Facebook or other social media companies severely limit online speech-
Audience member: No.
[laughter]
Jad: -or shouldn't they? I want to poll you guys for this again so we have a baseline to start from. Those of you, people watching on Facebook, do you think the site you are on right now should aggressively limit the speech that you might type? Take the online poll. Those of you in the audience, same question: should Facebook and Twitter be allowed to severely limit online speech? Define it as you will.
[applause]
Jad: Those of you think, "Hell no."
[applause]
Jad: I get a mixed sense of where we're at in the audience. Here to debate this topic with Elie is Corynne McSherry, legal director of the Electronic Frontier Foundation, which is committed to defending civil liberties in a digital world. Give it up for her.
[applause]
Jad: Corynne, let's start with you. What do you think about the prospect of a Twitter or a Facebook stepping in to take down lies and take down hate speech?
Corynne McSherry: I think it's a very dangerous path that unfortunately we're already well along. In moments of crisis-- and we're in a moment of crisis right now, we look to simple solutions for very complex problems and we are often sorry. That is where we are right now. The Internet grew up the way it did for mostly good, I would argue, because the platforms and the intermediaries mostly stayed neutral.
If we have a world in which Facebook, Twitter, Google, Instagram put themselves in the position of a court and decide what speech should be up, what speech shouldn't, we're going to walk down a dangerous path. Those decisions, those tactics will inevitably be used against speech that we would support for one thing. They will be inevitably used eventually by governments. Private censorship does not stay private. It becomes public censorship almost inevitably.
The third reason is really practical. They're already doing it and they're doing it badly. All kinds of lawful speech is being taken down every day. Google and Facebook can't save us from the Nazis. We have to do it.
Jad: Thank you Corynne. Elie, what do you think?
Elie: Yes, the First Amendment does not apply to Twitter or Facebook. Anybody who tells you that they have a constitutional right to say what they want on Twitter is an idiot. The Twitter trolls don't just want free speech, they want consequence-free speech. They want to be able to say their vile trash and still keep their jobs, and still keep their homes, and still get the girl. Screw these people. We should hold Twitter to at least the level of a Jets game.
[laughter]
Jad: Those are the basic sides, let's start the debate. Corynne, kick us off.
Corynne: The problems here are legion. I'm going to start with the ones that I just touched on briefly before. The reality is that we can all target people that we hate right now. If we think that the rules that Twitter and Facebook and all those guys are going to come up with aren't going to be used against speech that we support, we are foolish. It's already happening. Community standards complaints are used against valuable speech all the time.
I know because I hear about it every day in my job. Then the related problem to that is when you get your lawful speech taken down, you don't have any options. You don't know how to get your stuff put back up. We have courts but we don't have a right of appeal. We don't have any challenge.
These platforms have the right to host any speech they want. They actually have the First Amendment right to host any speech they want, but as users, we want them to use that right wisely. That's not happening right now.
Elie: As a user I want them to stop Nazis. That's really all I'm concerned about. I want them to find a Nazi and stop them from expressing their hateful views on Twitter.
Corynne: That is a dream that you're having. They can't, they can't. That's foolish. No, sorry.
Elie: Are you a Nazi? Yes? Goodbye, done.
Corynne: You know why I know they can't? Because they're trying, and they're failing over and over. They cannot tell the difference between hate speech and reporting on hate speech, and so accounts get taken down and suspended when they're doing perfectly lawful things.
Elie: One of the reasons why it's so important that we demand better from Facebook, from Twitter, from Reddit, is that the reason we're seeing so many more Nazis now is because these platforms have allowed them to organize. There's a reason why the Klan was on the decline 20 years ago: because wearing a hood and going out to meet your friends in the middle of a field, like Brandenburg did, wasn't really where modern society was going.
Then Twitter and Facebook and these sites and Reddit came along, and now they have a way to talk to each other and realize, "No, I actually hate Black people too." "Oh, so do I. Let's hang out." No, screw these people. There's no constitutional reason why Twitter or Facebook or whatever should allow them to exist. There's no business reason why Twitter or Facebook or whatever should allow these people to exist. Get them the F out.
Jad: One of the things I think about is-- one of the things we heard in the wake of Charlottesville was that a lot of these folks got radicalized online. What about the prospect of them getting radicalized online? What would balance that out against the failures of these sites? I'm curious to hear you talk more about that.
Corynne: A couple of things. I do just want to respond to that question. My view is, if white supremacists and Klansmen and Nazis are organizing, I would way prefer they do it out in public, where I can see them and I can challenge them and I can respond to them, and law enforcement will say the exact same thing. People who fight terrorism say it's much better for people to be speaking publicly, for the radicals to be radicalizing where you can see them.
They're going to organize anyway. Would you rather they do it in secret or in the open? I prefer the open.
Elie: I would rather them do it in secret. I would actually rather them go and find and make their own Nazi website. Make their own Nazi thing so that whenever I get Ken to agree with me, whenever the government is ready to stop these people, they will have all pre-registered. They would have all said, "Hey, look at us."
Corynne: Yes, and all of those people-
Elie: "We're here on nazimeet.com." Boom and we can go get them.
Corynne: Great. Or we can continue the siloed conversations that we're having right now, which are a big part of how we ended up where we are.
Elie: Yes, I would like to be siloed from Nazis.
Corynne: I think that sounds very nice. It's a good talking point, but in reality, I think that's very, very dangerous for our society. We need people to be talking to each other. When they only talk to people who agree with them, they never change their minds. Now, to your point. I didn't mean to ignore your point.
[applause]
Jad: Yes.
Elie: That has proven time and time again to be not true. Again, I feel like that is such a happy-clappy white version of this story. "Oh, if we just talk to these people, we can convince them that maybe Black people shouldn't be sent off to prison camps."
Audience member: It happened.
Elie: Once or twice. The rest of the time, they're running cars into people. It doesn't happen nearly often enough.
Corynne: Do you know why we have gay marriage equality now? Because people talked to each other. It's not the only reason, but it helped. But I want to answer Jad's question, because I think what you're asking for is an example of why I'm worried about how the moderation happens.
Jad: I want to gauge your worry against Elie's worry.
Corynne: The way that it works now, and the way that it's likely to continue to work is that the social media companies employ a combination of humans and mostly algorithms to try to figure out what's bad speech and what's good speech, and they mess it up. They'll end up taking down this statement, "All white people are racist," as an example of hate speech, but they won't take down-- If you might show the previous one.
This from a Congressman who said, "Not a single radicalized Islamic suspect should be granted any measure of quarter, et cetera, et cetera," nasty stuff. They can't tell the difference and that's what happens. A hat tip to ProPublica. I hope you guys are all ProPublica supporters and fans because they're great. They did a detailed study to look at Facebook's policies and they found out that among other things, they're training their moderators to, in some instances, protect white men over Black children.
That's where we are right now. That's what we want to endorse? That's what we want to encourage? I don't think so.
Elie: I will stipulate that there are many examples of them getting it wrong. They get it wrong. They're not great at this job yet. We live in a real world where the actual-- I'm talking about Twitter cops, but we live in a world where the actual cops get it wrong every fricking day. In my most radical statements, I'm not saying let's get rid of the cops because they don't know what they're doing. No, I'm saying let's get better cops.
For Twitter, I'm saying let's get better Twitter cops so they don't get it wrong so many times. But you want to talk about letting the perfect be the enemy of the good: just because Twitter and Facebook have not gotten to the level yet where they're able to effectively police these people doesn't mean they should just stop trying.
Corynne: Well, we have. Where we are right now is that thousands of accounts are being suspended every day. Let's just say a relatively small percentage of those are for perfectly lawful speech. That's a lot of lawful speech. That's a lot that we have authorized Twitter and Facebook and everyone else to take down, and encouraged them to. Keep in mind-- I want to say one more thing that I said before, but I really want to emphasize it.
Once we start down this path, if you think that this is going to stay within the decision makers at Silicon Valley, you are dreaming. That's bad enough. I'm not actually sure why we all want Silicon Valley to make decisions about what speech is okay for all the rest of us. Even that aside, it's not going to stop there. Governments are going to come in. When they see that Google, Facebook, Twitter can easily take down accounts, they're going to say, "Could you do that for us?" This doesn't stop.
Elie: Somebody needs to stop these people and I refuse to believe that we live in a country where that is impossible.
Jad: Let's take a question here in the back.
Audience member: If Facebook emailed you and said you could be in charge of what speech is either left up or taken down, and you could build whatever team of people you wanted, would you accept that? Do you think that could create something you would be satisfied with or not satisfied with?
Corynne: Oh, if I was queen of the world. It's hard to turn that down. Even I would have trouble, in all instances, being perfect about what was lawful speech and what wasn't. That actually isn't my main concern, though. My concern is that even I could then potentially be required by a government to use that algorithm for other purposes, and that would be really dangerous.
Here's the one thing that I would say and this is where I think we agree, is that if I was queen of the world and I was running any of these companies, one of the things I would absolutely do is put in much better processes for people to appeal, for people to challenge when things are taken down wrong. This isn't just a speech issue, it's a due process issue because let's face it, of course, these aren't official government forums.
We can all understand that but nonetheless this is how we talk to each other. These are our public spaces and in those public spaces, it's really important when your account gets suspended, when you get taken offline to be able to get back up if what you're doing is perfectly legal. Right now, the reality is and I know this because I hear from people all the time, it's very confusing. You don't know who to appeal to. You don't know why you're taken down half the time, and you don't know what to do.
Jad: Let's take a question on the far right.
Audience member: Hi. I just wanted to get your opinions on money, because I hear a lot of talk about this being a speech issue or not. I think for the social media platforms, it's really all about money, and it's about followers, and young kids that are getting rape threats, and threats that eventually end in suicide. I think that this has to do with money. I think there's a bigger issue here that I just don't hear anyone talking about, and I just wanted to know what you both thought about that.
Corynne: I think that's a real pressure point, because a lot of these companies actually, genuinely do feel uncomfortable making money from hate, but unfortunately, we still have a problem. I'm going to give you an example from an article I just read yesterday, a long piece from Talking Points Memo about Google and how it runs advertising and search and so on.
Talking Points Memo mentioned one of the problems that they have because these processes are so opaque. They're a legitimate site trying to do good for the world, and they survive because of Google advertising, but they keep getting penalized for hate speech because they're reporting on hate speech, specifically the Dylann Roof situation. It's not easy to disentangle.
Elie: No, it is, because we agree that the robots are bad, but I think that we can all agree that Talking Points Memo is a decent site. Info Wars, on the other hand? Google and Facebook and whatever should slam them. Why would that be so hard? Here's the other thing: if you really don't think that we yet have the technology and the resources necessary to police these sites better, how about we go the other direction? How about we just out people?
If you, Twitter, are going to tell me you can't tell who's threatening to kill me, just tell me who it is. Just tell me who it is and I will handle it myself. What's wrong with that?
Corynne: See now, he's just trying to piss me off.
[laughter]
Corynne: What we're talking about now is a step further: it's social media companies and intermediaries, by the way, all the different people that you interact with, taking it upon themselves to out you, to pierce your anonymity. That is profoundly, profoundly dangerous. Anonymous speech is probably the most important form of political speech that we have. The ability to speak, especially online, without fear of retaliation means that you have the ability to speak your truth. If we out people, if we accept that social media companies should be judge and jury over that, they will just expose people to the world without any choice, without any recourse, because once you're outed, there is no appeal.
Elie: Because the thing that we used to have as a society to protect ourselves from these people was called shame. We could shame them into being part of the herd, and if they didn't want to be part of the herd, we could know who they are and say, "Hey, guess what? You're no longer part of the herd." Shame is a powerful weapon that we used to have, and Twitter has taken it away from us, and that is why these people are allowed to multiply.
Corynne: That weapon was also used to persecute minorities all over the-
Elie: Everything was always used to persecute minorities at some point.
Corynne: It's still used to persecute minorities.
Elie: The fact that something has been used to persecute minorities doesn't mean that it can't also be used to stop Nazis. That's just-- clocks were used to persecute minorities when they weren't paid by the hour, but clocks are still a good thing.
Corynne: The one thing we have always understood in this country and this is before the First Amendment is the importance of anonymous speech.
Jad: Let me just jump in for a second. We have a question here on the right.
Audience member: I just wanted to say something about what someone asked earlier: "Is there a moral reason that Twitter or the government should lean towards free speech?" I personally am someone who used to have [unintelligible 00:31:25] views, and I was raised as fundamentalist Christian as you could get. My views about gay people, had I spoken them on the internet, probably would have set off some hate speech alarms. It was not shaming that changed my mind.
I encountered people who were engaging, who treated me like a person, even though, had there been Twitter back then, I would have been a troll, and it changed my mind. I don't know if you guys are familiar with the Westboro Baptist Church. They fought a Supreme Court case and won. They have really the worst views of any group that considers themselves Christian that I can think of. The person who ran their Twitter is a friend of mine, Megan Phelps-Roper.
She has this great story about how she used Twitter to essentially spread terrible hate speech, saying things like, "Thank God for AIDS," for killing gay people. It was through Twitter, and through the arguments she got in, and then through the relationships she formed, that she found a way out of that bubble she lived in, and now she is out in the world doing amazing work.
If what you want, Elie, happens, that troll that you want to shut up, that Klansman you want to get rid of, he doesn't go away. The mold grows in the shadow, and it dies only in the sunshine, only when you get it out in the open and we have these conversations. As a former believer in some of this stuff: don't lose heart. We can have our minds changed. We can be convinced of the truth.
Elie: I respect your story and I'm very glad that you were able to get to where you are. However, it turns out that I believed what you want me to believe for a good, oh, I don't know, 28, 29 years of my life. I am a 40-year-old Black man. I am sick of being the educational PBS afterschool special for racist white people. Gay people are sick of being the ABC afterschool special for white people. Women are sick of being the afterschool special, trying to teach the white man why they also should have rights.
It is simply no longer acceptable for you to expect other people, people just trying to go about posting their dinner recipes on Facebook, to educate you. It's ridiculous for you to think that we should still have the burden of educating you. You should go get educated somewhere. That can't be on us all the time. And I'm willing to do it. I'm willing to do it here.
I'm going to do it in public. I'm actually willing to go to a bar and have a drink with people that I can't stand, but at some point when I just want to get on Facebook and see the Mets score, I shouldn't have to hear your bullshit.
Corynne: I don't actually think that was what he was saying at all.
Elie: Entirely possible.
Corynne: Someone should say that with a microphone. I think he was just saying silos are bad.
Elie: That's what I'm saying. That is what you're saying: silos are bad, we should all be together. And then, no.
Corynne: I think he's saying that if we don't talk to each other, nobody's mind ever changes.
Jad: I'm going to jump in now. I think Elie and Corynne have done all they can to persuade you guys. Who thinks that Twitter and Facebook and such should take a strong hand in severely limiting online speech? Those of you who think so, clap.
[applause]
Jad: Those of you who disagree with the asshole clapping to your left, make some noise.
[applause]
Jad: I believe that means that you are the victor.
Corynne: The Internet wins.
Jad: Thank you to our debaters: Elie Mystal, More Perfect's legal editor and executive editor at Above The Law; Corynne McSherry from the Electronic Frontier Foundation; and Ken White from popehat.com. This episode was produced with Elaine Chen and the very excellent folks at WNYC's Greene Space. We had mixing help this week from Louis Mitchell. Supreme Court audio is from Oyez, a free law project in collaboration with the Legal Information Institute at Cornell.
Leadership support for More Perfect is provided by the Joyce Foundation. Additional funding is provided by the Charles Evans Hughes Memorial Foundation. See you next week.
[music]
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.