Does Social Media Turn Nice People Into Trolls?
BROOKE GLADSTONE This is On the Media, I'm Brooke Gladstone. This was a historic week for tech on Capitol Hill. Charged with ammunition from the Facebook whistleblower, lawmakers are making moves to regulate tech companies.
[CLIP]
SENATOR BLUMENTHAL Big Tech now faces that Big Tobacco jaw-dropping moment of truth. [END CLIP]
BROOKE GLADSTONE House Democrats unveiled a bill that would chip away at Section 230 of the Communications Decency Act, a law that has shielded tech platforms from consequences for the content that flows across their domains. This bill came one week after the Facebook whistleblower Frances Haugen, gave damning testimony in front of Congress about Facebook's culpability for today's hostile political environment.
[CLIP]
FRANCES HAUGEN When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust. It erodes our faith in each other. It erodes our ability to want to care for each other. [END CLIP]
BROOKE GLADSTONE The bill is called the Justice Against Malicious Algorithms Act, and really, the name says it all: the Facebook algorithms are mean, and they're making the people who use Facebook mean too.
[CLIP]
FRANCES HAUGEN The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world. [END CLIP]
BROOKE GLADSTONE Part of why Frances Haugen's testimony had such an impact was because it confirmed what many people know, or feel they know. Like the feeling that Facebook's algorithms tricked them into sharing disinformation.
[CLIP]
FRANCES HAUGEN When people are exposed to ideas that are not true over and over again, it erodes their ability to connect with the community at large because they no longer adhere to facts that are consensus reality. [END CLIP]
BROOKE GLADSTONE But how well do we really understand the problem of misinformation? According to Michael Bang Petersen, a political science professor at Aarhus University, not very well. He directs the Research on Online Political Hostility Project, which found that algorithms aren't making people meaner online; they were already mean when they logged on.
MICHAEL BANG PETERSEN I think that many of us have the intuition that social media platforms create psychological changes. And that idea is something that's also represented in the Facebook files, but the research that we have been doing suggests that that might actually not be the case.
BROOKE GLADSTONE We do know that these algorithms leverage neurotransmitters like dopamine to get people to engage. It seems to work, and people feel it. So are you arguing that your data suggest it doesn't have an emotional impact?
MICHAEL BANG PETERSEN No, that is not what I'm saying, but I think it's very, very important to understand what exactly social media are doing. Social media could potentially change us, turning nice people into trolls. But our research finds that is not really the way that social media works.
BROOKE GLADSTONE You know, I've always said what social media makes you is more of what you are going to be anyway. I think we're in fundamental agreement about that. I guess it's a question of degree. If you have permission or you find permission online for views that even you suspect are unacceptable or reprehensible, you will be encouraged to act on them or express them.
MICHAEL BANG PETERSEN I think that's absolutely true. One way to think about social media in this particular regard is to turn some of the ordinary notions that we have about social media upside down. And here I'm thinking about the notion of echo chambers. So we've been talking a lot about echo chambers and how social media are creating echo chambers, but in reality, the biggest echo chamber that we all live in is the one we live in in our everyday lives. I'm a university professor; I'm not really exposed to any person who has a radically different worldview or radically different life from me in my everyday life. But when I'm online, I can see all sorts of opinions that I may disagree with, and that might trigger me, if I'm a hostile person, and encourage me to reach out and tell these people that I think that they are wrong. But that's because social media essentially breaks down the echo chambers. I can see the views of other people, what they are saying behind my back. That's where a lot of the felt hostility of social media comes from. Not because they make us behave differently, but because they are exposing us to a lot of things that we are not exposed to in our everyday lives.
BROOKE GLADSTONE Wow, you have really flipped that on its head. Because most people think that because we can tailor our news feeds and the people we speak to online much more than we can in real life, living online isolates us more than even living in bubbled communities. But you're saying not so, and that this supercharges hostility.
MICHAEL BANG PETERSEN Exactly. So the research that we've been doing shows that the real difference between online and offline political discussions is that when it comes to online discussions there you are seeing a lot of strangers being attacked and being the target of hostility. But you don't see that offline. In our offline lives, there is a lot of hostility as well, but that happens behind closed doors in private. It happens in bars where we cannot hear what's going on, but we are exposed to all that when we enter the online realm.
BROOKE GLADSTONE When you started this research, you proceeded from the presumption that nice people become angry when they log on to social media because of this weird online environment.
MICHAEL BANG PETERSEN But when we began to actually do the research, we did not find this huge group of people who report being nice in face-to-face discussions but hostile in online discussions. Rather, we found the exact same thing as we find with political violence in general: that it's particular individuals who engage in it. And what really characterizes them is a personality that is focused on acquiring as much status as possible.
BROOKE GLADSTONE You said that you didn't notice people who were nice offline being jerks online, but there are plenty of anecdotal examples I can give of somebody who writes an unbelievably nasty letter. And if you respond politely, generally they'll respond incredibly nicely after that, and maybe even sometimes apologize, because they're not used to thinking of other people online as people. You don't think that the internet enables or disinhibits people to the point where people who at least act nice in the real world act differently online?
MICHAEL BANG PETERSEN I think that the same kinds of processes that happen online also happen offline. You can easily find people who apologize for stuff they said in face-to-face discussions as well. I do think that there is one difference which is important. Sometimes, when people are writing on Facebook or on Twitter, for that matter, or other social media, they are acting as if they're sitting down at the bar with their friends when no one is listening. And the key difference is that that's not how social media works. On social media other people are listening, other people are seeing what you are writing, and people will react to that. And when they do, you sort of realize, 'Oh, I said something I shouldn't have.' But it's not because people don't know what they are saying. They know exactly what they're writing, and they know exactly that this is something that hurts if other people read it.
BROOKE GLADSTONE You also discovered some common characteristics in people who share misinformation online. They know more about politics and are more digitally literate. They spread misinformation, you conclude, because they simply hate the other party more. How did you determine that?
MICHAEL BANG PETERSEN We got consent to connect survey data with people's behavior on Twitter, and then we looked at the kinds of information that people shared on Twitter and what was predictive of that sharing behavior in terms of psychological profiles and political profiles. The people who are sharing misinformation are not ignorant. They are used to navigating social media and the internet. They know more about politics than the average person, but where they're really different from the average is that they have much more negative feelings towards members of the other party. And that's really what predicts not only their sharing of fake news, but also their sharing of real news. They want to derogate people that they don't like, and they are sort of actively searching for information that they can use for that purpose.
BROOKE GLADSTONE So you're suggesting that sometimes they spread misinformation that they are fully aware is false, but it serves their goals?
MICHAEL BANG PETERSEN It's not that people look at information and then make a firm evaluation, saying, 'This is false, but I will share it anyway.' Truth simply is not what is relevant to the decision. What they look at is: is this useful for the particular purpose?
BROOKE GLADSTONE Did you find any correlation between political parties and the tendency to share misinformation?
MICHAEL BANG PETERSEN There is a much greater risk of sharing misinformation if you are a Republican than if you are a Democrat, and that is something that we have been spending a lot of energy looking into, trying to understand: why is there this difference? Some past research, which has found the same, has been arguing that, well, we know that there is a relationship between education levels and party choice, so potentially this is because Republicans have slightly lower education and therefore are not knowledgeable about what is true and what is false. But that's actually not what we are finding. The key difference is that the kinds of news that are available for these political purposes are different for Democrats and Republicans.
BROOKE GLADSTONE Wait a minute. What you're saying is that if Republicans want to find stories that are more negative toward Democrats, they are likely to go to sites where that stuff abounds. And there's also more misinformation on those sites.
MICHAEL BANG PETERSEN Exactly. So we analyzed huge amounts of news from all over the spectrum, and we found that the only sources which are extremely critical of Democrats and positive towards Republicans are these fake news sites. A lot of mainstream news media, at least in this period that we've been analyzing, which was during the Trump presidency, are portraying Republicans in more negative ways than they're portraying Democrats. And that means that if you are a very committed Republican who is looking for this kind of ammunition, you have the motivation to move to fake news sites and find that ammunition.
BROOKE GLADSTONE Does that mean that mainstream news is biased against Republicans?
MICHAEL BANG PETERSEN That is not something that we can conclude from this research. In general, it's extremely difficult to actually conclude that a media bias exists, because what you could argue is that it's not a bias in the media, it's a bias in reality, so to speak. That during this particular period of American political history, there was a lot of negative things to report on the Republican Party, exactly because of the behavior of Donald Trump, and that that was what drove the difference in reporting.
BROOKE GLADSTONE Is the basic conclusion here that misinformation or disinformation isn't as big of a problem as we may think it is?
MICHAEL BANG PETERSEN Misinformation is not in itself a big problem, so that's the good news. But the bad news is that it's probably a symptom of a much worse problem. And here we again come back to the polarization in society, because that is really what's driving the sharing of misinformation. I think we have been focusing a lot on the symptoms: Fox News, Trump, Facebook. But I think that there's some evidence that suggests that rising inequality over the last decades has been a fundamental driver of political instability in the U.S. and beyond. It's a problem in many Western democracies. That is at least where I would start to sort of look for solutions.
BROOKE GLADSTONE What about the attack on the Capitol on Jan. 6? Misinformation may not be the most important thing in the big picture, but it was misinformation about the election being stolen, mostly online, where people were organized that got them to the Capitol.
MICHAEL BANG PETERSEN I would say it differently. I would say that we had individuals who were predisposed to violence due to frustrations that originated elsewhere, and these people were essentially looking for a signal of when to engage in that violence. And the sharing of misinformation about the election was essentially the signal that they were looking for. So the misinformation about the election served a coordination purpose, but it wasn't such that people were manipulated into doing something that they wouldn't have liked to do otherwise.
BROOKE GLADSTONE And social media made that possible.
MICHAEL BANG PETERSEN Again, what social media does is connect people. Back in the day, you could have frustrated, violence-prone individuals in each town, but nothing really came of it because they couldn't be connected. Now they can connect very, very easily. And that means that they can all sit there and wait for the signal of when it is time to act, and social media helps that coordination happen.
BROOKE GLADSTONE Which brings us back to the centrality of something like Facebook.
MICHAEL BANG PETERSEN For sure. And my point is certainly not that social media is not playing a role, but I think it's extremely important that we are specific and precise about the exact role that they are playing. Social media is used as a tool for violence-prone individuals to accomplish the particular goals that they have, but it's not that social media, as such, is responsible for those feelings of frustration and that violence proneness.
BROOKE GLADSTONE So then do you think Congress has a role in disrupting the kind of connectivity that helps amplify voices of division and hate?
MICHAEL BANG PETERSEN That is a very, very important discussion. It's also a very, very tough discussion because the discussion about content moderation naturally also invites discussions about freedom of speech and how difficult it is to figure out what is true and what is false, because these things change. For example, the so-called lab leak conspiracy theory with regards to the coronavirus may not be such a conspiracy theory, after all. So the discussion about content moderation is very, very difficult. But I think we need to figure out how we can disrupt this kind of activity for individuals that seek to use it for purposes that are destructive. The connectivity of social media is a tool that can be used for good or for bad. And we need to figure out how we can make sure that it's mainly used for good.
BROOKE GLADSTONE Michael, thank you very much.
MICHAEL BANG PETERSEN You're welcome.
BROOKE GLADSTONE Michael Bang Petersen is a professor of political science at Aarhus University in Denmark.
Coming up, belief in salvation was once the province of religion. But computer science has transferred faith to the god in the machine. This is On the Media.