BOB GARFIELD: This is On the Media. I'm Bob Garfield.
BROOKE GLADSTONE: And I'm Brooke Gladstone. This week abounded with stories of fake news.
[CLIPS]:
FEMALE CORRESPONDENT: People who got their election news on Facebook might have been looking at more fake stories than real ones.
MALE CORRESPONDENT: When it comes to news, not everything in our feed is legit.
MALE CORRESPONDENT: If it makes me laugh, then I might share it, if it’s very interesting –
FEMALE CORRESPONDENT: Google and Facebook are responding to accusations that fake news influenced the outcome of the election and fuels divisiveness across the country.
[END CLIPS]
BROOKE GLADSTONE: Craig Silverman of BuzzFeed identified the 20 fake election stories that performed best on Facebook - that is, had the most likes, shares, and comments - and compared them with the 20 top-performing real news stories. He found that in the three months leading up to Election Day the top fake election stories generated more engagement on Facebook than the top real news stories. Of the top five fake stories on BuzzFeed’s list, four were anti-Clinton, with headlines like, “WikiLeaks confirms Hillary sold weapons to ISIS.” And one was pro-Trump, about Pope Francis endorsing him for president.
In contrast, the stories from real news sites that prompted the most Facebook reactions tended to favor Hillary Clinton. Silverman covers hoaxes and online misinformation for BuzzFeed, and he concedes that the sample was limited by the data he could glean from Facebook.
CRAIG SILVERMAN: Certainly, we’re not claiming that in total fake news sites and fake news beat out mainstream news because, obviously, there's a huge amount of mainstream news, and fake news sites put out stuff daily but not at the same pace. And so, by specifying that it's viral, we’re really talking about the top 20 hits from the fake sites and fake stories versus the top 20 from about 19 mainstream news organizations.
BROOKE GLADSTONE: Okay. Now, you had a chart showing that, right around Election Day, the fake news got a lot more engagement than the real news, which would lead any casual observer to think that it was being consumed at a greater rate.
CRAIG SILVERMAN: Well, it was definitely being engaged with at a greater rate, and this is where we have to be really clear about what Facebook gives us and what it doesn't give us. We know that the fake stories were getting more comments, more reactions, more shares, in total. We don't know, for example, that they necessarily got more traffic. In fact, it's probably safe to assume that those [LAUGHS] mainstream sites, which are, you know, very big and have loyal audiences, were still getting bigger traffic, even though their biggest stories were getting less engagement on Facebook.
And that was one of the really surprising things to me. One part is the spike in engagements for fake news when you get into that, you know, August-to-Election-Day period. But, at the same time, from February until Election Day, the trend line on engagements for those top stories from mainstream outlets was consistently going down, and I don't really know why there was such a decline.
What are some of the possibilities? Well, one of them is simply: were the stories they were reporting just not as interesting to people on Facebook? Another one that people have raised, and that's really impossible for us to know, is that around the middle of the summer Facebook announced a change to its newsfeed algorithm that was aimed at surfacing more personal updates and personal news to people. And, you know, there was a decline after that.
BROOKE GLADSTONE: The algorithm changed, why?
CRAIG SILVERMAN: What Facebook said is that, for the newsfeed, rather than having it totally dominated by, you know, news articles and content from outside sites, they wanted to tweak the algorithm to make sure that people were getting the updates from friends, family members, and others that were really important in their personal lives.
BROOKE GLADSTONE: Right, and that may very well have diminished the stream of legitimate news flowing into Facebook.
CRAIG SILVERMAN: You would think a tweak like that might affect, you know, mainstream news as well as fake news because they would all fall into the category, to a certain extent, at least, [LAUGHS] of being news. And that's why I want to be cautious about assuming that that was a factor or a big factor. And, you know, we know that Facebook is the biggest driver of audience that there is. It’s bigger than Google now. And so, when Facebook tweaks its algorithm and a reduction in the engagement for these top 20 hits started to really take hold, inevitably, I think, people in media are going to look at that and speculate about it.
BROOKE GLADSTONE: Well, given that Facebook tweaked its algorithm to make it more responsive to people's personal networks and personal interests, rather than things coming in sort of over the transom, and given that heightened engagement with fake news, do you see this as a media problem, a Facebook problem, or a human problem?
CRAIG SILVERMAN: Maybe it’ll sound like a copout but I’m gonna check all three boxes. I think that at the core of this there's complexity. You know, I've been seeing some people saying that, oh, fake news gave Trump the election. And we have no evidence to support that. He probably won for a variety of reasons. And fake news spreading is probably doing so for a variety of reasons.
We've always had hoaxes; we’ve always had misinformation. The introduction of Facebook and the scale that it has - there are more than 1.7 billion people around the world who log in to Facebook every month - combined with the natural human behavior where we love information that confirms what we think and believe but like to stay away from stuff that doesn't - I mean, you put the human behavior together with the platform, with the tendency for misinformation that's always been there, and I think we have an environment that is rooted in what's been going on for a long time but is different because of how big the platform is.
BROOKE GLADSTONE: Craig, thank you very much.
CRAIG SILVERMAN: Thank you.
BROOKE GLADSTONE: Craig Silverman is a reporter with BuzzFeed News.
[MUSIC UP & UNDER]
BOB GARFIELD: Now, just under 45% of Americans get their news from Facebook, but Facebook CEO Mark Zuckerberg insisted that 99% of what users see on the site is authentic and that it was, quote, “crazy” to say that fake news on the site could have swung the election. But, that said, he vowed to make changes.
John Herrman covers the media for The New York Times. John, welcome back to the show.
JOHN HERRMAN: Thanks for having me.
BOB GARFIELD: Facebook and Google announced that they’d each prevent fake news websites from getting access to their ad networks, trying to starve them of the revenue they need to stay in business. How does that work, and will it work?
JOHN HERRMAN: So for Google, banning a site from AdSense can have a real effect. AdSense can be used to promote a website - as in, a website owner or a news site owner could promote their content through AdSense. But, more probably, AdSense is used to put ads on the website, to put ads under the lies, if that’s what you’re publishing. For Google to knock people out of that ecosystem is definitely a hit.
Facebook publishers don't tend to make money that way. The small hyper-partisan Facebook pages, they lead to outside websites. They use Facebook to gather people; they don't use Facebook to make money from those people. For that, they might use Google on their website or, more likely, they’ll use sort of a low-rent ad network, the boxes under articles with skincare tips and conspiracy theories and stuff like that.
BOB GARFIELD: Meantime, I'm just curious by what means anybody expects Facebook actually to, first of all, monitor the entirety of [LAUGHS] what’s shared on its platform. It’s not as though they have thousands and thousands of fact checkers. They don't. What would it have to do to intervene in order to keep the phony stuff out of the system?
JOHN HERRMAN: There are things that Facebook could do to address this problem, as it's been narrowly defined. They could take outside links to websites that have reached a certain number of shares, have an editorial team apply some sort of scrutiny to those things and then eventually penalize the sites that they’re coming from, if they are repeat offenders. This is possible.
They could likewise rely on a sort of user-reporting framework, like they do for harassment, or, you know, if we’re gonna compare it to another giant online marketplace, an eBay rating system. The editorial team, the sort of internal Snopes, is something that could be applied to things that look like news or purport to be news articles. These are all things they could do. It would give Facebook the ability to sort of close off this conversation and say, we've done something, while next to those articles you have image memes that contain false information that have been shared a million-plus times, that don't claim to be news articles but do claim to tell you facts.
Videos that have been shared millions of times that contain obvious falsehoods - maybe they have little chyrons on the bottom. Do those then come under the umbrella of the news administration or news moderation team? There are just endless sorts of almost-fake, almost-news that such a solution wouldn't have to touch and that these proposed solutions don't really ask it to touch. And so, I do feel like the fake news discussion, as it narrows down onto these sites that look like news sites that publish stories - I feel like that could be leading us into kind of a cul-de-sac.
BOB GARFIELD: So how do we get back onto the highway? Do you have any ideas as to what Facebook or separately Google or Twitter can do to keep its users from being deceived, or worse?
JOHN HERRMAN: I mean, no. It [LAUGHS] - I don't mean to suggest that this is impossible. I just mean to suggest that that conversation is one we really haven't started to have. The big conversation is about Facebook's accountability in the vast spaces in which it operates. And Zuckerberg's initial response, the most defiant response, was saying it’s absurd to call us a media company just because there's news on our platform, or an aerospace company because we’re flying drones, or a B2B company because we sell software to businesses.
And yet, I think the challenge to this and the challenge for this whole discussion is to ask why that's absurd. Why - and this needs to be defended - is Facebook not simply all of those things?
BOB GARFIELD: So wherever you turn in the last few days, the conversation is fake news, fake news, fake news - kind of closing the barn door after the cow has fled. How is it that we weren’t hip to this and sounding the alarms, oh, I don’t know, six weeks ago?
JOHN HERRMAN: A lot had been written about the subject, and anyone who used Facebook could have easily intuited that a lot of what they were seeing was coming from sources that might not be credible and that, in each specific instance, might be borderline absurd. This was sort of part of the broader problem of people simply not believing that one candidate could win over the other. It made fake news into a curiosity or a worry. Only now, in retrospect, is it widely being seen as a deep problem.
BOB GARFIELD: John, thank you so much.
JOHN HERRMAN: Thanks for having me.
BOB GARFIELD: John Herrman is a reporter for The New York Times.