[music]
Melissa Harris-Perry: This is The Takeaway, I'm Melissa Harris-Perry.
Frances Haugen: I'm here today because I believe Facebook's products harm children, stoke division, and weaken our democracy.
Melissa Harris-Perry: That's Facebook whistleblower Frances Haugen, testifying earlier this month before a Senate subcommittee.
Frances Haugen: The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people.
Melissa Harris-Perry: Haugen was discussing how the world's largest social platform allows for the spread of misinformation and exploits the fears and insecurities of minors and vulnerable populations.
Frances Haugen: This is not simply a matter of certain social media users being angry or unstable or about one side being radicalized against the other. It is about Facebook choosing to grow at all costs, becoming an almost trillion-dollar company by buying its profits with our safety.
Melissa Harris-Perry: This week, Facebook is facing its latest round of scrutiny after a collaboration involving journalists from 17 American newsrooms started publishing a series of stories they are calling The Facebook Papers. These newsrooms are sifting through thousands of pages of internal documents, initially obtained by Haugen. The reports are both wide-ranging and alarming.
They cover how the social media platform, which also owns Instagram and WhatsApp, helps spread misinformation, does little to moderate violent content in some parts of the world, and has failed to crack down on content related to human trafficking. Today, we're going to take a look at one slice of this reporting. An analysis from The New York Times shows new details on the role that Facebook played in spreading misinformation ahead of the 2020 election and election conspiracies ahead of the January 6th attack on the Capitol.
Facebook's CEO, Mark Zuckerberg, testified before Congress in March, downplaying Facebook's role in the January 6th attack.
Mark Zuckerberg: We did our part to secure the integrity of the election, and then, on January 6th, President Trump gave a speech rejecting the results and calling on people to fight.
Melissa Harris-Perry: Just days after that insurrection, Facebook's Chief Operating Officer, Sheryl Sandberg, said this in an interview with Reuters.
Sheryl Sandberg: Well, we know this was organized online. We know that. We, again, took down QAnon, Proud Boys, Stop the Steal, anything that was talking about possible violence last week. Our enforcement's never perfect so I'm sure there were still things on Facebook.
Melissa Harris-Perry: The New York Times report shows that employees repeatedly raised red flags about the mismanagement of content and far-right groups. For more on this, we have Ryan Mac, technology reporter at The New York Times. Welcome to The Takeaway, Ryan.
Ryan Mac: Thanks for having me.
Melissa Harris-Perry: What are some of the things that you found in this analysis of Facebook's role, relative to January 6?
Ryan Mac: There are thousands of pages to these documents that have come through. For this specific analysis on the election and January 6, there were many red flags that Facebook employees had raised, not just in the days or months before the election and before January 6, but in the preceding years, about how Facebook can radicalize and polarize some users. That was what we focused on in building the story, as well as some of the details surrounding what was happening inside the company in those crucial days following the election and ahead of the Capitol riot.
Melissa Harris-Perry: Now, Ryan, it's always possible that there are simply negative, unforeseen externalities that occur, as a result of how any given company does business. Is that what we're looking at here, with Facebook? Is this unfortunate, maybe horrible even, but not anything that could be expected, or is there something somewhat more malicious or negligent going on here?
Ryan Mac: I think it has to be understood that Facebook actually did allocate a lot of resources toward protecting the election. This was an effort that Mark Zuckerberg had outlined as a priority for the company heading into November 2020. They'd been preparing for months. They had done things like build a voter information center, they had registered voters, they had [unintelligible 00:04:35] policies to go after voter misinformation.
I guess even still, with all those initiatives, with all those efforts, with all that focus, the company's employees still found ways by which the election was being undermined on their platform. I think with that, it just shows the scale at which Facebook operates. Even though it's catching, let's say, most voter misinformation, some stuff was seeping through.
This was outside of the documents, this came up in our own reporting, but for example, in the days following November 3rd, election day, we had a post from a data scientist, who said that voter misinformation, election fraud claims, actually made up 10% of all views of political content in the US, which is a staggering figure, if you think about it. 10% of all things viewed about politics in the US were claims of election fraud. That's the scale that we're talking about here when we're looking at the stuff on Facebook.
Melissa Harris-Perry: Say a bit more about-- From these papers in particular, what employees were flagging internally at the company.
Ryan Mac: Sure. One of the studies that we wrote about, as well as other outlets, goes all the way back to July 2019. In July 2019, there was this internal report titled Carol's Journey to QAnon. An internal researcher basically created a fake account-- Not a fake account, a model account modeled after a conservative mom in one of the Carolinas. The account was started by following a couple news pages, Fox News, and I believe Sinclair Broadcasting was one of the other pages.
They just followed the process by which Facebook would recommend content to that page, through its recommendation algorithms. Within three days, that account started to get content related to the conspiracy theory QAnon, which showed how fast a newly created account could go from following something like Fox News to QAnon. Those recommendations continued for the next couple weeks, to the point where the account was being recommended QAnon groups, QAnon pages.
That was a key study or I guess a key document in showing how Facebook's algorithms can polarize or radicalize people very quickly. That was one key document.
Melissa Harris-Perry: I want to dig in on that for just a second because we've talked a bit about algorithms on this broadcast before. I'm again fascinated by this idea that you show up, you follow what is-- for whatever opinions people have, Fox News is a mainstream news network, and within a few weeks, you're in these more radical spaces. Does that happen on the left as well? If I were to show up as a Democratic mom in Pennsylvania or something, would the algorithms then push me to some set of radicalized accounts on the left, or is it really a problem that is not just polarizing but polarizing in one direction?
Ryan Mac: I think every user is different but this same researcher actually ran a similar study doing just that. Modeling an account after someone on the left, following left [unintelligible 00:08:11] news sources, and also received recommendations for pages that spread misinformation. Things like sharing old news stories as something that was new, memes that were not quite correct about Donald Trump, that kind of stuff, that in turn, also polarizes people on the left.
In that report of that account, I didn't see anyone getting pushed to, let's say, a conspiracy theory, for example. It does show that on both sides of the political spectrum, Facebook's algorithms, whether that's the algorithm that pushes groups you should join or pages you may like, can push people further in a direction away from where they started.
Melissa Harris-Perry: I think for many of us who are outside the tech world and watching this, it feels like there's an ebb and flow rhythm to Facebook regularly showing up in Congress, testifying, new reports. Then everybody signs into their Facebook account and talks about it. [laughs] Like, "Oh, my God, can you believe Facebook?" while posting on Facebook. I'm wondering, given the sort of, as you would talk about the size and scale, or we might just talk about the ubiquitous way that this platform is used, are we seeing any meaningful incentives for Facebook to shift things like its algorithms?
Ryan Mac: If you look back at earnings, which happened [unintelligible 00:09:48] second-quarter earnings, this is a company whose products are now used by 3.5 billion people around the world. Its profits grew, I think, more than 15%, around 17% yesterday, to $9 billion. This is a massive company, and in spite of all this coverage, all these controversies, controversies that have dated back to the election of 2016 and to Cambridge Analytica in 2018, this company still continues to grow.
I think that's the dissonance that we're talking about here, between its financial performance and how many people use the thing, and what we talk about, what we read in the news, what we see in these papers. I don't know, maybe there's a lag in that catching up to the public perception of the company. I don't know, I think a lot of people understand this coverage, but at the same time, they continue to use its products, whether that's Facebook, the blue app, or WhatsApp or Instagram. I think they're, like you said, ubiquitous in our lives.
Melissa Harris-Perry: This is just super quick here, but this is not just a US issue, there are also some global issues. What's going on with Facebook in India?
Ryan Mac: Ton of issues in India. Where to start? One of the findings that we had in our stories was that Facebook didn't allocate enough resources to moderating all the different languages in India, allowing far-right pages and accounts to continue to spread misinformation and hate, despite its rules, and in spite of employees flagging the issues. Of course, India is Facebook's largest market, so it's a sign of things to come in that country.
Melissa Harris-Perry: Ryan Mac is a technology reporter at The New York Times. Thank you so much for helping us understand The Facebook Papers.
Ryan Mac: Thank you.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.