Whistleblower Exposes Facebook's Prioritization of Profit Over People
[music]
Melissa Harris-Perry: Welcome to The Takeaway. I'm Melissa Harris-Perry. I'm here at the WNYC studios with the morning production team. First, let me introduce you to my director, JC. Say hi, Jay.
JC: Hi, Jay.
Melissa Harris-Perry: Jay, I need your help with telling this next story. I want to review some of the big moments over the last 15 years, and I just want you to keep us up to date on one statistic as we move along. Let us know the number of Facebook users worldwide.
JC: You got it.
Melissa Harris-Perry: September 2008. There's a subprime mortgage crisis shaking the foundation of the American economy. Lehman Brothers collapses. Senator Barack Obama, the first Black nominee of a major party, pulls ahead in the presidential race.
JC: At this point, Facebook has 100 million users worldwide.
Melissa Harris-Perry: Let's skip forward. November 2011. A massive nor'easter hits the East Coast, leaving millions without power. The first skirmishes of what will become the Arab Spring begin, and 20 are killed in a Syrian uprising.
JC: Now Facebook has 845 million users worldwide.
Melissa Harris-Perry: Now it's January 2015. Twelve journalists are murdered by gunmen who storm the offices of Charlie Hebdo in Paris, and Boko Haram captures a major town in the Borno region of Nigeria.
JC: Now we're up to 1.5 billion.
Melissa Harris-Perry: Come forward to January 2017, when reality TV star Donald Trump is inaugurated as the 45th president of the United States, despite never previously holding public office.
JC: Up, up, up, 2.1 billion users worldwide.
Melissa Harris-Perry: Then December 2020. The world has shut down, and the global coronavirus pandemic has claimed 1.4 million lives.
JC: All the way up to 2.9 billion users worldwide.
Melissa Harris-Perry: Then, Jay, Monday, October 4th, 2021, at noon Eastern.
JC: Facebook has zero, zero anything. Facebook is just down.
Melissa Harris-Perry: Once again, we've been in the billions, and then on Monday, we're at zero.
[music]
Melissa Harris-Perry: It might seem silly to describe the outage of a social media networking site as disastrous, but across the past 15 years, the exponential growth of Facebook has been relentless. Massive economic and political shifts have taken place domestically and internationally. The world has lost tens of millions to famine, war, disease, and pandemic; new leaders and nations have emerged; and all along, Facebook has done just one thing: grow. The Facebook, Instagram, and WhatsApp outage was remarkable. This six-hour mass outage was the longest stretch without Facebook the world's billions of users have experienced since its worst downtime, nearly 24 hours, back in 2019.
See, the timing of this 2021 downtime fueled suspicions and prompted conspiracy theories, although it was harder to spread those theories, what with Facebook being down and all. The day before the mass outage, Facebook made headlines when former employee turned whistleblower Frances Haugen went public with evidence that Facebook knowingly chose profits over public safety. On Tuesday, the day after the world lived six hours without liking, sharing, or going Live, Congress heard Haugen's testimony.
Frances Haugen: I'm here today because I believe Facebook's products harm children, stoke division, and weaken our democracy. The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people.
Melissa Harris-Perry: Elected representatives from both parties expressed outrage, frustration, and a willingness to respond with regulation.
Speaker 4: This research is the very definition of a bombshell. Facebook and Big Tech are facing a Big Tobacco moment.
Speaker 5: I think that it will be this Congress and this subcommittee that is going to lead the way to online privacy, data security, and Section 230 reforms.
Melissa Harris-Perry: Just what can Congress do? After all, there are only 535 members of the House and Senate, each making about $175,000 a year. Meanwhile, Facebook has over 2--
JC: Billion users worldwide.
Melissa Harris-Perry: CEO Mark Zuckerberg? Well, he makes more than $25 million a year. Can David regulate this digital Goliath? Here with more is Cecilia Kang. Cecilia is a national technology reporter for The New York Times and co-author of An Ugly Truth: Inside Facebook’s Battle for Domination. Cecilia, welcome to The Takeaway.
Cecilia Kang: Thanks so much for having me.
Melissa Harris-Perry: I want to start with the outage. Is there any reason to believe that it's at all connected to this week's whistleblowing in the media and in Congress?
Cecilia Kang: There's no evidence yet. Facebook has said this has to do with errors related to the servers that connect to its sites and apps: Facebook, Instagram, WhatsApp, and Facebook Messenger. We have to take them at their word at this point that it was a basic IT malfunction, but it was an extraordinary outage. The timing could not be worse, but there is no conclusive evidence that it's linked to the whistleblower.
Melissa Harris-Perry: I was wondering about that idea that the timing couldn't be worse, because I'll say, in our production meeting with The Takeaway team, one of the first things that emerged on Monday was a conversation about whether maybe what we need is individual boycotts, people going off of Facebook. Then later that day, Facebook going down almost felt like, "Hey guys, just so you know, this is what the world would feel like without us."
Cecilia Kang: I definitely saw on social media, as well as talking to people in real life, that people in the US felt it was a refreshing break. I should say that in other countries, where Facebook, and particularly WhatsApp, but really all the apps, is so necessary for communication, the outage highlighted that Facebook is a utility in so many countries and for so many users around the world, because it is really their on-ramp onto the internet and their only way of communicating.
Melissa Harris-Perry: If it is, in fact, a utility, and a really odd one, one we don't quite have another example of, a global utility that is nonetheless potentially going to be regulated in the US, should Congress, in fact, be the one regulating it?
Cecilia Kang: I think the evidence is mounting, based on my own research with my colleague Sheera Frenkel for our book, as well as what the whistleblower Frances Haugen has revealed, that change will not come from within. The company is led by one individual, and that individual, Mark Zuckerberg, has set a culture that prioritizes growth and prioritizes engagement, getting people to come back more and more, and that aims and designs all of the technology tools to get people to engage in ways that prioritize growth. The collateral damage, or the consequence, of that is that you have very toxic and harmful content that tends to thrive on these apps.
It's a dangerous situation, because change is not coming from within. Internally, Mark Zuckerberg has said, even last night, that all of the criticism is misguided and that the company is misunderstood; they're really digging in their heels. The only way to see change has to be external. Increasingly, you're hearing experts, and lawmakers across the aisle, I should say, arguing that it has to be through regulation, and regulation that really strikes at the heart of Facebook's business model.
Melissa Harris-Perry: Let's pause for a moment and go back to Haugen a bit. Can you talk about what kind of research and documents she has?
Cecilia Kang: Yes. Frances Haugen, who was a product manager on the civic integrity team, obtained tens of thousands of documents. Most of them have to do with research that was done internally, which shows that the company has known about and has been worried about the flood of toxic misinformation and toxic content that tends to be prioritized on the site. Specifically, there are documents related to Instagram and the harms it poses to teenagers, and research showing that teenagers say, simply, that they don't feel good about themselves after using Instagram. One in three teenagers surveyed said that they felt their body image was worse from using Instagram.
There's other research showing that Facebook decided to prioritize very engaging or agitating content ahead of the January 6 riots. There is other research showing that Facebook has systematically under-resourced international content moderation in countries where disinformation on Facebook has thrived, such as Ethiopia and Myanmar. It was a really wide array of different types of research with the same central theme and pattern emerging over and over again: Facebook has known for a long time that problems exist.
Very, very deep problems exist within the systems of the company; misinformation and harmful content tend to thrive; and Facebook has, many times, ignored these warnings, and has certainly not disclosed all the research that Frances Haugen brought to the public.
Melissa Harris-Perry: I'm wondering if, in some ways, it's the fact that Facebook knew: that they did the research, that the research demonstrated these harms, and that they then did not adjust their business model or even inform the public so the public could make new consumption choices. Is that what makes the parallel we heard about with Big Tobacco, this idea that they knew?
Cecilia Kang: Yes, Melissa, I think that's exactly right. You can quibble with whether Facebook can be compared to tobacco and cigarettes. What you can't quibble with, and what seems like a very clear parallel, is that Facebook has said one thing to the public: that its products are safe, and that the net effect of Facebook's role in the world is positive, that it connects people and people feel good about it. That's the message they've held since Facebook's beginning in 2004.
What we've also seen, through this whistleblower’s leaks and through reporting by many reporters, is that Facebook has systematically not disclosed that it has internally known so much more about the dangers it poses, and that those messaging points are only one very small piece of the story of Facebook, which is that the company is internally grappling with a lot of the dangers and harms its platform produces.
Melissa Harris-Perry: How is Facebook responding to these documents and to this whistleblowing? Are they claiming it's not true? Are they claiming these are only one part of the documents and we haven't seen the rest? What's their defense?
Cecilia Kang: All of the above. First of all, they are trying to discredit Frances Haugen, the whistleblower, saying that she only worked there for two years, that she did not have direct reports, and that she does not have expertise in some of the areas she commented on in the hearing yesterday before the Senate panel. Mark Zuckerberg yesterday posted on his Facebook account an internal message to employees where he really dug in his heels and said, "Look, this is not a company that we recognize in the way the media is portraying us. It's not the company that we know to be truly who we are. The research that Frances Haugen has made public is taken out of context, and it is illogical," he said, "that we are trying to prioritize profits over safety."
He says that advertisers, and Facebook is in the business of advertising, "don't want to put their brands and their ads next to hateful content." I will note, Melissa, that thousands of brands have protested Facebook and have tried to get Facebook to solve its content problem. They indeed do not want to have their brands up against this content. What these advertisers also say is that Facebook is so dominant that even though they're upset about the way Facebook manages its content, they have no other choice but to advertise on Facebook, because that's where all the eyeballs are.
Melissa Harris-Perry: Cecilia, let's just dig in here a little bit. Explain a little more about this unlawful monopolization discourse.
Cecilia Kang: Right now, the Federal Trade Commission is seeking to break up Facebook through a lawsuit in US District Court. Their central argument is that Facebook has become a social media monopoly, and that it uses its monopoly power to quash rivals and ultimately harm consumers. Facebook adamantly disagrees. Interestingly, Frances Haugen, the whistleblower, says she does not believe antitrust is actually the right remedy: that by breaking up Facebook, you would starve the individual pieces of the resources they need to fight this problem of misinformation and hate speech.
Antitrust is one of many solutions being bandied about in Washington, and you are seeing real action, with the federal government trying to fight that way. There are real questions, and I've come to believe this in my own reporting, as to whether Facebook's business model, which is basically built on attention and agitation, can be fixed by breaking the company into pieces, or whether breaking it up just creates more problematic Facebook-like companies. It's a really interesting debate. Maybe the solution is that all of the above has to happen: antitrust action and regulation.
Melissa Harris-Perry: Say a little bit more. I'm fascinated by the challenge of trying to regulate something where the fundamental model is getting eyeballs through agitation, which it does seem will ultimately always lead to a race to the bottom in terms of content.
Cecilia Kang: Yes. The challenge is that in Washington so far, regulators have looked at the symptoms of the problem, like data privacy abuses and the spread of vaccine misinformation, and tried to regulate those specific things, but they haven't gotten to the core of the technology that enables those things to happen: really, the bad seed, or the real sickness, which is the algorithms that amplify the most agitating, and often the most harmful, content.
Yesterday, at the hearing, I heard something quite different. I heard lawmakers discussing very substantively, with Frances Haugen, the idea of regulating the algorithmic amplification at the heart of this business model, and regulating it by forcing more transparency. That could start with forcing the company to share information about how it decides to rank and amplify certain content within its news feed, through the likes and the shares and the comments, and how that kind of system, the algorithms, allows toxic and harmful content to spread so rapidly. By making that available to researchers, you would let researchers show how the problem of misinformation is spreading; sunlight as the best disinfectant could be one solution.
That was, I think, a first start. I think Facebook will be very reluctant to reveal too much about how its news feed works, especially to the public, and they have actually shut down a lot of research. They do have lots of partnerships with other researchers, hundreds, in fact, but very recently, Facebook cut off researchers at NYU who were studying the spread of political misinformation in the 2020 election.
Melissa Harris-Perry: Cecilia Kang is national technology reporter for The New York Times. Thank you so much for joining us today.
Cecilia Kang: Thank you so much for having me.
[00:17:02] [END OF AUDIO]
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.