Facebook Whistleblower Senate Hearing Recap
![](https://media.wnyc.org/i/800/0/l/85/2021/10/AP21278634156142.jpg)
(Matt McClain / The Washington Post via AP, Pool)
[music]
Brian Lehrer: It’s The Brian Lehrer Show on WNYC. Good morning, everyone. Is Facebook as bad for democracy, as bad for children, as bad for racial and gender equality as it was portrayed to be in congressional testimony yesterday by whistleblower Frances Haugen?
Should the government take action to reduce Facebook’s potential for personal and social destruction? If so, what would that government action be? If the right and left are both so angry at Facebook, but for politically opposite reasons (censorship of the right versus enabling of right-wing disinformation), how do we get to the truth about its real impact? Listen to this from yesterday’s hearing, from the very conservative Republican Senator Roger Wicker of Mississippi.
Roger Wicker: [I talked] to an opinion maker just down the hall a few moments before this hearing. This person said, “The tech gods have been demystified now.” I think this hearing today, Mr. Chair, is a part of the process of demystifying big tech. The children of America are hooked on their product. It is often destructive and harmful, and there is a cynical knowledge on behalf of the leadership of these big tech companies that that is true.
Brian Lehrer: Roger Wicker from Mississippi. Here’s the very liberal Democratic Senator Edward Markey of Massachusetts.
Edward Markey: Here’s my message for Mark Zuckerberg, your time of invading our privacy, promoting toxic content, and preying on children and teens is over.
Brian Lehrer: Can they both be right? Is this the only thing they agree on, and if so, why? Here’s the whistleblower herself, Frances Haugen, who came with documents she took from the company, as well as her own opinions, like this one.
Frances Haugen: The algorithms are very smart in the sense that they latch on to things that people want to continue to engage with. Unfortunately, in the case of teen girls and things like self-harm, they develop these feedback cycles where children are using Instagram to self-soothe, but then are exposed to more and more content that makes them hate themselves.
Brian Lehrer: Mark Zuckerberg posted a response last night denying that the company puts profits over the safety and well-being of its users, certainly with respect to children, he said. He said the argument that “we deliberately push content that makes people angry for profit” is “deeply illogical.” That from Mark Zuckerberg; we’ll explore it. On MSNBC this morning, the head of the NAACP, Derrick Johnson, saw Facebook’s algorithm as contributing to white supremacy and the violence that it spawns.
Derrick Johnson: Facebook is the super spreader of hate. We said this last year, when we helped organize Stop Hate for Profit with the Anti-Defamation League, Color of Change, and others, because we began to see, whether it was the synagogue in Pittsburgh or the Black church in Charleston or Louisville, Kentucky, that the common denominator was Facebook.
At some point, we’re going to have to address it, get in front of the bear, and not allow Zuckerberg and his group to say they’re going to do some modifications. In fact, we should be talking about antitrust right now, because it is too big of a company to spread this type of harm, not only to individuals and communities but to our democracy.
Brian Lehrer: “Not only to individuals but to our democracy,” NAACP President Derrick Johnson. Notice there that he called for the government response to include antitrust measures. Would breaking Facebook up into several smaller companies fix the polarization and white supremacy problems? What else might be done?
With me now is New York Times National Technology Correspondent Cecilia Kang. She is also the co-author, with Sheera Frenkel, of An Ugly Truth: Inside Facebook’s Battle for Domination. Cecilia, thanks for some time today, on what I’m sure is a very busy day after that hearing, with so many reactions coming in. Welcome back to WNYC.
Cecilia Kang: Thank you, Brian. Happy to be here.
Brian Lehrer: About the documents first. Of what whistleblower Frances Haugen revealed, I think the most attention before this week has been on how Instagram hurts the self-image of girls, and on the documents showing the company knew it and didn’t change to address it. How much is that backed up by the documents you’ve been able to see?
Cecilia Kang: The focus has been on teens and Instagram. Research that Facebook has done internally shows, as you said, that, for example, one out of three teenagers told Facebook’s researchers that they felt worse about their body image after using Instagram. Sixteen percent of users in the UK who had suicidal ideation said they could trace those thoughts and feelings directly to Instagram. Some pretty disturbing findings.
There was a lot of other research, Brian. There’s a whole mountain of research that shows a pattern of the company researching and knowing of harms, such as the spread of disinformation in countries like Ethiopia and Myanmar that led to genocide, and not putting the proper safeguards in place.
Other documents and research memos look into how the algorithms have amplified far-right groups in the US, which led to things like the organization of the Capitol riot. There have been research memos and emails from people within Facebook that discuss warnings that the company does not have vaccine misinformation under control. None of this was disclosed to the public until Frances Haugen came out and leaked the documents she had obtained.
Brian Lehrer: Zuckerberg’s defense is that it’s deeply illogical to think that Facebook’s algorithm rewards content that makes people angry. Do you know how he would develop that pushback, to argue that the algorithms do not do that?
Cecilia Kang: You know what’s really interesting about that statement last night, which he posted on his Facebook account, is that he talks about how it’s illogical because advertisers would simply not stand for having their brands and their ads placed alongside hateful and toxic content. He says, “This is just a bad business decision to have this kind of content spread throughout the sites and the apps.”
What’s really interesting about that, Brian, is that advertisers agree. They say, “We definitely do not want to be alongside this kind of content.” Facebook has a huge problem, and they’re not incentivized to change that problem. A lot of advertisers joined a boycott about a year ago called Stop Hate for Profit. It included really big advertisers like Procter & Gamble and Verizon, and many hundreds of smaller brands as well, that said, “We’re taking a stand, and we will, for one month at least, not advertise on the site.”
They all came back, and they all said, “The big problem, actually, is that the company is just too dominant and powerful. We cannot afford to not advertise on Facebook because of the scale and the number of eyeballs Facebook has.” They point to this as an antitrust problem. Mark Zuckerberg was saying that this is illogical because the advertisers are there. The advertisers are saying, “No, this is absolutely a problem, and we’re only there because you’re one of the only places where we can get as much reach as we need.”
Brian Lehrer: What about how he would push back on the idea that it’s bad for children? He also took exception to that in what he posted last night.
Cecilia Kang: What Facebook and Mark Zuckerberg emphasized last night is that among all the research there are also some positive findings: many teenagers say that connecting with people on Instagram has been a net positive for them. Facebook has really tried to train the public’s focus on those findings. Remember, the most difficult and most troubling findings were about poor body image, as well as a lot of teens generally saying they feel really poorly about themselves after using Instagram, and that they feel glued to Instagram.
I think what’s happening here is that Facebook has this pattern where, through their public relations, they try to emphasize the good and to reject and deny any accountability for the more negative aspects that are brought to light, in this case by Frances Haugen, the whistleblower. Numerous news media stories, including our book, show a pattern of one thing after another of the company deflecting blame onto others or denying that they truly have these core problems.
Brian Lehrer: Listeners, for any of you who may be confused about why we’re talking sometimes about Facebook and sometimes about Instagram in the same breath: Facebook owns Instagram, same company. Listeners, what are your evolving opinions about Facebook, your experience of Facebook as a force for good or evil in your own social network circles, including Instagram?
Preferences for regulating or not regulating in specific ways, we’ll get more into that, or anything else you want to say or ask with Cecilia Kang, National Technology Correspondent at The New York Times and co-author of An Ugly Truth: Inside Facebook’s Battle for Domination. 646-435-7280, or post your comment or question on Twitter. There’s no hashtag for this segment, just tweet @BrianLehrer; we’ll see your tweet and use relevant ones on the air.
Cecilia, if Facebook is pushing America toward polarization, it sounded yesterday like the exception to that is bipartisan unity on criticizing Facebook. We played those clips of Senators Wicker and Markey. How much of that did you hear yesterday?
Cecilia Kang: I heard incredible unanimity between Republicans and Democrats on the need for Congress to act with legislation to regulate internet companies like Facebook. For the last 30 years, there have been no new regulations on the internet companies. The only regulation that was created, back in 1996, was actually to shield internet companies like Facebook from lawsuits for the content that they host.
There is unanimity. How Washington can regulate this company, and whether lawmakers can craft the right regulation, is the big question going forward. There are a lot of options, and I heard really interesting proposals at that hearing as well.
Brian Lehrer: Would it be accurate to say, before we get into those specific proposals, that Democrats think Facebook enables right-wing disinformation too much and Republicans think it censors right-wing or conservative content too much, even though, we would think, both things couldn’t be true?
Cecilia Kang: Yes. That’s absolutely true, but you know what happened yesterday, Brian? At the hearing, there was less conversation about the specific content that the two sides are targeting their anger toward, if you will. The conversation was much more about the systems in place at Facebook that amplify harmful content. That was new.
That was absolutely new, to hear members of Congress like Senators Blumenthal, Marsha Blackburn, and John Thune from South Dakota talking about the algorithms and ranking systems behind the news feeds of Facebook and Instagram that amplify the most agitating content. This was really key. This is what Frances Haugen had talked about a lot; her specialty at Facebook was the ranking systems behind the news feed. Getting into, if you will, the belly of the beast of how Facebook operates was a really new conversation.
There were proposals brought up by senators, as well as by Frances Haugen, on how Congress can force or compel a company like Facebook to make public and transparent how its algorithms work: to compel Facebook, for example, to share its algorithms and ranking systems with researchers so they can understand the prevalence of misinformation and how hate speech spreads. This is really key.
Facebook operates in many ways like a black box. That was something Republicans and Democrats agreed on. They all said, “Facebook should no longer be a black box with secret algorithms in place that amplify certain harmful content. These systems and the way they work should be more transparent.”
Brian Lehrer: To that point, here’s another clip of whistleblower Frances Haugen in her testimony yesterday.
Frances Haugen: Almost no one outside of Facebook knows what happens inside of Facebook. The company intentionally hides vital information from the public, from the US government, and from governments around the world.
Brian Lehrer: You reported, Cecilia, that she called for Congress to seek other data from the company that could lead them to the right kinds of regulation. What kind of data would that be? Is that clear?
Cecilia Kang: She wasn’t specific, but I think it was generally in the realm of research and findings on how the company operates its ranking systems, how it prioritizes engagement over, say, healthy content. To understand, as in that Instagram report, why it is that a teenage girl who expresses interest in, say, weight loss or healthy food is pushed toward content that amplifies or encourages eating disorders and self-harm.
These are real-world things, Brian, that happen. In fact, Senator Blumenthal read the text messages of a parent whose teenager absolutely endured that kind of experience, where on Instagram they were pushed toward more and more harmful content, a rabbit hole, if you will, of harmful amplification.
Brian Lehrer: Let me read you two opposite tweets that have come in from listeners as our first listener comments in this segment. One says, “I deleted Facebook the day after Trump got elected because my newsfeed was so toxic. I decided I didn’t want to carry that negativity.” That’s one.
Here’s pushback from another listener who writes, “Is anyone reminded of all the congressional hearings and hysteria over comic books, video games, movie ratings? What was the result of any of that? Not much. This congressional show is going to result in nothing. Energy could be spent on bigger issues.” What do you think about that last comment, Cecilia, which labels this as the same hysteria that attached in previous generations to other popular media?
Cecilia Kang: I would say that this is a different type of technology and media, if you will. This is a company whose technology amplifies content faster and in a more pronounced way than any technology before it. Facebook often likes to compare itself to the telephone or broadcast television.
They say that every time a new communications tool like Facebook has been introduced in history, there has been mass hysteria about the disruption that the new communication tool brings. After using the telephone for a while, and after watching broadcast TV, the public usually calms down and realizes that, net-net, these communications tools are positive. But Facebook is very different.
Facebook has, inside its systems, amplification and ranking systems that promote the most agitating content. Facebook is in the business of agitation, and that could be positive agitation or negative agitation, agitating fear or agitating happiness. The nuanced, middle-of-the-road content that people don’t actually react to much is what they don’t want, because that’s not what brings you back.
That’s really different, in the sense that Facebook is able to measure your likes, your comments, your shares, and push that kind of content, through its algorithms and ranking systems, to the top of other people’s news feeds, promoting the most emotive content. It’s a very different kind of technology. A very smart person, Renée DiResta, an academic, explains it this way: “Facebook talks about its adherence to freedom of expression as a principle, but freedom of speech does not mean freedom of reach. It’s the reach that differentiates Facebook from other platforms.”
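To make that dynamic concrete, here is a minimal, hypothetical sketch of engagement-weighted feed ranking, the general pattern Kang is describing. Nothing here is Facebook’s actual code; the Post fields and the weights are illustrative assumptions, chosen only to show how reaction signals can push agitating content above calmer posts.

```python
# A minimal, hypothetical sketch of engagement-weighted feed ranking.
# Not Facebook's actual system: the Post fields and weights are
# invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Comments and shares are weighted above likes, so posts that
    # provoke the strongest reactions rise to the top of the feed.
    return 1.0 * post.likes + 5.0 * post.comments + 10.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm local news update", likes=120, comments=4, shares=2),
    Post("Outrage-bait claim", likes=80, comments=60, shares=45),
])
print([p.text for p in feed])  # the agitating post ranks first
```

Under weights like these, a post that provokes comments and shares outranks a calmer post with more likes, which is the feedback loop the hearing focused on.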
Brian Lehrer: Wouldn’t Facebook argue that the reach is based on a kind of democracy that social media allows? The reach depends on how many individual people find something interesting enough or meaningful enough to pass on to other individual people.
Cecilia Kang: Yes, but I would say that Facebook puts its thumb on the scale; it does not simply democratize the most popular content. What I mean by that is they have created carve-outs, or exceptions, for political leaders and celebrities. I’ll stick with political leaders: exemptions that allowed, for example, former President Trump to spread misinformation or to lie, in general, without Facebook actually taking down his account or posts. This was before they ultimately, after the January 6th riot, did finally decide to remove his account, at least for the next two years.
This sort of exemption often allows the politicians who understand the power of Facebook’s reach and its tools to amplify some of the most harmful and most dangerous rhetoric. That’s one thumb on the scale.
The other thing I would say is that Facebook allows misinformation and disinformation to travel, particularly in emerging-market countries where the company is present but doesn’t have enough people on the ground monitoring for harmful content and misinformation. In the case of Myanmar and Ethiopia, they have so few people, even today, after years of ethnic violence in those countries, monitoring for the disinformation that has encouraged that violence.
There is absolutely popular speech that spreads widely, but there’s a role for the company here: Facebook is pushing the most agitating content, often not putting the safeguards in place, and rewarding those who are able to generate the most agitation, like political leaders who understand the power of the tools that Facebook has.
Brian Lehrer: Desiree in Park Slope, you’re on WNYC with Cecilia Kang, National Technology Correspondent for The New York Times. Hi, Desiree.
Desiree: Yes, hi. I have used tech my whole life. I happen to be a Gen X-er, so I have experienced possibly every kind of technology communication device across my life. I think that we’re in a very particular position as Gen X-ers to look at how media affects us and affects young people. The thing that I’m not hearing anybody talk about in regard to Facebook and Instagram is people: the importance of critical thinking skills, the importance of people learning how to vet information, none of which is Facebook’s responsibility.
Facebook is responsible for its software, its platform, but at the end of the day you’re still dealing with people. You’re still dealing with: do I listen to the crazy man on the box on the corner telling me information about healthcare, or do I talk to a doctor? That’s a decision that you have to make in real life. It’s a decision you have to make when you use technology.
I would love to see some emphasis on how we train young people and older people to use social media, to vet things that they find on the internet, to know the difference between the World Wide Web and the internet. That information is vital to making these kinds of reach issues less impactful. Whether you have an algorithm that pushes certain kinds of content or not, there’s still a person making a decision about whether or not to look at it.
Brian Lehrer: Cecilia, talk to Desiree.
Cecilia Kang: I think it’s such an important point, Desiree. I’m glad you brought that up, because media literacy is incredibly important. I do think that the reporting by The Wall Street Journal and many others, as well as the revelations from the whistleblower’s documents, have really elevated the conversation, so that we understand what is actually happening within Facebook and why you see what you see when you log on to the app.
Desiree: Can I just say one thing? I’ve tweeted this at Brian, but I remember on a show called Family in the ’70s, there was a character named Buddy, a teenage girl who was not very feminine. They dealt with some of the same issues that people are seeing young people dealing with on Instagram now. After-school specials dealt with those same issues. Those issues are always going to be there when teenagers are looking at things that they aspire to. That’s not a Facebook thing. That is how we [unintelligible 00:23:26] to think about themselves. [crosstalk]
Brian Lehrer: Desiree, thank you so much. Go ahead.
Cecilia Kang: I think what is, again, very different is that a teenager who might have looked at a fashion magazine in the 1980s or ’90s (and look, I’m a Gen X-er too) would maybe put a page from that magazine on her wall, and that would have affected her thinking. What Facebook’s algorithms do, and this is again what is different, is not actually analogous to the after-school specials and what was happening for teens of decades past. In the case of Instagram, and this was mentioned in the hearing, you have teenagers who will express, for example, “I think I like fitness,” and suddenly they’re pushed to content that becomes much, much more invasive and potentially harmful: “Okay, you like fitness; maybe you like dieting; maybe you like extreme dieting,” and maybe that pushes them to content related to eating disorders and self-harm.
Those are real-world examples of how the ranking systems work. That’s different from tacking that fashion magazine page to the wall, which, of course, was also a problem; this is a phenomenon that has existed for the longest time. What’s different now is that there’s a technology that facilitates even deeper and more troubling voyages, if you will, journeys for teenagers into often darker places.
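For illustration only, here is a tiny, hypothetical sketch of the interest-adjacency drift Kang describes, where each recommendation nudges a user one step further along a chain of related topics. The topic graph and function names are invented for the example; nothing here is Facebook’s actual system.

```python
# A tiny, hypothetical sketch of interest-adjacency recommendation.
# The topic graph is invented for illustration; it is not Facebook's
# actual system. It shows how always following the adjacent topic,
# with no safeguard, can walk a user from a benign interest toward
# harmful content.
ADJACENT_TOPICS = {
    "fitness": ["dieting"],
    "dieting": ["extreme dieting"],
    "extreme dieting": ["eating disorders"],
}

def recommend_chain(start_topic: str, steps: int) -> list[str]:
    chain = [start_topic]
    for _ in range(steps):
        neighbors = ADJACENT_TOPICS.get(chain[-1], [])
        if not neighbors:
            break
        # Naively follow the adjacent topic each time, with no check
        # on whether the destination topic is harmful.
        chain.append(neighbors[0])
    return chain

print(recommend_chain("fitness", steps=3))
# ['fitness', 'dieting', 'extreme dieting', 'eating disorders']
```

The point of the sketch is that no single step looks alarming; the harm emerges from the unchecked chain, which is the rabbit-hole pattern described in the hearing.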
Brian Lehrer: Desiree, thank you. Keep calling and tweeting us. Brian Lehrer on WNYC with Cecilia Kang, National Technology Correspondent for The New York Times and co-author of the book An Ugly Truth: Inside Facebook’s Battle for Domination. And Charlie in Rockaway, you’re on WNYC. Hi, Charlie.
Charlie: Hi, Brian. I am the site manager for a lake [unintelligible 00:25:26] in the northwest of New Jersey. Facebook is really a problem in my community. We call them the Facebook Warriors. In my community, there’s a core of about 50 people who volunteer their time to make it a better community: three lakes, two pools, a country club building, four fields, et cetera.
Whatever these volunteers do reminds me of the [unintelligible 00:25:51] Roosevelt quote about people watching someone else in the ring doing the work and just negatively commenting on the efforts, while they themselves do nothing to improve the community. It’s gotten so bad that the board has decided not to respond on Facebook and to hopefully let all these Facebook Warriors’ postings die on the vine, and it actually works.
Brian Lehrer: Charlie, do you have an example?
Charlie: One example is the president of our community volunteered his time to renovate an old drop ceiling, because there was no money in the budget to replace it with a new one. The Facebook Warriors bashed him for wasting his time trying to repair an old drop ceiling rather than buying a new one, and they’re unaware of how much money is allotted for those repairs.
Brian Lehrer: Go ahead.
Charlie: Two years ago, they would have negative postings about our lifeguards, who are just teenage kids, high school kids.
Brian Lehrer: Thank you. That’s interesting, Cecilia. Even in what might be likely to be a pretty cohesive community, although I guess if you’ve ever served on a New York City co-op board, maybe you wouldn’t think so, a pretty cohesive community of homeowners around a lake and other recreational facilities. He’s the site manager for social media there, and he finds it’s pulling people apart, even at that level. That’s not white supremacy and rigged elections; even at that level he sees this.
Cecilia Kang: Yes, it’s really interesting. These groups are quite vital for communities like the caller’s. There are very few places to host this kind of interaction between people in a small group, and groups are, in fact, among the fastest-growing parts of Facebook. There is absolute utility.
I think what he’s talking about is something a little bit different from what we’ve discussed so far in the show, which is the way in which people present themselves and interact online. There is a level of, I don’t know, decorum that has been lost when it comes to how we communicate online.
This is not just a Facebook problem. This is a problem in a lot of places; my Nextdoor group is pretty rancorous or sarcastic. You see it in the way that people communicate on Twitter; oftentimes it’s a way they would never communicate in real life. There has been a normalization of really poor communication and behavior online, which is bigger than just Facebook, but you definitely see it throughout Facebook and these groups. These groups are quite important for communities, as well as for Facebook’s business growth.
Brian Lehrer: As we come to the last stretch of the conversation we’re going to have here, here’s sort of the money clip from whistleblower Frances Haugen’s testimony yesterday, the one that the media keeps replaying. We tried to put in some different ones along the way first, but these are the 15 seconds that everybody’s hearing.
Frances Haugen: I believe Facebook’s products harm children, stoke division, and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.
Brian Lehrer: It’s the last part of that that I’m interested in, Cecilia: “Won’t make the necessary changes because they put astronomical profits before people.” What changes could Facebook be making internally to solve the problems we’ve been discussing? And what types of new regulation, which you were mentioning before, did they start to come up with yesterday that are now being discussed?
Cecilia Kang: Frances Haugen, as well as some lawmakers themselves, said that what Facebook could do is change the way it amplifies the most agitating content, to make sure that harmful content, especially misinformation and hate speech, does not spread rampantly across the site. That means putting enough people in place, such as in the division she used to work in, which according to her was actually dissolved after the 2020 election, to police for misinformation, and making it a huge priority.
She and the lawmakers were saying that Facebook can continue to be very profitable, maybe not as insanely profitable as it is today, but still very, very profitable, while making these changes to what kind of content travels the most across the apps. Change will not likely come from within, as evidenced by the Facebook post Mark Zuckerberg made last night, where he really dug in his heels and said that a lot of the assertions the whistleblower is making are just untrue, and that a lot of the research has been taken out of context.
Facebook will say that they embrace regulation, but I think it’s very important to know that it’s only a specific type of regulation that they will embrace. What I heard yesterday was lawmakers, and of course Frances Haugen, saying that the only thing that can move this forward is external pressure, and that has to come through regulation. It could be data privacy regulation. It could also be curbs on the speech-liability shield, the law known as Section 230 of the Communications Decency Act, which protects companies like Facebook from being sued over the content that they host.
It could also be, as Frances Haugen suggested, compelling or forcing more transparency about how the systems work: allowing researchers into Facebook’s algorithmic ranking systems to see why misinformation travels so rapidly, and studying those systems to bring findings to the public and inform the conversation among lawmakers. There is, of course, also antitrust action happening right now in Washington, with the Federal Trade Commission’s lawsuit to break up Facebook.
There is a menu of things being considered. I will also say, Brian, that I’ve been covering this for a really long time. I have gone to many, many hearings, including those with Mark Zuckerberg and Sheryl Sandberg and other leaders of Facebook, and nothing has yet been done. There has been no new law for the internet since 1996. [crosstalk] Yes, go ahead.
Brian Lehrer: I was just going to say, I know you’ve got to go, but the thing most likely to happen seems to be some kind of antitrust action. I’m just curious whether you think that would stop Instagram from allowing destructive ideas about girls’ self-image to spread, or stop Facebook from being a platform for vaccine or climate disinformation, or for the hateful white supremacy or antisemitism that the NAACP president, in the clip we played, said it is a super spreader of. What does making Facebook smaller do to stop the individual smaller branches of Facebook from doing the same thing?
Cecilia Kang: I think, first of all, antitrust is going to take a long time. The lawsuit is going to take many years to wind its way through the court battle. The antitrust argument is really interesting, and I do think it’s separate from trying to address the harms that were discussed yesterday.
It’s all absolutely tied back to the dominance of Facebook. You see that advertisers, as we talked about, feel like they can’t get off because Facebook is so powerful and important for them and their business. Antitrust is part of that conversation, but I just don’t know if you’re going to address the systemic problems of Facebook’s business model and how its systems work by breaking it apart into companies that still have that same business model.
Brian Lehrer: Cecilia Kang, National Technology Correspondent at The New York Times, and co-author of An Ugly Truth: Inside Facebook’s Battle for Domination. Thank you so much for coming on this morning, Cecilia, we really appreciate it.
Cecilia Kang: Thanks for having me.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.