Who Owns the Future?
BOB GARFIELD This is Bob Garfield with a year-end bit of inconvenient truth. In 2020, 1.2 percent of our listeners donated to On the Media. So this is for the other 98.8 percent. We're not money grubbing here - that cash literally keeps this show afloat. And before the Times Square ball falls, we need you, please, to buoy us. Go to OntheMedia.org and hit the donate button, or text OTM to 70101. Thank you and happy holidays.
DINA SRINIVASAN Facebook really flipped that switch in June of 2014, the same month that Google exited the social media market.
BOB GARFIELD A new antitrust lawsuit says killing competition enabled Facebook's devouring of user data. From WNYC in New York, this is On the Media, I'm Bob Garfield. What that case can't possibly address, though, is Facebook's responsibility for bodily harm. Even domestic terrorism.
CAROLE CADWALLADR Is the equivalent of the meeting house where the white supremacists met up, swapped notes and advertised for new recruits.
BOB GARFIELD Nor does antitrust protect against the invasion of our lives, tracked everywhere, even by objects, eventually to be monetized.
SHOSHANA ZUBOFF It could be your dishwasher. It could be your television set, could actually be the mattress on your bed.
BOB GARFIELD Surveillance capitalism where our future behavior is the product. It's all coming up after this.
BOB GARFIELD From WNYC in New York, this is On the Media, Brooke Gladstone is out this week. I'm Bob Garfield.
It's beginning to look a lot like Sherman, Senator John Sherman, that is, who in 1890 sponsored the antitrust law that bears his name. 130 years ago, he pronounced, quote, If we will not endure a king as a political power, we should not endure a king over the production, transportation and sale of any of the necessities of life.
[CLIP]
VOICEOVER On February 18th, 1902, without any warning, the president ordered his Justice Department to file suit against one of the trusts in which J.P. Morgan had a major interest: The Northern Securities Company. Its goal was the monopolistic control of all of the railroads between the Great Lakes and the Pacific Ocean. [END CLIP]
BOB GARFIELD Since then, other powerful and repressive kings have been dethroned, from Standard Oil to Eastman Kodak to AT&T.
[CLIP]
NEWS REPORT AT&T stock was as good as gold. It wasn't dependent on our currency. It was gold. Now it's gone. [END CLIP]
BOB GARFIELD That was 1984, this is now: 20 years into the digital century, big tech reigns supreme, all but unchecked by law and regulation. The so-called duopoly of Google and Facebook, valued at just under two trillion dollars between them, controls 35 percent of the 600 billion dollar global advertising market, not to mention ever more of our personal human lives. This is Harvard professor emerita Shoshana Zuboff in the documentary The Social Dilemma.
[CLIP]
SHOSHANA ZUBOFF Facebook discovered that they were able to affect real world behavior and emotions without ever triggering the user's awareness. They are completely clueless. [END CLIP]
BOB GARFIELD Then in the past 10 days came a storm. First, the week before last, 48 states and territories, along with the Federal Trade Commission, filed suit against Facebook. That suit alleges that Facebook bought up rivals with the explicit intention of stifling competition.
[CLIP]
NEWS REPORT Legal filings include an email from Mark Zuckerberg in 2008 in which he allegedly said, quote, Better to buy than to compete. [END CLIP]
BOB GARFIELD And then this past week, a second thunderclap when Texas's attorney general announced new antitrust charges against Google. The suit claims that Google, in a conspiracy with Facebook, abused its market power to chip away at consumer privacy protections and rigged the advertising market. But if you think the trustbusting Senator Sherman has come out from decades of hiding, that's not quite the case. For decades, antitrust doctrine has been fixated, to the exclusion of everything else, on harm to the consumer, as measured by out-of-pocket costs. Social media, mind control and the erosion of democracy do not fit into that calculation.
DINA SRINIVASAN That's sort of the traditional metric that we've used to bring antitrust cases and to really understand and measure consumer harm. It's been those price hikes that really hit consumers' pockets.
BOB GARFIELD Dina Srinivasan, author of the 2019 paper The Antitrust Case Against Facebook and a legal consultant in the Texas suit, is on the leading edge of an evolved antitrust doctrine based on harms not necessarily inflicted at the cash register, for example, invasion of privacy on a grand scale.
DINA SRINIVASAN I just found it so interesting. You know, why is it that the communications utility in the 21st century that all consumers use essentially conducts something similar to surveillance? You sign up for Facebook, and Facebook not only monitors your communications on Facebook, but even, you know, when I go to, for example, The New York Times in the morning, Facebook is making a record of that, and it is extracting from consumers the permission to basically track them across the Internet. And it didn't seem obvious to me that consumers would sign up for that proposition as something that they really liked.
BOB GARFIELD It's also a bit ironic because, as I understand it, in the beginning Facebook was favorably compared to, let's say, MySpace, an early social network where your personal profile was, at least at one point, public. And Facebook was theoretically an antidote to that.
DINA SRINIVASAN That's right. If you go back in history, you see how Facebook entered the market with very firm privacy promises. It got users to choose Facebook over other competitors in the market and only after it gained market power and competitors exited the market, was it finally able to extract this sort of surveillance term from consumers at large.
BOB GARFIELD And you can chart that. Facebook's growth goes up, privacy protections straight down?
DINA SRINIVASAN That's right. And the fascinating thing about the New York attorney general's suit last week is that internal communications confirmed that that's indeed how Facebook internally was considering strategic moves about decreasing users' privacy. I don't remember another case that has been brought in a market where the price is zero, and the government is deciding to defend the people based on things like a lack of innovation in the market, a lack of choice (everybody uses Facebook to stay in contact with their friends and family), and then just privacy harms.
BOB GARFIELD So let's talk about that suit, New York and 47 other attorneys general along with the FTC. Now, it's not just this new doctrine authored by you and others. The case is obviously built on particular alleged anti-competitive behavior by Facebook. And there are some smoking guns. What are they?
DINA SRINIVASAN The New York attorney general's suit touched upon internal communications where Facebook was wanting to decrease user privacy. But executives internally said, you know what, we have a lot of competition from Google right now, so let's hold off and let's wait for competition to exit. How did Facebook get to the point where it can monitor communications, what you read, what you research, what you buy across the web? And the thing that I track in the research is that Facebook really flipped that switch in June of 2014, which was the same month that Google exited the social media market. The other thing is that the cases, both from the states and from the FTC, challenge at their core Facebook's acquisitions of WhatsApp and Instagram. And if we go back in history, one of the interesting things about, for example, WhatsApp is that things were said at the time of that merger that allowed people and the public to believe that their WhatsApp communications were going to remain very private.
BOB GARFIELD And that was the whole point of WhatsApp to begin with, right? That your messages would, if I recall correctly, expire after a certain amount of time in order to protect your privacy.
DINA SRINIVASAN That's right. After Facebook acquired WhatsApp, Facebook got consumers to agree to having Facebook use WhatsApp metadata. It's not the contents of what you and I communicate about, but it's the timestamp, the fact that we talked, how long we talked for and where we were when we had that conversation. Metadata can reveal a lot about what people are talking about and the contents of those communications. For example, if I go to the San Francisco Golden Gate Bridge and I'm standing on the edge of the bridge at 10:59 p.m. and I make a call to a suicide hotline, you don't really need to know the contents of that communication. You already know exactly what's going on. I would argue that that is a degradation of privacy when it comes to, for example, WhatsApp communications.
BOB GARFIELD Now, on the subject of the acquisitions of Instagram and WhatsApp, which is very much at the heart of this case, Facebook has said, now just wait one second here. We sought approval for these deals years ago and the FTC gave it. Now, it's pretty funny to have that accusation from a company that for its entire history has made promises to users and advertisers and media and government and then unilaterally reneged. But still, it is a compelling point. How do the plaintiffs justify revisiting those approvals?
DINA SRINIVASAN There's a journalist at Wired, Gilad Edelman, who I think put it best. You know, we don't have a no-backsies rule in antitrust. We don't have a situation where these agencies are precluded from going back and challenging the acquisitions, you know, based on a current and more robust understanding of how these acquisitions affected competition in the market.
BOB GARFIELD On Wednesday, Texas Attorney General Ken Paxton announced another suit against Google and Facebook with a video on Twitter.
[CLIP]
KEN PAXTON Google repeatedly used its monopolistic power to control pricing, engage in market collusion to rig auctions in a tremendous violation of justice. Let me put it this way, if the free market were a baseball game, Google positioned itself as the pitcher, the batter and the umpire. [END CLIP]
BOB GARFIELD Dina, you actually helped in this investigation, which alleges that Google struck an unlawful agreement with Facebook in order to stave off competition in the advertising market. Collusion, in other words. It says Facebook granted Google access to millions of Americans' end-to-end encrypted WhatsApp messages, photos, videos and audio files.
DINA SRINIVASAN Google and Facebook make all of their money from online advertising. And when you understand that, and you understand that a company can make so much more money in online advertising if it has access to users' personal information, it shouldn't come as a surprise that that's how these companies operate.
BOB GARFIELD Not the typical price fixing, but so-called privacy fixing.
DINA SRINIVASAN Yeah, it's a great term of art. The example I like to give here is, you know, just as two chicken farms can't get together and set the prices of chickens in the market, they also can't get together and fix the quality of anything else. You know, OK, let's only sell white chicken meat and let's not sell dark chicken meat. That sort of fits in the same bucket of prohibited behavior. And the suit sort of makes this analogy when it comes to privacy.
BOB GARFIELD Tell me if this is an oversimplification of the business model: get people on your platform and keep them there, get as much personal data from them as you possibly can, and then use that personal data, or abuse it, to squeeze the greatest amount of profit from every single ad transaction, and along the way get an unfair advantage as a participant on both ends of those transactions.
DINA SRINIVASAN You've hit it on all fronts, exactly.
BOB GARFIELD Well, gosh, that makes the old AT&T seem like a bunch of do gooders.
DINA SRINIVASAN You see a lot of parallels between the problems we had in telecom markets and the problems that we have when it comes to Facebook and in the new social media markets. Back in the day, it used to be that you could use Facebook Messenger and I could use AOL ICQ and you could send me a message. There was sort of network interoperability between Facebook and other platforms, which is what we have with telephones today, right. You can use Sprint, I can use AT&T and they have to patch a call through. But we have all these examples when it comes to Facebook where they gain market power and then shut down interoperability. And now it's a closed platform. And so what is everybody going to do? They're going to sign up for Facebook and everybody is going to get stuck in that platform.
BOB GARFIELD So let's put these suits in context. How big a deal are they?
DINA SRINIVASAN They're a huge deal. I mean, look, these are some of the biggest market cap companies that we have in America right now. These are also our Silicon Valley darlings. They're the bastions of innovation and tech, so I think it's a big deal on multiple fronts: from a competition perspective, from a privacy perspective, from a civil liberties perspective.
BOB GARFIELD This is the big kahuna.
DINA SRINIVASAN This is the big kahuna!
BOB GARFIELD Dina, I want to thank you very much.
DINA SRINIVASAN Thanks for having me on. It's been my pleasure.
BOB GARFIELD Dina Srinivasan is a leading researcher in the field of antitrust and a fellow with the Thurman Arnold Project at Yale. Coming up, a very skeptical David versus a big bully of a Goliath. This is On the Media.
[BREAK]
BOB GARFIELD This is On the Media, I'm Bob Garfield. Even if the Federal Trade Commission and most of the states' attorneys general succeed in breaking up Facebook, other questions remain about the ongoing damage from Facebook, Instagram and WhatsApp, harms unaddressed by even the most modern interpretations of antitrust law: the invasion of privacy on a global scale, the manipulation of politics, the manipulation of emotions and behavior, the proliferation of toxic content. No matter how creative our legal interpretations, none of that is the stuff of antitrust law, nor have we any reason to expect action from Facebook itself. Mark Zuckerberg has repeatedly professed concern, promised to be better and gestured toward self-regulation. But the structural hazards within Facebook's algorithm and very business model have not been addressed; the fixes have been mere window dressing. Its oversight board, for example, first floated two and a half years ago as a guarantor of independent governance, has only this month revealed its first so-called cases for consideration.
CAROLE CADWALLADR At present, the only thing that Facebook is allowing this board to adjudicate on is takedown decisions. All of these other cases, which just do not qualify, Facebook says, cannot be heard by the board, including the issue of what actually stays up on Facebook.
BOB GARFIELD Carole Cadwalladr is a journalist for The Guardian and The Observer, who became a Pulitzer finalist last year for her reporting on the Cambridge Analytica scandal. Late this year, she became a co-founder of the self-proclaimed Real Facebook Oversight Board, a shadow court organized by academics and advocates to actually address the harms I've mentioned. A few weeks back, they announced their first three cases. Number one: former Trump campaign manager and former Cambridge Analytica board vice president Steve Bannon.
CAROLE CADWALLADR A month ago, Steve Bannon on Facebook Live, called for the beheading of Dr. Fauci. He said that he wanted to see his head on a spike. You know, a lot of people by any reasonable standard would say that that is in breach of Facebook's own terms and conditions and he should have been struck off the site. That didn't happen. And Mark Zuckerberg has very much resisted that.
BOB GARFIELD You can argue that head on a pike is more colloquialism than threat, but there is no arguing that Facebook at its worst isn't an incitement machine. We can look at the pogrom against Muslim Rohingya in Myanmar ginned up by the nation's military on Facebook and carried out by Buddhist nationalists who were given free basic Internet access by Facebook in its attempt to corner the Internet market there. Or consider, for another example, the deadly shooting this summer in Kenosha, Wisconsin.
[CLIP]
NEWS REPORT The key question here is how did people like Kyle Rittenhouse know to go to Kenosha on the night of August the 25th? The loved ones of those who died in Kenosha say Facebook is partly to blame for allowing posts that were mobilizing militias and activating white supremacists. [END CLIP]
[CLIP]
NEWS REPORT Specifically, this event posted by the militia group Kenosha Guard asking for patriots to take up arms and defend Kenosha from evil thugs. Some users responded with open threats of violence, one writing, I fully plan to kill looters and rioters tonight. Concerned users reported the page, but Facebook did not take it down. [END CLIP]
[CLIP]
ZUCKERBERG It was largely an operational mistake. The contractors, you know, the reviewers who the initial complaints were funneled to, basically didn't pick this up. [END CLIP]
BOB GARFIELD If Zuckerberg's "mistakes were made" excuse sounds familiar, that's what Nixon's spokesman said too. Still, Facebook is among the defendants in a federal lawsuit from four people who protested that night, including the surviving partner of the late Anthony Huber, Hannah Gittings.
CAROLE CADWALLADR When Kyle Rittenhouse started shooting people, her partner, incredibly bravely, he was unarmed, went to try and tackle him, and he was shot dead. Hannah feels that Facebook is complicit in this white supremacist violence, that it is the equivalent of the meeting house where the white supremacists met up and swapped notes and announced where they'd be converging and advertised for new recruits. And to talk about the fact that her partner died as a result of an "operational mistake," it's just so dehumanizing.
BOB GARFIELD The plaintiffs will have to settle for a federal court of law because Facebook's Supreme Court will give them no hearing. Nor will the censored critics of authoritarian regimes have their day in Facebook court, and so theirs is the real oversight board's second case.
[CLIP]
NEWS REPORT Vietnam has threatened to shut down Facebook in the country if the social media giant refuses to bow to government pressure and censor more political content on its platform. The company complied with the government request in April to increase censorship of anti-state posts for some 60 million Vietnamese users. [END CLIP]
CAROLE CADWALLADR Facebook, faced with a billion pounds of lost revenue, complied with their wishes. And they've taken down the accounts of these journalists and dissidents. Again, under the terms and conditions of Facebook's oversight board, these people are not allowed to appeal to the board. I think it raises a really interesting case about the relationship between authoritarian governments and Facebook and the compromises and backroom deals that are being made. I think a lot of other countries around the world are looking at what's happened in Vietnam and sort of seeing something that they might want to try in the future.
BOB GARFIELD Other countries, and, as it turns out, Facebook itself. When Cadwalladr's real oversight board was launching, the company, she says, tried to stifle their free speech by bigfooting a nonprofit donor.
CAROLE CADWALLADR Facebook discovered that this project, the Real Facebook Oversight Board, was in progress, and a PR guy from Facebook corporate communications rang our funders, Luminate, and harangued them, basically, and asked why they were funding this and suggested that they shouldn't be. And they were quite shocked by that. We were quite shocked at that, because remember, the oversight board is meant to be independent from Facebook. Here we had this kind of heavying coming from Facebook trying to cut off the funding of a new journalistic enterprise. You know, it was bizarre, a sort of corporate bullying. And then it's gone on from there, actually. Our website got taken off the Internet as a result of a takedown notice from Facebook. They came back to us and told us that was an accident.
BOB GARFIELD So many accidents. And also for Cadwalladr, a sort of heartbreak over the very existence of Facebook's oversight panel. The members, she says, are respectable scholars, yet they have signed on to an enterprise of limited scope, crisscrossed with red lines.
CAROLE CADWALLADR I really, really worry about institutional capture. I worry about tech money everywhere. I worry about the fact they won't even reveal how much the people on the oversight board are being paid. I worry about the fact that one of the most pre-eminent First Amendment scholars in America, Robert Post, is being paid two hundred thousand dollars a year to be one of its trustees. I worry about the fact that they got a head of the human rights school at Oxford University to be another of its trustees. All across America, there are academics at universities who, if they don't depend upon tech money, depend upon access to data or access to be able to research this stuff. So there is a whole industry which is supporting big tech. There was a very smart column the other day by Emily Bell on the oversight board. She said, what would happen if everybody just withdrew their labor from the platforms, if all of these academics and journalists and NGOs just ignored them, put their tools down? All these fact-checking organizations, you know, they all lend legitimacy to the platforms. We become part of this industry.
BOB GARFIELD It's a cliche at this point to call big social media the social experiment that it is. But still the human trials continue and human institutions totter, all with Facebook's knowledge. In the aftermath of November's election, it seemed as if Facebook finally realized that its algorithm and the disinformation it feeds might be another accident waiting to happen, an experiment about to spin out of control. According to The New York Times' Kevin Roose, reliable news sources like the Times and NPR suddenly displaced incendiary nonsense in what momentarily relieved Facebook employees called a nicer news feed. Yeah, that didn't last long.
CAROLE CADWALLADR It's one person like literally flicking a switch, and then they flip the switch back on again. I mean, it's absurd when you think about it, but I think, because this is all happening so, so gradually, we are the frog which is being boiled; we just sort of accept this stuff. But along the way, it has almost destroyed a number of our democracies. Yours being arguably still in the firing line there.
BOB GARFIELD This is a reference to the 2016 presidential election, in which Russian hackers and domestic political campaigns used Facebook behavioral data, some of it released by Facebook illegally, to sow division and suppress the African-American vote. As we enter what may become an era of antitrust struggle between the marbled, flawed government offices and the Disneyland-esque campus at One Hacker Way, it helps to think about the adversaries. A few hundred lawyers and do-gooder watchdogs versus not a wayward monopoly, but a dangerous, willful, obscenely wealthy rogue state: age 16, population 2.7 billion, body count unknown. The so-called Real Facebook Oversight Board and the rest of us are taking on real power. Yes, David slew Goliath, but that's just a story, isn't it? Coming up, the threat is worse than you think. This is On the Media.
[BREAK]
BOB GARFIELD This is On the Media, I'm Bob Garfield. Carole Cadwalladr enumerated a series of harms caused by Facebook: quieting dissent, ignoring incitement and profiting from distortion. But according to Harvard professor emerita Shoshana Zuboff, the bill of indictment can't merely be a list of harms. It must rather recognize a vast, sinister architecture that not only exploits markets and human frailty, but steals our inner selves as fuel for the machine. Like digital Soylent Green, tech's food supply is people. And yet society has been too much in awe to recognize the damage.
SHOSHANA ZUBOFF These companies were born into a series of historical windfalls. They like to play the game of the naturalistic fallacy, which is the idea that, hey, we're so successful, we must be right and we must be good. But their success really has nothing to do with being right or good. And all these companies were born into an era of the neoliberal ideology, a time when regulation has been diminished and devalued in favor of this idea that somehow markets have a kind of a natural genius to them. Markets always make the right decisions. And so you've got to leave these market actors to be as free as possible. You don't regulate them, you let them, quote, self regulate. And that's how we get to the best outcomes.
BOB GARFIELD For a brief moment, at the turn of the millennium, governments seemed to have developed an appetite at least to regulate the unchecked collection of personal data. From 1997 to 2001, Zuboff says the FTC began to stir. But then came 9/11, and the conversation moved from privacy protection to total information awareness. And so, she says, the stage had been set for the economic world we live in today. One of surveillance capitalism.
SHOSHANA ZUBOFF Historically, capitalism evolves by taking things that live outside of the market, bringing them into the market, turning them into what we call commodities, things that can be sold and purchased. In our digital era, capitalism has evolved according to the same age-old pattern, but with a dark and startling twist. All the easy commodities were already taken, the forests and the meadows and the rivers and the mountainsides. So what they figured out early in the digital century was that private human experience could be the new virgin wood, the new unblemished mountainside.
BOB GARFIELD So it's my supermarket loyalty card and it's Facebook and it's Waze tracking my every automotive movement. What are some of the less obvious examples?
SHOSHANA ZUBOFF It's getting easier to answer the opposite question, which is what is still innocent of this process. But let me put it this way. Every product that begins with the word smart, every service that begins with the word personalized, everywhere that there is an Internet touch point. It could be your dishwasher, it could be your television set. It could be the thermostat in your bedroom, could actually be the mattress on your bed. It could be your car dashboard. You're walking down the street and there are sensors and cameras. You're in a cafe and there are webcams. There are companies that specialize only in tracking your behavior as a renter, and then they make predictions about your behavior as a renter and sell them to landlords. That's a niche market. Multiply that over all the niche markets, and then, of course, one of the biggest markets that's growing now is health data. This is why everybody under the sun wants to have a wearable device, which will exceed even the smartphone as the most invasive supply chain interface of all. Facebook's A.I. hub is described by the company as ingesting trillions of data points every day and producing six million predictions of human behavior every second, Bob. So that's the kind of scale.
BOB GARFIELD Well, my heart rate is rising and I'm embarrassed to report Apple knows about it because they're monitoring it on my wrist. Now, one of the most obvious applications, I guess the first application of data collection is extremely targeted advertising. You know, I'm pregnant. I've told absolutely nobody but my partner, and yet I suddenly start getting ads for vitamins and crib accessories, which is creepy enough on the face of it. But on the creep-o-meter, that is nothing because you call ad targeting merely the leading edge of the human futures market. And what is that?
SHOSHANA ZUBOFF So what are these advertisers and marketing folks buying? You know, it started out they're buying the click-through rate, and what is the click-through rate except a tiny computational fragment that predicts a fragment of my behavior: what ad am I likely to click on and click through to that website? That is a prediction. Just like we have futures markets in pork bellies or oil prices, we have futures markets now in human behavior. And it turns out this surveillance economy has come into existence because all these companies, from Ford Motor all the way down the line, have figured out that, hey, we can make more money and attract more investment by selling predictions of our users' or our customers' behavior, based on all this data we can gather about them, than by actually selling a product or a service.
BOB GARFIELD Back in the day, oh, I don't know, 15 years ago, advertising was an attempt to get some awareness. You take your best guess as to who your ad will reach and then you try to get their attention by whatever means. If I understand this correctly, advertising is not that anymore. Advertising is a bet that if you reach this particular user with this particular message at this particular time, they will click and do business with you.
SHOSHANA ZUBOFF The real turning point in advertising came at Google, based on their invention of this click-through rate. Before the click-through rate, when an advertiser went to figure out where should we put an ad, they looked for places where the context was somehow related to their brand values. They looked for places where they were likely to encounter people who were somehow related to their brand values. Google comes along and says, you know, you've been kind of flying blind, choosing keywords, trying to connect with customers that are interested in your brand values. But we have a completely new program. We have a black box. It's going to tell you where to place your ad. If you follow the advice of our black box, your ads are going to be more successful and you're going to sell more product. But don't ask us how we got to that prediction. Do not ask to see inside the black box, because that is off limits. That was the moment when advertisers had to make a choice: does advertising project a connection to our brand values, or do we just go wherever we have a higher probability of making a buck? This black box deal asked advertisers, and the manufacturers behind those advertisers, to sell their souls, and they did. Now, fast forward to the weeks and months after the Cambridge Analytica story. Suddenly we were hearing from advertisers who were all upset that my ads are showing on pages that tout anti-Semitism and white supremacy, and suddenly the advertisers are all up in arms and all indignant. Well, that was a bit of a show, that was performative, Bob, because really the advertisers had sold their souls two decades ago, and they lost the right to complain about where their ads appeared because they bought into the black box. So that was the huge turning point in advertising. And of course, that turning point is what allowed these new human futures markets to become so big and so lucrative that nearly all of the market capitalization of Alphabet, approximately 88 percent at this point, and really all of Facebook's market capitalization, about 98 percent, derive solely from those advertising markets.
BOB GARFIELD Google discovered that data collection and the human behavior market far exceeded mere ad targeting, that they could make behavioral predictions about a whole host of things. That has long since metastasized into the total surveillance society that you describe.
SHOSHANA ZUBOFF Well, you know, we labor under the delusion that the data that the companies are collecting about us is the data that we have chosen to give them. In other words, I can choose my own degree of privacy based on a private calculation of how much information I choose to share with a company in return for their product or service. And often that's a free product or service, but not always. This is a delusion, and it's a delusion that has been nursed by the companies. So, Bob, it's not just that you're walking down Fifth Avenue. It's the stoop of your shoulders and your gait, the cadence and pacing and rhythm of your walking. It's not your face just to I.D. you. It's the hundreds of micro expressions that the little muscles in your face produce, because those micro expressions are great predictors of your emotions. And Bob, it turns out that your emotions are great predictors of your behavior, and that's, of course, what they are after.
BOB GARFIELD If we've learned anything about the failures of American capitalism, it is about the perverse inequality in the distribution of wealth and opportunity and social justice. To those yawning gaps, you add another, what you call epistemic inequality: the gap between what we know and what is known about us. This is what we're discussing here, is it not?
SHOSHANA ZUBOFF When we entered the digital century, this was supposed to be the golden age of democracy, the democratization of knowledge. This was going to allow us to finally solve our problems as individuals, with commerce that was really oriented toward solving our deepest needs. It was supposed to create the data that was going to allow us to finally cure diseases that had been incurable, to finally create the solutions for climate cataclysm that have eluded us. Instead, surveillance capitalism has captured the entire domain of the digital. Surveillance capitalism owns and operates the Internet. And under surveillance capitalism, yes, it looks on the surface like there's democratization of knowledge. And don't get me wrong, it is a great gift to be able to get online or get on my phone and search and find information that before I would have had to go to one of the 40 volumes of my Encyclopedia Britannica or into my university library to find. This is a great gift; there is no belittling it. The problem is that on the back of that gift, these companies have built huge concentrations. In the past, we thought of industrial concentrations as concentrations of economic power. But now, in the digital century, these concentrations are of knowledge, knowledge about us that comes from us but is not for us. It's taken from us to use for others' benefit, not to solve social problems, not to solve the Earth's problems, not to solve consumers' and users' problems, but to meet the needs of business customers in these human futures markets. And so from this gap, the difference between what I can know and what can be known about me, grows a new kind of power, which is the difference between what I can do and what can be done to me.
BOB GARFIELD The quid pro quo does not favor the consumer.
SHOSHANA ZUBOFF We are the objects now of global architectures of behavioral modification. Facebook talks about its ability to analyze moods and emotions so that they can alert advertisers to the best day, the best time of day, the kind of content of a message and the way the message should be delivered so that it will have maximum impact to get us to buy a product. For example, they learn that teenagers go through specific cycles during the week of anxiety, and anxiety rises as they approach the weekend. And they can see, for example, that, hey, Bob is getting really uptight about his date Friday night. His self-esteem is really being challenged and he's feeling vulnerable right now. So, hey, advertiser, if you have some kind of really sexy confidence-boost product, like that black leather jacket that you've got for sale, this is the time for you to quickly send him a message. Tell him you're going to discount that black leather jacket, tell him about its sex appeal, tell him that you'll do free shipping, so that he'll have it by 10:00 a.m. Friday morning and he can wear it on his date Friday night. And you will have maximized your ability to sell him that expensive black leather jacket. Now, what we learned in the year 2018, when Cambridge Analytica entered the global consciousness, was that these same methods and mechanisms, which are the bread and butter of every self-respecting surveillance capitalist, can be pivoted just a few degrees to political objectives.
BOB GARFIELD And hence Brexit, hence the Trump victory in 2016. Hence the pogroms in Myanmar. And the list is long and bleak.
SHOSHANA ZUBOFF We've learned that in 2016, the Trump campaign, just by using Facebook's political advertising capabilities to their fullest extent, was successfully able to target black citizens in swing states and suppress the black vote. And they were able to do that effectively without jackbooted soldiers turning up at anybody's door, without a single gunshot being fired. So this is what I call instrumentarian power. It's not the totalitarianism of the 20th century that learned how to control people through the threat of terror and murder. This is a different kind of power, Bob, and it comes to us on slippered feet. It comes to us whispering sweet nothings, holding a cappuccino, but it can get everything it wants, or at least that's the trajectory that it's on.
BOB GARFIELD So let's say the obvious ways to combat the excesses of industrial capitalism were and are regulation, including antitrust, collective bargaining, minimum wage, high marginal tax rates, capital gains taxes, workplace protections, environmental laws and so on. How are we to deal with epistemic inequality?
SHOSHANA ZUBOFF When you look at the origins of antitrust law, which go back to the late 19th century, it was clear to observers then, and it's been clear to historians since, that antitrust became really popular within the world of ordinary folks not because monopoly was the only problem or even the worst problem, but because they were so angry at these companies for having so much power over them, for making them feel like pawns in a system over which they had no control, for devaluing them, for demeaning them as citizens and as individuals and as workers and as consumers. All of those sources of anger and indignity were plowed into the antitrust rallying cry. And I think we've got something similar like that going on today. Antitrust: these are the laws that we have, this is the hammer that we have. Monopoly as it is conventionally understood, anti-competitive practices, these things are real. There's no question that companies like Google and Facebook and Amazon and Apple and Microsoft are ruthless capitalists in the most conventional sense, as well as being ruthless surveillance capitalists. The problem is, Bob, that if we only address their anti-competitive practices, we run the risk of leaving everything that you and I have been talking about intact. When we wanted to outlaw child labor, we didn't start having negotiations about how many hours a day a child could work in a factory. We said there are not going to be children in factories. Well, it's the same thing with data. We don't want to just be negotiating who owns it, or whether I can get it from the company and take it with me. Once we start talking about data, we've already lost the battle on extraction. It means they've already taken our lives and turned them into data in the first place. We need to go upstream and start focusing on extraction; that's number one. But there are also the markets where ultimately the predictions that come out of their computational factories get sold, and these markets that trade in predictions about our behavior are the source of the financial incentives. So let's outlaw these markets, the trade in human futures. We've done this before. We've outlawed, for example, markets that trade in human beings, even when there were whole economies based on those markets. And we did so because they were contrary to the principles of a democratic society. We can outlaw markets that trade in human futures. The moment that we do that, Bob, we have opened up a vast new landscape for competitors who want data to serve people, who want data to serve society, who want data to serve the earth, and who are going to find a way to do that, that they can monetize. They may not become trillion-dollar companies in the space of 10 years or 20 years, but they can make great profit and they can do it without the overhang of surveillance capitalism. We are starting to get it. We know it. We see it. We can act on it.
BOB GARFIELD Shoshana, thank you very, very much.
SHOSHANA ZUBOFF Thank you for doing this.
BOB GARFIELD You'll have to excuse me now. I have to return a black leather jacket to Amazon.
[ZUBOFF LAUGHS]
BOB GARFIELD Shoshana Zuboff is professor emerita at Harvard Business School and author of The Age of Surveillance Capitalism.
[CLIP]
DET. THORN They're making our food out of people. Next thing they'll be breeding us like cattle for food. You gotta tell 'em, you gotta tell 'em! Listen to me, Hatcher. You gotta tell 'em Soylent Green is people! [END CLIP]
BOB GARFIELD That's it for this week's show. On the Media is produced by Alana Casanova-Burgess, Micah Loewinger, Leah Feder, Jon Hanrahan and Eloise Blondiau with help from Ava Sasani. Ava, sorry to say, leaves us this week and we wish her absolutely nothing but the best.
Xandra Ellin writes our newsletter, and our show was edited this week by executive producer Katya Rogers. Our technical director is Jennifer Munson. Our engineers this week were Sam Bair and Adrian Lilly. Bassist and composer Ben Allison wrote our theme. On the Media is a production of WNYC Studios. Brooke Gladstone will be back next week. I'm Bob Garfield.
Copyright © 2020 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.