A History of Persuasion: Part 3
KAI WRIGHT: Hey everybody. You’re about to hear the final installment in a three-part series. Now this series has got a lot of fascinating ideas and twists and turns and honestly, you just need to hear it from the beginning to really understand. So if you haven’t heard parts one and two, stop, back up, go listen, then come back for this awesome conclusion.
[Theme music]
KAI: I’m Kai Wright and these are The Stakes. In this episode, Skinner in the Valley.
[End theme music]
AMANDA: So Kai.
KAI: Amanda.
AMANDA: I’m going to tell you a little story.
KAI: Okay.
AMANDA: Last September…
KAI: Uh huh.
AMANDA: I was riding the subway. I was heading uptown.
KAI: Mm hm.
AMANDA: And I was sitting across from this older couple.
KAI: Mm hm.
AMANDA: And they were holding a smartphone between them and they were listening really intently...
CHRISTINE BLASEY FORD: Thank you Senator Grassley.
AMANDA: ...To the Brett Kavanaugh hearings.
CHRISTINE BLASEY FORD: I'm here because I believe it is my civic duty to tell you what happened to me...
KAI: And you took a photo of this, I remember this photo. Right? Like, it was an intense image of these people just like, on the edge of their seats.
AMANDA: Yeah. He's got like this beautiful white mustache and she's clutching his arm. Both have these like, expressions on their face where you can just tell they're really thinking about what they're hearing.
CHRISTINE BLASEY FORD: I'm an independent person and I am no one's pawn.
AMANDA: So I took the photo and I added this caption that said “Couple listening to the Kavanaugh hearings on the uptown 1 train” and then I tweeted it, and then I got off the train and I was outside and I checked my phone again and the tweet was like, blowing up.
KAI: What do you mean?
AMANDA: It was... likes were coming in...
KAI: Like a viral tweet?
AMANDA: It went viral.
KAI: I’ve never been viral.
AMANDA: I mean your phone goes insane.
KAI: What's that like? What’d you feel like?
AMANDA: It was like being at a slot machine, pulling the handle and winning, you know, ten thousand dollars.
KAI: Jackpot. Just social media jackpot.
AMANDA: Jackpot, right. Cause you -- not only are you getting likes and retweets and likes and retweets and comments, but also like, people who know you start to see it so you start getting texts, you start getting calls.
KAI: Wow.
AMANDA: I was contacted by every major news organization...
KAI: Really?
AMANDA: ...in the country. Yep. There was a Time magazine article about this tweet...
KAI: Wow.
AMANDA: ...going viral. So it was super exciting. And as you might imagine like, over the next few days, I kept checking and checking and checking my phone. It was extremely distracting. I thankfully did not have that much work because this became my job, was like --
KAI: Just kept going in to check your likes on Twitter.
AMANDA: Yeah just like checking how my viral tweet was doing.
KAI: Wow.
AMANDA: And then there was this point at which it stopped.
KAI: Oh, the world moved on.
AMANDA: It stopped getting likes and it stopped getting retweets. And that was pretty sad.
KAI: Yeah.
AMANDA: Like the moment had passed. And I go back to Twitter, I tweet, but I -- what I realized is that I am constantly chasing the fix. So even if I just get, whatever, 30, 40 likes, something small I still can’t concentrate because I’m looking at it so much. And I realize like, this is the perfect example of B.F. Skinner's ideas.
VIDEO NARRATOR: The pigeon learned that pecking the disc produced a reward. Then the behavior of pecking could be studied in relation to how often that reward was offered.
AMANDA: So if you're a pigeon or a rat in one of B.F. Skinner's experiments, he would give you a little food pellet every time you pressed the lever, right? Press the lever, get food, press the lever, get food, press the lever, get food. Now, the rat quickly figures this out. So they press the lever and they get food and then they just eat until they’re full and then they stop pressing the lever. They basically figure it out.
KAI: Right.
AMANDA: One of the things he discovers is that if he makes the reward unpredictable...
KAI: What do you mean unpredictable?
AMANDA: Well, you know, I spoke with an expert on B.F. Skinner. Her name is Alexandra Rutherford and she wrote a book about him, and she broke it down. She says, you know, if you give the food to a rat on a schedule that varies...
ALEXANDRA RUTHERFORD: it can be after one lever press and then after three and then after five and then after two and then after one. And when it's delivered that way the rat will continue pressing that bar... forever.
B.F. SKINNER: And that is at the heart of all gambling devices and has the same effect. The pigeon can become a pathological gambler just as a person can.
AMANDA: This is what keeps me going back to Twitter all the time. It's like I never know what I'm going to get. And the bigger lesson that B.F. Skinner took from all this was that it's not just how often you give out the rewards, but it's like whoever builds the box and doles out the rewards -- they control the animal.
RUTHERFORD: This was for Skinner, the kind of revelation -- the extent to which these variables in the environment could be controlled to produce precisely the kind of behavior that you wanted.
AMANDA: And this I think is like the parallel to Facebook and Twitter and Instagram. They have built the box that we are all existing in and they've got us coming back for these rewards time and time again.
KAI: Oof.
AMANDA: And they are managing our behavior now.
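(An aside for the technically curious: the variable schedule Rutherford describes is simple enough to sketch in a few lines of Python. This is a minimal illustration, not code from Skinner's lab or from the show; the function name and the press counts are made up for the example.)

    import random

    def variable_ratio_schedule(press_counts):
        """Yield True on the lever press that earns a reward; False otherwise."""
        while True:
            target = random.choice(press_counts)  # next reward takes 1 press, or maybe 5
            for press in range(1, target + 1):
                yield press == target  # reward arrives only on the last press of the run

    # Simulate a rat -- or a Twitter user -- pulling the lever 20 times.
    schedule = variable_ratio_schedule([1, 2, 3, 5])
    for i, rewarded in zip(range(1, 21), schedule):
        print(f"press {i}: {'reward!' if rewarded else 'nothing'}")

Because the payoff never becomes predictable, there is no point at which the animal can conclude the rewards have stopped -- which is why, as Rutherford says, the pressing continues forever.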
KAI: Ok, so we’re all inside Silicon Valley’s box now, chasing these rewards. But how did we get in the box, right? I mean did they do this on purpose?
AMANDA: I think it’s really complicated, and I’ve been trying to figure this out -- and so one of the first things I did was call up this guy, his name is Ian Leslie --
IAN LESLIE: I am a journalist and an author and I write about -- I suppose I write about human behavior.
AMANDA: And he wrote an article titled “The Scientists Who Make Apps Addictive.” And Leslie says he sees a straight line from B.F. Skinner’s “behaviorist” ideas to what’s happening today in Silicon Valley.
LESLIE: They're really just reviving that whole behaviorist strand of psychology. But making it commercially applicable.
AMANDA: And he found that one of the most well-known people in this field is a behavioral scientist named B.J. Fogg.
LESLIE: What a great name, number one. I just like the name B.J. Fogg. And two, he’s come up with this science of persuasive technology. And you know the world runs on persuasive technology so I think going in I had a tendency to think of him as perhaps a potentially somewhat sinister figure.
B.J. FOGG: My work’s all about behavior. And how you design to change people’s behaviors.
AMANDA: B.J. Fogg directs the Persuasive Tech Lab at Stanford.
REPORTER: Stanford University is known to be where the best and brightest students come to learn…
AMANDA: This is one of his former students, his name is Tristan Harris.
TRISTAN HARRIS: B.J. Fogg, the professor, teaches a different class every year. He actually chooses a different topic of persuasive technology.
AMANDA: Harris says that B.J. Fogg taught them everything, from how casinos are designed to keep you gambling, to how you get a dog to roll over and beg using clicker training.
HARRIS: So you know, how do you do -- click click, give them the reward. Only when they do the behavior that you want, and when they don't do the behavior you want you don't do the click. Um, you have all of these Stanford computer science engineers in the class and you’re -- you know, are watching videos about a dog trainer and you're like, why am I, why am I doing this?
AMANDA: B.J. Fogg was teaching the science of persuasion and how it like, overlapped with technology. So in 2007, B.J. Fogg decides he’s going to focus his persuasive tech class on this relatively new platform called “Facebook.”
NEWS REPORTER: ...This class, which is completely packed, is all about Facebook.
AMANDA: Picture like this very bland classroom, the students have these really old laptops, they’re sitting shoulder to shoulder, it’s totally packed. And this class has become legendary.
FOGG: There has been no persuasive technology more successful than Facebook ever in terms of persuading, changing attitudes and behaviors…
AMANDA: Now at this point, the Facebook platform has just opened up. So instead of all the apps being made by Facebook…
KAI: Mm hm.
AMANDA: … anybody can make an app and upload it to Facebook.
KAI: Right.
AMANDA: So B.J. Fogg organizes his class, he breaks his students up into little teams and he has each team design an app.
FOGG: You have a way to take interpersonal persuasion dynamics and scale that up to millions of users, eventually I think billions of users...
AMANDA: And one team designs this “Send Hotness” app?
KAI: As in like sexy hotness?
AMANDA: Yeah. The idea was that you could send your friend on Facebook like, points for being hot, and I think it would rank your friends by hotness points.
KAI: Okay.
AMANDA: And this app does extremely well. Zero users, 10,000 users...
KAI: Shocker.
AMANDA: I know, yeah, 80,000 users, next thing they know they have 5 million users...
KAI: Wow.
AMANDA: ...of this app that they’re developing in a class.
KAI: Wow.
AMANDA: They put in a little ad and they make a million dollars.
REPORTER: We’re wrapping up the end of the Stanford class presentations tonight to a group of 500 people...
AMANDA: Hundreds of people show up for their end of semester presentations, for like this class at Stanford.
KAI: Because people are making a million dollars.
AMANDA: Well that’s it, so this is a big deal.
REPORTER: How are you doing BJ?
FOGG: Good
REPORTER: What you did was really put together an amazing class...
AMANDA: People see the financial potential, and investors show up for the last class too.
REPORTER: Think about the value creation. Of what was created.
FOGG: Amazing. I mean, for me as a psychologist…
AMANDA: Over the years, B.J. Fogg’s students have gone to many different companies. He taught the co-founder of Instagram, he taught people who went on to work at Facebook, and Uber, and Google. And he becomes known as “the millionaire maker.”
KAI: It's interesting, B.J. Fogg the millionaire maker, but I've never heard of him.
AMANDA: No, I'm not surprised. It's not like... he's not a household name, but his influence in these tech circles has been huge. Not only does he teach at Stanford, he wrote a famous textbook, and he's been a paid consultant for eBay and Nike and Procter and Gamble. He's basically considered a tech guru. But then 2016 basically marks the start of the backlash against Silicon Valley.
KAI: Right, because of the election, because of Cambridge Analytica and Russian interference and all of that.
AMANDA: Right. So at this point the media goes looking for a villain, someone to blame for all of these psychological techniques that are being used in our devices. And whose name pops up but B.J. Fogg.
KAI: Right.
AMANDA: And he gets cast as this potentially sinister puppet master who is harnessing the dark arts of persuasion for profit. So I gave him a call.
AMANDA: Hello, Professor Fogg?
FOGG: Hi, Amanda.
AMANDA: Hi, how are you?
FOGG: I’m doing pretty good...
AMANDA: So it took a few months, B.J. Fogg is a busy guy, but I finally reached him at his home in California and I started with, you know, easy questions first -- like what’s he best known for?
FOGG: So I was the first to say hey, there's going to be this overlap between persuasion and computers. And put those two circles over each other and say this overlapping space -- this is going to happen, and this is going to be a big deal. And there's good things and bad things.
AMANDA: He also named the study of this overlapping space something pretty terrifying. He calls it “Captology.”
FOGG: Not because of the word “capture.” That’s an unfortunate similarity, but for the acronym “Computers As Persuasive Technology.”
AMANDA: And then we talked about the Facebook class. B.J. Fogg has consistently predicted where tech was going, and he really saw the potential for Facebook back in 2007. But it’s not like he taught them an arsenal of evil psychological techniques. He says the students were just like, at the right place at the right time. And as I mentioned before, he comes up with a new class every year, so take this year’s class -- this one is inspired by his parents.
FOGG: I live part time in Maui and I'm there finishing up my book, but in the mornings I go surfing. So I surfed and I come in, and my parents were visiting us. So they were there. My parents are in their 80s. And so I walk in after surfing and then we're gonna have breakfast and I'm going to work all day.
AMANDA: Right.
FOGG: So I walk in and I say hi. And my mom is sitting there on her laptop, my dad's looking at his mobile phone. And neither one -- they both said hi but neither one looked up at me. And I was like oh my gosh.
AMANDA: B.J. Fogg’s own parents couldn’t put their phones down long enough to say hello to their son.
FOGG: So this year I decided to have a class on behavior design for reducing screen time.
AMANDA: REDUCING SCREEN TIME?! Are you kidding me? This is not what I expected from this guy.
AMANDA: I'm curious how you see your own role in, you know, in being someone who sort of taught how to keep people on the devices, who is now teaching people how to get off them. Like, what... How have you seen your role in all this?
FOGG: Yeah, but Amanda that's not fair. I haven't taught people how to keep people on devices.
AMANDA: He thinks he’s been treated unfairly by the media.
FOGG: To me what feels unfair? It’s when people say, “B.J. Fogg had this list of manipulative techniques and he trained all of these people who then went into companies and now they’re addicting your child, thanks to B.J. Fogg’s secret list of dark techniques.”
AMANDA: Right.
FOGG: And that's hurtful. So here you are, the one that's saying, hey, here's going to be the problem, and you spend a lot of time trying to foresee what the problems are and get people to take action -- just like somebody studying climate change saying, hey, this is going to be a problem. And then you blame the climate change researcher for creating climate change. That's kind of how it feels on my side.
HARRIS: You know I really want to name here that B.J. Fogg is not responsible for creating this horrible -- it’s like there was this whole discipline of persuasive technology before B.J. came along.
AMANDA: This is Tristan Harris again. He says that while Fogg was explaining how persuasive technologies work, he was also talking about how to use them ethically.
HARRIS: I think he named the field. He contributed a bunch. There's a lot of alumni from his classes that have then gone on to create some of, especially some of the more damaging products. But he personally didn't, didn't tell people: this is how you manipulate people's minds. He was just engaging people with the questions. What's sad to me is that even though he raised the ethics of persuasive technology -- because there's a whole segment on ethics -- I know for a fact several people went on to start companies that are highly manipulative users of persuasive technology, and didn't seem to hold that conscience that was trying to be imbued in that class.
AMANDA: After Harris took the persuasive tech class, he quits Stanford, he goes to start his own business, and then that business is bought by Google. And it's while he's at Google, inside this huge tech company, that he starts to see a shift -- that the goal of many of these companies stops being about connecting people.
HARRIS: Everyone that I knew who was in the tech industry, was no longer really building stuff that was about advanced technology, it was really this race for who can manipulate our social instincts better. Who can find a more creative style of notification? Who can find a more creative way to get you pulling like a slot machine to check that thing more times in a day.
AMANDA: Harris says that this was a result of an ethos in Silicon Valley that I’m sure you’re familiar with -- that everything on the internet must be free.
HARRIS: Free is the most expensive business model we've ever created.
AMANDA: That's the core mistake.
HARRIS: Absolutely.
AMANDA: You believe that that’s the core mistake?
HARRIS: Absolutely. The business model of capturing your attention means that I am not here to help you. It looks like I'm here to help you, but I'm really here to basically drill into your brain and get the attention out.
AMANDA: So I can sell it to somebody else.
HARRIS: So I can sell to somebody else.
AMANDA: Anything that keeps you on your device for as long as possible -- that is the new goal.
HARRIS: The engineers don’t distinguish between what we want, and what we can’t help but look at.
AMANDA: This is when you get bottomless scroll and videos that autoplay and constant notifications…
KAI: Uh huh. And this is it -- this is when all of this history of persuasion that you’ve been telling us about comes into play. And they come up with these new devices and these new platforms and they’re free, but they got to monetize them and so that means they’re selling our attention, they’re selling us to advertisers.
AMANDA: All that matters is what they call “Time on Device.”
KAI: Right.
AMANDA: But they have a new problem now, which is that there's only so much time in the day and there are only so many humans.
HARRIS: The companies are in a race to capture a finite resource. How much attention is out there? Only so much. Takes 9 months to grow a new human attention.
AMANDA: If you didn’t catch that, he said, “It takes 9 months to grow a new human attention.”
HARRIS: As we start to run out, how are the companies going to get more? Their stock prices have to keep going up, so what are they going to do? They're going to start fracking for your attention.
AMANDA: They have to split your attention into multiple streams.
HARRIS: So let's say they split it into four streams. They just quadrupled the size of the attention economy.
[Sounds of texting, video games, YouTube videos, and music]
HARRIS: Now there's four times as much attention because you're paying attention to four things at once, as opposed to one thing at once.
AMANDA: I totally do that.
HARRIS: We all do it. We switch our attention about every 40 seconds. So we're running this vast psychological experiment that humanity has never been through before, of what happens when two billion people are jacked into an infrastructure run by, you know, five or six tech companies. And that's shaping, you know, I think, world history. And I -- this is a bold claim but I really believe it -- I think technology is holding the pen of human history right now. So we're not choosing anymore. If we're not aware of how technology is influencing us, we are subject to the influence of technology.
KAI: Which means all the stuff we’ve been talking about over these past few episodes -- all the tools and techniques for manipulating human behavior -- it’s just reached an unprecedented scale. I mean it’s hard to comprehend. There are almost 2.5 BILLION Facebook users. So what do we do about it? Coming up, we look back at a moment when behavior modification was seen as a real danger, and it was reined in.
DOROTHY GLANCY: It was the first time that anybody said that the law had anything to say about or object about the use of behavior modification.
KAI: That’s next.
+++
KAI: Hey Stakes listeners, we have been asking you about your own addictive relationships with technology. And here’s a little more of what we’ve heard.
LISTENER 1: Checking something. Sometimes I just sort of don’t even really know what I’m doing. I might just pick it up as almost a reaction. It’s sort of like I’m almost programmed at this point with the phone.
LISTENER 2: I’ve just had this thing on my phone for so long I’m just like conditioned to pick it up and put it down for no apparent reason.
LISTENER 3: Seeing how many likes you get. Seeing how -- if it goes viral.
LISTENER 4: I think it’s less so about like, getting the one like, as it is like, if I don’t get 100 likes I don’t feel like I’m worth it, you know?
KAI: Thanks to all of you who have sent in your stories, they’ve been really great. And here’s one more thing I wanna ask you to do: If you like what we’re doing here, please leave us a 5 star review -- 5 stars -- on Apple Podcasts. It really helps others find the show.
+++
KAI: It’s recently become apparent that there were some people in Silicon Valley who went down this manipulative road on purpose. The most telling moment came in 2017 at an event put on by the news site Axios. It was there that Sean Parker, co-founder of Napster and the first president of Facebook, just copped to the manipulation.
SEAN PARKER: I mean it’s exactly the kind of thing that a hacker like myself would come up with because you’re exploiting a vulnerability in human psychology. I think the inventors, creators, you know, it’s me, it’s Mark, it’s Kevin Systrom at Instagram, it’s all of these people… understood this. Consciously. And we did it anyway.
AMANDA: You know Kai, these are the people who built these platforms, admitting that they made them manipulative on purpose. I find this shocking.
KAI: But then there’s so much money and power on the line, it just seems impossible to roll back.
AMANDA: Maybe, but there was a moment in time when we did attempt to legislate this “technology of behavior.”
BILL MCCREARY: Good evening, I’m Bill McCreary and this is Black News.
AMANDA: It was the early 1970s. And at that point in time there is still a sense that prisons might be a place for rehabilitation as opposed to just punishment. So psychologists are hired to help design new behavior programs to be used inside prisons, and these are based on Skinner’s ideas of rewards and punishment. And these programs, they spread.
MCCREARY: A new program at Butner, North Carolina has been proposed. That federal center for correctional research is the subject of controversy…
AMANDA: But quickly some of these programs go from like, incentivizing good behavior -- you know, giving out more cigarettes or more time in the yard if you’re good -- to using punishment and reward to control the prisoners, to make them more docile.
KAI: To make them just do what the warden wants.
AMANDA: And fit in with the prison system.
REPORTER: What are they trying to modify their behavior towards?
INTERVIEWEE 1: Well as far as we were able to ascertain they're simply trying to adjust them or get them submissive so that they can re-adjust to the prison society.
REPORTER: By that you don't mean readjust to life on the outside?
INTERVIEWEE 1: Oh no not at all. I mean just that -- the prison society.
AMANDA: So in North Carolina activists show up and they’re like, no way.
INTERVIEWEE 2: We can readily see that the guinea pigs will be predominantly black.
AMANDA: These activists reach out to this senator from North Carolina named Sam Ervin. And Dorothy Glancy, a woman who was a young lawyer at the time, works for the senator, and the two of them look to see, you know, are there any laws governing this stuff? And they find that there’s nothing.
DOROTHY GLANCY: There was a lot of human experimentation being paid for by the federal government and the agencies didn't seem to acknowledge that was going on.
AMANDA: They couldn’t even figure out how many prisons had behavior programs like this.
KAI: Wow.
AMANDA: So the senator had this really interesting take on this. He saw forcing behavior change as an attack on an individual’s freedom.
GLANCY: He was interested in the individual soul, in the individual mind and for people to be able to choose to follow the right path without being sort of brainwashed into taking the right path.
KAI: Right, this is not just about running more efficient prisons. This is a fundamental attack on our basic human rights, our ability to make choices for ourselves.
AMANDA: Right. So the senator’s Constitutional Rights subcommittee publishes this really brutal report.
GLANCY: It was the first time that anybody said that the law had anything to say about or object about the use of behavior modification. And the question then becomes whether we as human beings have the right to do that to other human beings. It's an ethical issue.
KAI: Wow, so what happens?
AMANDA: Well, the subcommittee demands that all of this federal money that’s been going to these programs be cut. And a few years later, there are strict guidelines that are put into place that limit experiments on human subjects.
KAI: And so the use of behavior modification can in fact be reined in.
AMANDA: Yes, but you know, back then they were largely regulating places that used public money -- so prisons, universities, VA hospitals. But now, the difference with Silicon Valley is we’re talking about private business, you know, very large private businesses.
KAI: And ones that have behavior modification as literally part of the business model.
AMANDA: Exactly.
[News montage about Facebook, Twitter, and other tech companies using manipulative practices]
AMANDA: But there are people who are trying to figure this out.
REPORTER: I just want to talk about the bill that you rolled out today, with a catalog of issues that you could regulate in social media. Why start with this one?
SENATOR MARK WARNER: Last fall I laid out a white paper that had twenty different ideas…
AMANDA: Mark Warner is a senator from Virginia, and he’s proposed a bill that limits the way tech companies can exploit us. And it specifically targets the use of psychological experiments on users who have not consented to them.
AMANDA: When you say a psychological experiment, what is it that you have in mind?
SENATOR WARNER: Well if we think back to when Facebook decided without telling any of their -- the guinea pigs that they were choosing, they took one segment of Facebook users and only gave them good news….
NEWS HOST: It was all part of a study to see how emotion spreads on social media…
SENATOR WARNER: On the other side of the equation, they started to take people and simply gave them negative news. So the fact that they were manipulating us to see how they could play with our emotions and how that would then play into the ability to sell that information from an advertising standpoint... I think that user ought to have that knowledge before they become such a guinea pig.
AMANDA: Right. Is there anything wrong with being a guinea pig, if what they're using me for is to make the product better?
SENATOR WARNER: Well, they're deciding what's better. Is better the fact that Facebook is going to be able to more profitably take your personalized information and sell that more often to advertisers? That may be better for Facebook. It may be better for the advertiser. I'm not sure that's better for the consumer particularly if you're being manipulated and you didn't even realize that you'd given up that consent to allow yourself to be used as that guinea pig.
AMANDA: This bill is just one of the ideas for fixing this kind of thing. And look, there are many proposals out there right now -- there was one recently that proposed making infinite scroll and autoplay on videos illegal.
KAI: Just banning them.
AMANDA: Yeah.
KAI: I mean honestly it sounds unlikely that that would become law, and frankly the whole thing seems so big and mysterious and hard to truly regulate.
AMANDA: That’s true. But Tristan Harris reminded me that of course Silicon Valley is not just some unknowable, impenetrable fortress. It was actually built by people.
AMANDA: Who is in Silicon Valley?
HARRIS: I think we all know the answer to that question. A bunch of young, mostly male, mostly white 20-to-30-year-olds, having studied computer science, technology, engineering, all living within now about 50 miles of San Francisco. I mean you have a very particular culture of people who are not trained to question the systems.
KAI: It’s kind of like a perfect demographic storm. I mean when you think about it that way, it’s almost inevitable that we ended up this way.
AMANDA: Well, but here’s the thing, that’s not the point he’s trying to make. The point he’s trying to make is that it’s actually a really small group of people.
HARRIS: It's a handful of executives, product designers, business people at the tops of Apple, of Google, Facebook, Amazon...
AMANDA: Harris says this is not like climate change, where billions of people have to change what they’re doing -- Silicon Valley was built by a pretty small group of people. You know, people who are profiting enormously off of… us.
HARRIS: Unlike climate change, only about a thousand people need to change what they're doing, which should be an optimistic message.
KAI: I would like to be optimistic. But part of me is just terrified by the idea of being one of B.F. Skinner's pigeons mindlessly chasing these rewards on my phone. And honestly I feel like I'm close to becoming this crazy old guy hiding in the woods. I mean, Ted Kaczynski was obviously insane, but on that point at least, I find myself in agreement.
AMANDA: I know. I feel that way as well, often, but I try to keep that part of me under control. But Harris says we just need to imagine like, a totally different future.
HARRIS: Imagine a world where the people who built the tech industry that 2 billion people live inside of were like, the critical theorists -- people who have been highly cynical and critical of all systems of power. You know, imagine a system of technology built by anthropologists. Imagine a system of technology built by psychotherapists. How different would that world look? We could have had a 2 billion person, you know, digital Jane Jacobs city, you know, that was all super liveable.
AMANDA: At the moment it is not entirely clear how we get from here to there. I imagine it’s gonna take a lot of trial and error to regulate the use of behavioral techniques in tech. But I do like what Tristan said there. It made me think of the way we live online in a totally different way.
KAI: Okay, who would you want to design the internet? Who would be your people?
AMANDA: Who would be my people to design the internet? I mean, the first thing that popped into my mind was the Muppets.
KAI: (laughs) You want Jim Henson in charge of the internet?
AMANDA: Maybe! I mean they’re just so well-intentioned!
KAI: You know what, Amanda? I am down.
AMANDA: (laughs)
KAI: Jim Henson…
AMANDA: Who did you think should do it?
KAI: I mean, Beyonce...
AMANDA: Oh interesting, the Beyonce internet.
KAI: … would certainly be a very popular internet.
(Amanda and Kai laugh)
AMANDA: Okay, Beyonce…
KAI: Certainly women. But you know what? I wouldn’t mind Elizabeth Warren controlling the internet.
AMANDA: Interesting…
CREDITS
The Stakes is a production of WNYC Studios and the newsroom of WNYC.
This episode was reported by Amanda Aronczyk.
It was edited by Christopher Werth.
Cayce Means is our technical director.
Karen Frillmann is our Executive Producer.
The Stakes team also includes… Jonna McKone, Jessica Miller, Kaari Pitkin and Veralyn Williams…
With help from…
Hannis Brown, Cheyann Harris, Michelle Harris, Rosemary Misdary and Kim Nowacki.
And hit me up on Twitter, @kai_wright. Thanks for listening.
KAI: Uhhhh how about Ruth Bader Ginsburg?
AMANDA: Ruth Bader Ginsburg.
KAI: RBG running the internet.
AMANDA: Would it be a fairer place?
KAI: I think there’d be a lot of workout videos on it.
(Amanda and Kai laugh)