Digital Life Is a Moral Mess
Regina de Heer: New technology is always coming out. Cryptocurrency is a big one, NFTs, et cetera, and people are talking about the morality and ethics around them. If you knew that technologies like those were being used for illicit activities, would it prevent you from using them in the future?
Respondent 1: That would affect my use of crypto if I knew.
Respondent 2: Bitcoin is good, NFTs are good, but I'm old school. I'm not going to let that control my life.
Respondent 3: Yes, you can buy some illegal things, but I don't know, in real life, we could use and buy those things too. I don't know. Maybe we can start from the education.
Respondent 4: The shift that's going to happen with the crypto market and that being our new form of currency, which is going to happen, we want to be smart. You can become a millionaire off of it.
Respondent 5: You have to look at the trends. You have to be realistic. One thing I say, play the game. Don't let it play you. You have to be smart in everything that you do.
[music]
Kai Wright: This is The United States of Anxiety. I'm Kai Wright and welcome to the show. We're talking tonight about moral choices. I'll say this episode is prompted by a question that one of you sent to us. Let me bring in Kousha Navidar, who is one of our producers who you are used to hearing from whenever we go through the listener mailbag like this. Hey, Kousha.
Kousha Navidar: Hey, Kai.
Kai Wright: Set this up for us. What question came to us and in response to what?
Kousha Navidar: Yes, last month, we did an episode about cryptocurrency, the promise and failure of the technology to level the financial playing field. One of you listening at home heard this episode and sent us a provocative question.
[music]
Tim: Hey, Kousha, this is Tim from Brooklyn. Just listened to the episode on crypto and wanted to know a little bit more about the moral implications of putting your money into a market or a coin that seems so likely to be used for nefarious purposes, drugs, human trafficking, et cetera. Seems like Bitcoin and those cryptocurrencies can be a little dangerous. Thoughts?
[music]
Kousha Navidar: Thank you, Tim. Kai, I love this question, but here's the challenge. How do you actually answer a question about morality?
Kai Wright: It kind of sounds like you need a philosopher.
Kousha Navidar: Yes, so I called one.
Kai Wright: You called a philosopher?
Kousha Navidar: Yes. Dr. Christopher Robichaud. He's actually an old professor of mine too. Senior lecturer in ethics and public policy at Harvard. He's also really good at explaining philosophy through superheroes. We talked about Tim's question. Here's what Dr. Robichaud had to say. Dr. Robichaud, I am so excited to see you again. Right at the top, I just want to ask, have you ever heard of a question like Tim's before?
Dr. Christopher Robichaud: I think it's a great question. I have heard this question before more than I've asked questions like this of myself. Behind it is a deep issue that we face with a whole bunch of other things besides crypto, which is, "Do we want to play a part in broad enterprises that we know or have good reason to believe are at least deeply morally problematic?" That's something that's an unavoidable question that all of us face living the kinds of lives that we live today.
Folks in my courses all the time will come to me with questions like, "I feel bad about participating in a capitalist society," or "I feel bad about eating meat," or "I feel bad about buying this clothing line." It's more than just, "I feel bad," right? Behind that is this, I think, pretty deep question, which is, where do we draw the line for ourselves when we know in some sense that we're a vector of suffering, that we, in some way, contribute to it by participating?
Kousha Navidar: Yes, and crypto seems especially tricky because, for a lot of people, it appears, at least, that the promise of it is a way to take back financial power. During the episode we did last month, I'd say we found a lot of tension. Like a lot of people saying, "Hey, listen, this is a way for me to get ahead in this financial world that isn't really set up for me. Maybe it is for others, but not for me." Does that factor tip the scales one way or the other?
Dr. Christopher Robichaud: It's pretty easy to say, "I'm going to take a moral stand on something because it has no benefit for me anyways and I don't want to do it." Other people don't have those sorts of choices, but what you're introducing is the thought that, well, for some people, investing in crypto, say, really does come with some promise of some benefits. I do think that if you really do believe, for instance, with crypto, that it does contribute to some really morally horrific things, that's got to go into the calculation. I promised you, for better or worse, this is the sort of thing that I do.
The deep question here is, do moral reasons trump all reasons? If we step back and be like, "Whatever else is true, morally, this would be the wrong thing," then that settles it. You're done. For other people, they're like, "No, that just goes into the calculation," right? It could help me personally. It could benefit society in other ways. It's also morally wrong. I fold that all together and come up with a result on what to do. I think that it's up to each individual to ask what they think comes with the conclusion, "This is morally wrong." For some, that's going to be the end of the conversation. For others, it might just be the beginning.
Kousha Navidar: Can we take a step back a little bit maybe? I'm thinking about somebody listening to this and they're like, "Well, what does morality even mean for money?"
Dr. Christopher Robichaud: Money per se is, to me, just a means of exchange. I think, really, the question we might be asking is, "What are moral ways in which to exchange goods and services, and what are some immoral ways to exchange goods and services?" I could imagine us saying, "Well, maybe the system of exchange that we're currently in with the concentration of wealth amongst very few and everyone else struggling, that is immoral," but money itself to me is maybe a distraction.
Kousha Navidar: If currency, let's call it, on its face is amoral, should we think about crypto differently than maybe the dollar bill?
Dr. Christopher Robichaud: It's a great question. I feel a little bit unqualified to answer that question. I think like anything else, something that really promises to be liberating has something very positive and has something very negative. I constantly think in terms of analogies, I'm old enough now to remember the birth of the internet and I remember the early promise of it, which is, "Everyone's going to be connected. You're going to be able to hear from everyone. Yay." Now, the problem is that everyone is connected and you can hear from everyone.
[laughter]
Kousha Navidar: Right.
Dr. Christopher Robichaud: With the positive comes some real negative, and I think it's healthy to always be stepping back and asking, "All right, are the--" Let's at least try to make the goods outweigh the bads.
Kousha Navidar: I love that comparison to the internet because it feels so similar to the conversation we're having now, right? Even in the way that you just phrased it with that liberation. One thing I love about your work is how you try to do it through analogies and specifically superheroes. When you talk about this idea of liberation and its promise and the two sides of the coin, does that evoke any example of maybe a superhero?
Dr. Christopher Robichaud: Sure, I'll point to the obvious, but I think it's obvious because it's so powerful. In the last panel of Amazing Fantasy #15, where we are introduced to Spider-Man, Stan Lee writes the famous line, "With great power, there must also come great responsibility."
Ben Parker: Pete, look, you're changing. I know. I went through exactly the same thing at your age. Remember, with great power comes great responsibility.
[music]
Dr. Christopher Robichaud: I really think it's important for us to know that the line is not, "With great power comes great responsibility." It's "with great power, there must also come," or "there must come," great responsibility. Now, the power is the liberating part. Peter Parker is, quite honestly, a poor, nerdy geek. He doesn't have a lot of options in front of him and, suddenly, he's Spider-Man. Suddenly, he can do amazing things. Suddenly, he has a lot more opportunities in front of him.
Stan Lee immediately comes in with, "Okay, well, you have a lot more power now. You have a lot more opportunities. Don't forget that there must come responsibility with the exercise of that." I think that money is one way and one obvious way in which one can gain some freedoms. I think, in some ways, the entire narrative of superheroes is looking at individuals with tremendous power and asking ourselves, "Are they living a responsible life with this power?"
I'm old enough to remember when Facebook was being invented and the promise of that was, "Here's a way to get people connected, meet up with your high school friends, et cetera, et cetera." Now, we're at a point where we're asking serious questions about whether Facebook and other social networks, though this one seems to stand out, are eroding democracy. Did we use this new power, this new freedom responsibly?
Kousha Navidar: That makes me wonder, and this might be reducing it a bit too far, but isn't the moral thing to do maybe to get involved and steer it in a way that promotes good for society?
Dr. Christopher Robichaud: Well, I'm always hesitant a little bit to say what's the moral thing to do, right?
Kousha Navidar: Yes.
Dr. Christopher Robichaud: The thing I will always warn about, and this is true of all of us, we often go into some things thinking, "I'm going to be the one to use this tremendous power wisely and I'm going to be directing it towards great ends," and then they usually write tragedies about those people. Then they wake up one day and they're like, "How did I end up an accomplice to all these horrible things?" Of course, the story's never like this one moment. It's always this gradual erosion of one's moral character. I think this is true for people that serve in government or anything else as well.
It's like you could always justify and rationalize something by saying, "Look, I'm just going to stay on the inside because there needs to be good done in here." We do need good people on the inside, but that should come with a big warning that we are very easily seduced into thinking that we're doing good from the inside when we're actually just part of the problem. It's hard to know without friends, family, and other support networks to help keep us honest about the role that we're playing.
Kousha Navidar: Let's bring it down to the individual, I guess, because that makes me think of Tim and the voicemail that he sent. For Tim or somebody listening to this who might be thinking about how to navigate crypto or maybe eating meat, how do they operate?
Dr. Christopher Robichaud: I'd start by saying, "Well, do you think that this is as morally reprehensible as you think?" If the answer is yes, I'd say, "All right, are there costs for you not doing it?" If the answer is, "Yes, there's high cost," I'd be like, "All right. Well, now, it's time to think about weighing costs and benefits." I guess the best way to put it: I believe that a moral life is constant work. It's really hard, and we're going to fail more often than we're going to succeed.
I think that's okay. That's not meant to be doom. I think that most of the religious traditions say something like that. I think that the moral life is a hard life. I think that when you raise moral questions, you should not run away from where they lead you to examining other parts of your life. If they lead you to places where you go, "My God, I might really need to change some stuff," don't run away from that.
I know you might say, "All of that from a question about crypto?" [chuckles] Of course, because the question that we heard was not just about crypto. Of course, it was a deep question about what we should do when we think we might be involving ourselves in something that's really morally problematic by our own lives even if it would bring us some good. That question's going to show up again and again and again in our lives. We can't run from it.
[music]
Kai Wright: Kousha, it really struck me when Dr. Robichaud asked in the beginning if moral questions should trump all other kinds of questions because, honestly, I don't really know how to answer that.
Kousha Navidar: Same. I've got a follow-up question for everyone listening. Have you ever had to make a tough call in your life where morality took a backseat to other factors? What was the choice and how did you make it? Email us a recording of your story.
Kai Wright: Yes, and I immediately think of stuff like continuing to eat meat when you don't have other options or investing in a company that does harm in the world.
Kousha Navidar: Well, funny you should say that. All these questions of technology, digital life, democracy, and inadvertent harm got me thinking about one company we'd all benefit from understanding a little bit better. Facebook or as it's called now, Meta. In fact, I called up someone else to help me understand what we can learn from their history with regards to how we're living our lives online.
Kai Wright: Okay, wonderful. We will hear more about that in just a moment. Thank you, Tim, from Brooklyn for inspiring this conversation. Thanks to Dr. Christopher Robichaud, senior lecturer of ethics and public policy at Harvard's Kennedy School of Government. We will be right back to pick up this Facebook discussion. Stay with us.
[music]
Kai Wright: Welcome back. I'm Kai Wright and I'm with our producer, Kousha Navidar this week. Kousha, before the break, you said you called up someone else to talk about Facebook.
Kousha Navidar: Yes, I don't know about you, Kai, but, for me, living online can be exhausting. If you think of it as being a digital citizen, determining which companies are doing harm or good in the world is messy, and even just feeling like I have a choice in which platforms I need to be on is messy too, especially if I depend on digital platforms to make money or stay in touch with people.
Kai Wright: You found somebody to talk to you about this?
Kousha Navidar: That's right. Shirin Ghaffary. She's a senior reporter at Recode and she does this podcast I love called Land of the Giants. Their most recent season is about the story of Facebook, where it went wrong, and where it's trying to go today. I called her up to talk through what we can learn from Facebook's story about how we live our lives online.
[music]
Kousha Navidar: I thought about how I wanted to start the story and I actually came to the point where you did start yourself. It was on a specific date, October 28th, 2021. Mark Zuckerberg, he's on-screen at this virtual company event. He makes a huge announcement. Tell us, what was that announcement? Why did it matter when you talk about that existential crisis that Facebook is going through right now?
Shirin Ghaffary: Right, so Mark Zuckerberg starts this special event. He announces a change that people had been whispering about, but no one knew exactly when it was going to come or what it was going to be. It was that Facebook was rebranding itself. It's going to change its name to Meta and, from now on, the company would not be just a social media company. The company's North Star would be this concept of the metaverse.
Kousha Navidar: Why did they do that?
Shirin Ghaffary: Well, Facebook says this has been in the works for a while. You can see, if you look back at Zuckerberg's comments over the past few years, he's been repeatedly talking about and investing in VR, AR, these futuristic goggles and glasses that you wear on your head that will interpose, basically, a computer into your real life. The moment comes at a time when Facebook is just coming off of the huge leaks by the whistleblower, Frances Haugen. It is facing this decline in its usership. It's seeing an aging user base on Facebook, right?
Facebook's becoming more and more uncool. Younger users are on TikTok or other platforms. Everyone is thinking or the elephant in the room is, "Is this just a distraction from Facebook's problems?" Is Facebook trying to say, "Don't look over here at our aging and problematic social media platforms. Come look at this shiny new metaverse," which we don't fully understand or even know what it's going to be yet, but Zuckerberg throws out this really nifty and high-def set of visuals about what this-- It looks like something straight out of a science-fiction novel like robots floating in the air and people-
Kousha Navidar: Like playing poker?
Shirin Ghaffary: -transported to ancient Rome. Yes, playing poker in a spaceship, and there's an avatar version of Mark Zuckerberg making all kinds of corny, awkward jokes and being his awkward self.
Kousha Navidar: Well, I know what that's like, so I can help you with that.
[laughter]
Shirin Ghaffary: It was quite a moment and it really felt like he was trying to evoke this almost Steve Jobs-ian type of big presentation that captures the world's attention and inspires people. It wasn't quite received that way, I would say, just given the controversy and the state the company is in.
Kousha Navidar: You feel like he was distancing himself from something. Can you say explicitly what that something was that he was trying to distance himself from?
Shirin Ghaffary: I think it's just the baggage of the past. You could hear it in his voice that he was ready to move on to something new. He's a coder, he is an inventor, and even says, "I'm most excited about building new products, new tools." The subtext there, maybe what he isn't directly saying, is that social media is pretty old at this point by the technological standard. The concept of Facebook or even Instagram, it's not really innovative anymore, right? What Zuckerberg seems to be running away from is this stagnant medium of social media that he came up with.
Kousha Navidar: Yes, and something that has created a lot of complications too in society, right? You talk in your podcast so much about the trajectory of Facebook being one that they didn't expect, I guess, if that's fair to say. Along the way, things happened that created a quagmire.
Shirin Ghaffary: I think the story of Facebook, if you could sum it up in two words, is unintended consequences. Both good and bad sometimes. Unintended in that they reached more users than probably anyone could have predicted; in a very short amount of time, they had billions of people using the platform. On the other hand, they just, time and time again, failed to see around corners about how people could exploit their platform, how Facebook and Instagram and WhatsApp can draw out or encourage, enable, or, at minimum, just become a platform for some of the ugliest aspects of human behavior.
Kousha Navidar: Let's talk about that history a little bit, just tracking the big moments. When you were reporting and, I guess, now that you're done and looking back, what was the first big moment that you identified in their history that things started to go wrong? Can you walk us through that a little bit?
Shirin Ghaffary: If you go back to Facebook's early days, all there was was you look up someone's name, you see photos of them, and you see what they call the "wall," where people or your friends can leave comments or whatever. It was really more like a directory initially than the kind of robust social networks we know today. There was no feed like we have now giving you updates on what your friends are doing and what the news is. The first big invention that Zuckerberg pushes out, the big change in his product that's really controversial, is adding this News Feed.
Kousha Navidar: I remember this. It was a big deal at the time. The way you're describing it, it sounds like a moment of original sin. Why was this such a fundamental and, it sounds like, irrevocable shift?
Shirin Ghaffary: When News Feed was first introduced, it's a big change because people are used to going and seeking out information like how you would use Google, right? You type in someone's name and then you go see them and you see what's going on in their lives. Now, you don't have to do anything. You can just sit back and relax with the invention of News Feed and see a list that's constantly changing of who's going to what party, what photos they're tagged in.
It's a massive change in just how people are consuming information on social media. It was a big deal. People didn't like it. I would say this is the first time that Zuckerberg and the company face a very strong public backlash, a real user protest. There are these groups, Facebook groups had come out by then, and people made protest groups with names like "Students against Facebook."
They got 60,000 more users in it, which at that time was a lot. I think the most popular group on Facebook at that point was one of these protest groups. Zuckerberg doesn't back down. He keeps the News Feed. First, he writes a blog post that's flippant and dismissive of people's concerns, saying, "Relax, breathe, calm down." Then he becomes a little more apologetic, but he still doesn't really fundamentally change this idea that we're going to broadcast your information to your friends.
Kousha Navidar: Is there something that you think could have been different about how it was handled looking in retrospect?
Shirin Ghaffary: In retrospect, I could see a leader who's less sure in their footing just canceling the change, right? Going, saying, "My God, we're getting our biggest user revolt ever." They knew it would be controversial. At one point, they had security guards outside their office. There was one person we talked to, an early product manager and engineer who helped develop this and, unfortunately, took a lot of the blame for it, even though she wasn't the only person working on it.
The buck really stops with Zuck. Because her name was on the product rollout, they made a group called "Ruchi is the devil." Her name's Ruchi. In another world, you could see Zuckerberg saying, "This is a disaster. Let's just stop," but he didn't. First, he dismissed, then he apologized and offered minor tweaks, but he doubled down that he knew best for users. He knew what users really wanted.
Because even though people were protesting and making these groups, he looked at their behavior and he saw that it was popular. People were spending more time on Facebook. They were more engaged. This idea of sitting back and just letting the News Feed tell you what to look at was drawing people in. That idea of a feed has just become the default mode of how any social media network exists today.
Kousha Navidar: It's so cool that you brought up Ruchi Sanghvi, who is the engineer that helped make the News Feed, because a quote of hers was actually one of the standout moments for me in the season so far.
Shirin Ghaffary: Would you raise anything or just change anything in the way you approached it at all?
Ruchi Sanghvi: I think that to anticipate all of these things would have virtually been impossible. In retrospect, had we been oracles and been able to predict all of these things, it would have probably stopped us in our tracks. I think the issue is technologists as a genre of people are just optimists at heart.
Kousha Navidar: When you heard her say that, what did you make of that?
Shirin Ghaffary: I was both surprised and not surprised. I thought that maybe she would come up with some guardrails. She gave a much more pure Silicon Valley answer to me, which is the part that's not surprising, because what she says about optimism, I think, is really true and resonates with how most leaders in Silicon Valley talk and feel, which is that if you worry too much about how your product can be misused, if you sit there and you're Eeyore about it and say, "Well, this is all the ways the world is bad," you get stuck. I actually think it was a pretty honest answer. I respect, in a way, that she was giving the purest defense of social media: that you have to accept the bad that comes with the good.
Kousha Navidar: It really resonated for me too because it was this idea that, well, it's almost impossible to know everything 18 years out. I can only look at right here. I got to say, that seems reasonable if you think about the impossibility of going through every single way a product could be misused. You'd have to look into the future, but you can't. Is it too high of a bar to tell people working on these products that they need to be oracles to guess what impact their technology will have a decade from now? I don't know. What do you think?
Shirin Ghaffary: I think, to some extent, you should be thinking about common sense. If you're building a car, you better think about whether this car is safe for people to drive, right? There are some outcomes anyone can predict. When something is as new as social media, and she was really one of the first people working on this concept of a News Feed, I don't blame her for not knowing exactly how it would be misused. I think where you can start to put more blame is when you're seeing early warning signs. Then, to be fair to Ruchi Sanghvi, I think a lot of this happened after she left Facebook.
When you're starting to see, maybe in some countries outside the US like Myanmar, how Facebook is being used in an increasingly polarized way that's potentially inciting violence and maybe leading to something that would later be known as a genocide, that's when you have to stop and say, "Okay." That was happening by 2018 at the latest, but even in the years before that. At what point does Facebook keep making the same mistakes? At what point does Facebook lose that innocence card of, "Well, we didn't know. We couldn't have guessed"? Because at some point, you can start to guess once your platform's used by millions of people around the world.
Kousha Navidar: Was there a point where you feel you could say, "Okay, so we're in the News Feed"? It's 2006, I think, that that comes out. What was the next moment where maybe you could point to and say like, "Oh, same mistake, different circumstance"?
Shirin Ghaffary: To give Facebook and social media some credit for a long time, it seemed like, "Wow, this is bringing out the best in people," because you had democratic movements. You had the Arab Spring happening on Facebook and Twitter. You had the launch of the young and charismatic President Obama coming up on Facebook and him using Facebook as a big tool to get out the vote.
Those aren't, in Facebook's view, bad things. These are great things. They're seeing how social media can be used to bring out the best in humanity. I think it does take some time before you can even find mass public events that are linked to social media in a really negative way. I think that really ramps up around the 2016 elections and also around Myanmar and the genocide in 2018, but even in the years before that, that's being discussed on the platform.
Kousha Navidar: Was the mistake again that they were dismissive of the warning signs or was it something different in that case as well?
Shirin Ghaffary: I think they were dismissive of the warning signs. Look, no one could have anticipated that Russian troll farms were going to try to divide Americans in the election even probably a couple of years before that happened. What you do see is when the news first comes out, when people start finding evidence of this, that Mark Zuckerberg calls it a crazy idea that fake news could have impacted the election.
At that point, you have to ask, "Why are they not acknowledging this bigger issue?" which is not just, "Did, technically, this one Russian troll account sway this one voter?" It's about, "Wow, our platform is being exploited and misused by people with bad intentions, by people who are trying to divide society." People are maybe being pushed to more fringe or extreme beliefs, or hating each other more, in part because of what they're seeing on our platform.
That is something that you don't see the company fully reckon with. I think, instead, you see them in defense mode, saying, "It's not such a big deal. Guys, don't blow this out of proportion," and then the evidence just keeps coming that it is a bigger and bigger deal. Actually, I think it goes the opposite direction, where maybe Facebook is sometimes blamed, in my opinion, even too much for swaying the election, partly because they just didn't step up to that responsibility sooner.
Kousha Navidar: What do you think stopped them from doing that?
Shirin Ghaffary: I think it's partly that optimism. I really do think that it is the use cases that they envision; oftentimes, these tech companies come from a limited point of view. Think about who is at the highest levels designing social media platforms. Are they people who have lived lives that are really directly impacted by cutthroat geopolitics? No, they're not people, maybe, who have lived through a genocide happening in their country or who understand how foreign adversaries could try to impact an election.
I don't think you can always expect someone with the life experiences of the people running these companies to understand these problems as quickly as someone else might. I think that plays into it. I also just think there's an instinct to defend yourself, to defend your company. You don't want to be bogged down by going and running a big investigation into exactly how the Russians misused our platform.
You just want to get more users. You want to make more profit. You want to keep going. You want to beat the competition. I don't know. It's so hard to think in hypotheticals. I do think it is worth playing out that thought experiment, though: if it weren't Mark Zuckerberg, if it were someone who, again, let's say, grew up in a country where they saw mass anti-democratic movements rising, would they be more attuned to the early warning signs of this?
I don't know because we haven't seen that play out. What I can tell you is that I think that every company in Silicon Valley is pressured to grow. I think that there are certain leaders who may have paused, who may have done things a little bit differently even at the expense sometimes of this kind of growth that we're seeing. I do think that maybe other leaders might have worried a little more about the legacy or just the problem or the baggage.
Kousha Navidar: That's a really good point. Did you see any change from the top over time in any way?
Shirin Ghaffary: It's hard to say. I think, fundamentally, no to be honest. From what I've seen, growth is still very much the top priority for Facebook. Yes, they have some more safeguards now. I would say they have more of an awareness of the negative consequences of putting growth above everything else, but growth is still number one. Facebook, it's in survival mode right now. It's in war mode, right? It's trying to beat TikTok. Its user base is getting old and uncool.
It is sinking millions and millions of dollars into the metaverse. It needs to make money. It needs to execute on its vision. It's taking some big risks and it's facing huge threats that it's never faced before on this level. This is not a time to sit and stop and do thought experiments and ponder the meaning of Facebook and its impact on the world. I just don't think, given their business priorities, that they even have the incentive to not focus on growth. They need to do that if they want to remain a company with a strong position in the economy.
Kousha Navidar: The term "incentive" is so loaded. There's so much to unpack there for me. To what extent is the issue that they're just operating in a circumstance where the incentive structure is not set up for them to do what needs to be done? Or is that too reductive, and you can make space for what you need to do to protect democracy?
Shirin Ghaffary: I think those are questions far beyond Meta's control, and so maybe that shifts some of the blame from Meta onto the larger financial and economic systems. I do think, again, there's ways to reduce harm. I think there's an argument that all this controversy that Facebook gets itself in that it thinks is a distraction, let's say misinformation or anti-democratic movements happening on the platform or COVID hoaxes or user privacy issues, all of that ends up harming the company because all of that actually impacts user trust.
All of that maybe leads to users not being as excited about using Facebook as other platforms. Look, I think the responsibility at this point is beyond Meta. I think also, there's a question of, "Well, why should we trust a company that has repeatedly made mistakes to then correct itself?" At what point should government step in? At what point do they need actual change in the laws to compel them to walk the walk?
[music]
Kai Wright: This is The United States of Anxiety. I'm Kai Wright. You're listening to our producer, Kousha Navidar, in conversation with Shirin Ghaffary, host of the podcast Land of the Giants. They're talking about the future of Meta and what we can learn from the company's story as we navigate our own lives online. Take a break. We'll be right back. Stay with us.
[music]
Kousha Navidar: Hey, everyone. This is Kousha. I'm a producer. This episode is all about how messy digital life can be, but digital life can also make us all feel more connected. Here's one way you can help us with that right now. We're building a playlist on Spotify of our summer jams. We want you to add a song to that list. If you're a regular listener, you've been hearing another one of our producers, Regina, asking for your submissions.
Well, the end of summer is approaching, and so I'm here asking you again to send us your jams, so here's what to do. Send us a voice recording telling us your song choice and why you chose it. What's the story behind why you picked the song? You can email the recording to anxiety@wnyc.org. That's anxiety@wnyc.org. The playlist, it's already live on Spotify, so start streaming it now. You can find the link in the episode description. Thank you so much and we look forward to hearing from you.
[music]
Kai Wright: Welcome back. I'm Kai Wright. We've been talking this week about the moral and ethical questions we confront in our digital lives. There are not always simple answers to those questions. As our producer, Kousha Navidar, pursued the story of Facebook and how it has or has not answered these questions, we were curious to hear from other tech experts.
John Clark: I'm John Clark. I'm currently a PhD candidate studying how we can better anticipate the impacts of AI.
Jackie: My name is Jackie and I'm a product manager for healthcare products.
Regina de Heer: What responsibility if any do you feel about the use of technology that you're helping create?
Jackie: I think as a product manager, you have a lot of responsibility for the products you create, but I think there are small trade-offs that you have to make when you're looking at a product in order to figure out, for example, should I collect more data to improve the product? I think that is definitely a bit of tension between user goals and then also company goals. That's what you try to navigate and prioritize.
Regina de Heer: When you were entering your field or looking for a job, did you think consciously about the ethics of what you would be working in?
John Clark: Yes. Basically, I recognized that the impacts of AI are going to be so profound that there's a lot of potential to have positive social benefit by doing good work in that space. I've become acutely aware of the negative mental health impacts that social media can often have. On Instagram or Facebook, we see these highly-curated snippets of the best moments in people's lives, but we see our own lives in their totality with all the pain and disappointment and frustration and failure. The sample we get through our curated feeds is not necessarily representative of how our friends actually live and what they believe. We have to be very careful not to get a skewed sense of what our social circles actually are like.
Kai Wright: Kousha actually has some personal experience in the industry as well. It came up in his conversation with reporter Shirin Ghaffary.
Kousha Navidar: I worked at Google actually for a little while about 10 years ago. I remember, I still do, I feel proud of the people that I worked with. Everyone that I worked with seemed to carry that same ethos. Being removed from the tech industry now, looking back and seeing the trajectory of the world, it makes me wonder, what was my responsibility being on the inside, being somebody who helped make these products "better," however you want to define that, whether for good reasons or bad? How much responsibility did I have, and should there have been something different about the way that I operated? I'm not sure if you could answer that, but how does that resonate for you when you think about not just the leaders, but the people inside these companies?
Shirin Ghaffary: You should have shared anything at all bad or controversial with a reporter, and then you would have done your due diligence. [chuckles]
Kousha Navidar: I just went into the industry instead.
[laughter]
Shirin Ghaffary: No, I joke. I do think, though, that transparency honestly can help. I think we know a lot more about companies like Facebook and Google and Twitter, thanks in part to employees who have brought information to the public via journalists or however they may want to. If they want to bypass journalists, I understand. I do think, at some point, if you are trying to make changes from within and it's not happening, there's, obviously, an argument that some information should be public.
I think these companies should share more. I think they don't share nearly enough, and it's not just Facebook, all the big tech companies, about what exactly is going on. Unfortunately, there's just this very small segment of society who are people who work at Google or Facebook, who really even come close to even understanding how these black boxes work. That is a great responsibility. I really do think so.
I don't think every person is going to be able to change this massive ship, this massive, massive leviathan company that they're working for if you're just a product manager or you're an engineer. I don't know. I do think that anyone who works at a tech company is in a place of privilege. If you feel like you have any kind of say and you feel a certain way, I would say advocate for it because there's not many people who even come close to that level of proximity.
Kousha Navidar: In your most recent episode, you talk about some of the challenges that Meta now has been confronting more recently. How would you compare what they're dealing with recently versus the challenges further back in the past?
Shirin Ghaffary: I think it's different in that the people reacting are different. You see the same kind of backlash where Facebook introduces a change. People are upset. The company maybe apologizes, makes some minor tweaks, but doesn't really fundamentally change course. It still doubles down on its true vision. That pattern's still, I would say, happening, but I think what's different is that the people who are reacting to Facebook now are not just college kids.
It's politicians. People in the US on both sides of the aisle distrust Facebook and its political intentions and feel that Facebook has its thumb on the scale either way. Now, they're basically upsetting a more powerful group of people who are not just students. In its early days, Facebook was only students. Now, it is world leaders. Now, it is people all over the globe. Now, it is other CEOs and powerful people who use this platform and rely on it in some ways.
As we're seeing with Instagram now, it's also these celebrities like Kylie Jenner coming out and saying she doesn't like what they're doing to Instagram. That has more of an effect. There's bigger problems to reckon with than even 60,000 college students because these are people who have the power to make or break your platform in some ways. I think that's where now maybe the company is forced to be a little more reactive than in its early days.
Kousha Navidar: Yes, and the requests they receive seem to come with increasing pressure and complexity. There's a story from earlier this summer. Nebraska police obtained from Meta private Facebook messages between a teenager and her mom to investigate an alleged abortion. If you missed the story, let me quickly lay out the facts. These two people are facing criminal charges related to the alleged abortion and mishandling of the fetal remains. Later, Meta released a statement saying the warrant hadn't mentioned abortion and the company received it before Roe v. Wade was overturned.
Shirin Ghaffary: The point of the matter is that Facebook shared information about this person's very personal matter, right? At a time right now when there's so much anxiety and real fear that women are going to be prosecuted for what they do with their bodies because of the overturning of Roe v. Wade. It's also just the moment that this revelation comes in, even though this whole incident happened before Roe v. Wade was overturned, which is what Facebook keeps saying in its defense.
Again, it's a very PR type of reaction to this fundamental problem that they are ignoring and not really addressing, which is, what are you going to do when the government comes knocking on your door asking for more and more information about people's private lives? That's going to happen more with what's happened with Roe v. Wade. It's going to happen every day. It's going to become routine. I think this kind of case is the first we're hearing, but there's probably been more in the past and there will be many more in the future.
Kousha Navidar: It's hard for me to hear a story like that and not think about the future of our data and our identities online. Is it even possible to say what that future looks like?
Shirin Ghaffary: I personally tend to dismiss a lot of privacy concerns about data on social media companies, at least in my own life. Something like this makes you pause, makes you think, "Wait a second." We can't always anticipate that a conversation you and I are having today, one we may think is perfectly legal and fine, may become illegal tomorrow in the world we're living in, in times of political change and a changing of people's values in this country. That's a little scary, to think that these companies have a log on you of every conversation you've had. These things could be used against you in a court of law for sure.
Kousha Navidar: I got to say, as an individual, it feels like I'm often at the whim of whatever a company decides to do if I depend on that platform, if I want to be a part of economic activity, social activity, anything. Am I powerless or are there things that I can do to take back agency?
Shirin Ghaffary: I think there absolutely are things you can do. Again, I think it's like voting. Even if as an individual person, you think, "Oh, this is pointless. It's just a drop in the bucket." We know in the whole, it isn't, right? Same thing with what apps you choose to use. If you really don't like an app, there are alternatives.
Kousha Navidar: Right now, a lot of people are talking about the potential of crypto, the metaverse, obviously, Web 3.0, all of these buzzwords. They are real technologies and they could make a real difference in the future. Another decade from now, do you think we're going to have similar stories about how these technologies showed so much promise but ended up creating great harm and there was a canary in the coal mine that we could have pointed to?
Shirin Ghaffary: Yes, I think we're already having those conversations. I really think this is where Facebook or Meta does have a chance to do it better. It's early. They are saying, they're promising, that they've learned lessons from the past. They're saying they're catching stuff earlier, but we are hearing some of the same early problems, I'd say. I think they may be getting a little faster at fixing it.
There was one report about a user testing group for one of Facebook's metaverse products, their hangout social space called Horizon. A woman who was a user tester said that she was being virtually groped by other users, by male users, in the metaverse. Meta responded and started to introduce these bubbles. They say now that if you're in the metaverse, your avatar cannot automatically go up to another avatar. You have a personal safety net around you.
I think that's an example of, "Okay, not great that this happens. Couldn't they have thought of this sooner?" On the other hand, it wasn't already a mass-adopted tool the way that Facebook was when it was being used for genocide, right? This is still relatively niche and this is still something that they're seeing in user testing maybe before the product fully launches. I think we'll see. I think even with a company like Meta, I want to give them a chance to see if they really improve even though, in the past, they've made repeated mistakes.
I do think we're definitely going to see all the same problems about bullying, harassment, misinformation, I think maybe even more amplified. Because if you think about it, this metaverse vision, if it's fully executed, feels more lifelike. It feels more immersive. There's more senses involved. This groping felt more personal than if someone were to type out, "I'm groping you," because you feel like you're in an alternate world.
This avatar is coming toward what you feel like is a version of your physical body. You can think of how far this goes. What if we start to introduce actual touch? There are sensors and things that can recreate feelings that we don't experience right now in our 2D digital world. I think we have to be more concerned about these problems and just, I think, give feedback and give it early. I'm glad that that user tester shared her experiences, right?
Kousha Navidar: What surprised you the most about Facebook's story?
Shirin Ghaffary: That's a good question. I would say just how much they were worried about competition at every stage. I always thought that Facebook was number one, and it has been for so many years. For nearly two decades, it's been the biggest social media company in the world. I think now more than ever, with TikTok, I'm understanding just how much, at any point in time, Instagram before Facebook bought it could have overtaken it, WhatsApp before they bought it could have overtaken it. Twitter, Vine, what have you, all these companies were constantly threatening Facebook's existence.
Some of them, Facebook wins because they buy them like Instagram and WhatsApp. Others like, let's say, with Twitter, Facebook's just better at scaling and growing and monetizing. All those great values we talked about are ultimately what helps them win. I think what was surprising to me was just how fragile Facebook saw itself at times and how it never rested on its laurels. I think it needed to do that. As bad as the company was at seeing around corners regarding the societal impact, regarding Cambridge Analytica, or Russian misinformation, or what have you, they are great at seeing around corners with competition most of the time.
I think maybe until really TikTok, they really saw, "Oh, crap, we got to do something about mobile. We're going to buy Instagram," or "Oh, people are messaging each other a lot and not necessarily using Facebook. We got to get WhatsApp," or "We're going to copy Snapchat's Stories feature because people aren't into posting photos on the grid. They want to share more ephemeral content." I think what was surprising to me is just how much Facebook is always looking over its shoulder and good at shifting to win. They're good at looking over their shoulder for the competition, but maybe not for the good or bad of society sometimes.
Kousha Navidar: What do you hope people take away from all the reporting you've done about the Facebook-Meta story?
Shirin Ghaffary: There's so much controversy about Facebook. I think a lot of people are just so critical of the company. On the other hand, people inside the company, they feel that that criticism is overblown a lot of the time, right? They get defensive about it. What I really want people to do is whether you're critical or more positive about social media, I want you to be able to understand the other side.
Maybe take away something where you go, actually, to Facebook's credit here, that may have not been an intentional mess-up. There may have been some things out of their control that contributes to this. On the other hand, I want people inside the tech industry to understand, "Hey, here's where we can stop and reflect and actually maybe think that something went wrong." I hope we're doing that.
Kousha Navidar: Thank you so much for spending the time and also for your reporting.
Shirin Ghaffary: Yes, it was such a pleasure. This was so fun. Thanks for having me. I love you guys' work.
[music]
Kai Wright: That was our producer, Kousha Navidar, in conversation with Shirin Ghaffary. She's a senior reporter at Recode and co-host of Land of the Giants, which is a podcast that examines how the biggest tech companies rose to power. This season explores how Meta, formerly Facebook, arrived at this moment of transition. Kousha, what really struck me in that conversation was Shirin's reflection that even if there hasn't been a mass exodus from Facebook, there has been a gradual march away from it. Empires do fall.
Kousha Navidar: Yes, Kai. For me, it's personal. A lot of times, I feel stuck about where I need to be online. I bet a lot of other people feel that way too. How do you choose, and what do you do when those options come with baggage, like with Facebook, which creates some harm in the world?
Kai Wright: Indeed.
Kousha Navidar: It goes back to our question from our first segment. Have you, listening, had to make a tough call in your life when a choice you made created some moral harm? What was that choice? How did you make it? Email us. The address is anxiety@wnyc.org and bonus points if it's a voice memo.
Kai Wright: Okay. Thanks, Kousha.
Kousha Navidar: Thanks, Kai.
[music]
Kai Wright: That's it for tonight. The United States of Anxiety is a production of WNYC Studios. Sound design by Jared Paul. Joe Ford mixed this episode. Matthew Marando is our live engineer. Our team also includes Emily Botein, Regina de Heer, Karen Frillmann, Kousha Navidar, and Rahima Nasa. I'm Kai Wright. Keep in touch with me on Twitter and Instagram @kai_wright. Thanks for listening. Talk to you next week.
[music]
Copyright © 2023 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.