Melissa Harris-Perry: Whenever you enter a term into a search engine, whether it's debt ceiling or recipes for chocolate chip cookies, you're likely to be flooded by results in a split second. Now, having so much content available at our fingertips can be thrilling, but what's less apparent is how much bias is baked into the algorithms that bring certain results to the top of our Google searches and suppress others.
Dr. Safiya Noble is an Associate Professor at UCLA in the Departments of Gender Studies and African American Studies. She's also the author of the best-selling book Algorithms of Oppression: How Search Engines Reinforce Racism. In her research, she's extensively detailed the negative impacts that come from rarely having women of color, particularly Black women, programming algorithms for popular search engines. When I spoke with her, she began by explaining where we're most likely to see bias in the context of algorithms.
Dr. Safiya Noble: Well, of course we're watching the news unfold around social media and the way in which social media algorithms are designed to foster the most engagement. Typically, that means with the most titillating or egregious or inflammatory kinds of material. What that does is of course contribute directly to anti-social behavior, harm, depression, addiction, all kinds of concerns.
In other spaces where I focus a lot of my work, around Google and commercial search engines, you find that there can often be a tremendous amount of racial and gender bias. This really was evident to me about 10 years ago, when I first started looking at how girls of color in the United States were so profoundly misrepresented in a search engine.
You could do searches on Black girls, Latina girls, Asian girls, and have the majority of the first page of results come back as pornography or hyper-sexualized content. Of course, that's an incredibly inappropriate matchup for girl children of color, to be represented and misrepresented in these ways. Also, it tells us so much about how vulnerable communities and people are neglected and really not thought about in the design of these algorithms.
Melissa Harris-Perry: On the one hand, it is an indication of a harm being done when that search material comes back, but is it-- And I mean this really [unintelligible 00:02:31]. Is it an accurate algorithm in terms of demonstrating how Black girls and brown girls are represented and thought of within the social system?
Dr. Safiya Noble: Well, there's no doubt that this kind of racist and sexist and misogynistic representation, or misrepresentation, has been hanging over the lives of Black women and girls of color for centuries. These ideas predate the internet. One big difference, though, I think, is that when those kinds of artifacts circulated in Hollywood and in popular culture as racist artifacts, we had much more ability to comprehend the subjective nature of that. We knew what it was when we saw it, and we could understand the role it was playing in popular culture.
When people engage with something like a search engine, they truly believe they are in a knowledge portal. They don't think they're in a large multinational advertising platform that really privileges the highest bidders, whoever is willing to optimize and pay Google the most. They think that this is credible.
Melissa Harris-Perry: Can you say more about large tech companies, like Google, like Facebook and the ways they make choices to allow or create algorithms that lead to this kind of harm and exploitation?
Dr. Safiya Noble: What I've seen over the last decade is that in response to both my critiques and other people's critiques, terrible news stories, public relations fiascos, Google will change and suppress and modify the results. It'll down-rank the terrible things. One thing that helps us understand is that the large tech companies are able to manage, and they do respond to, public relations crises. We might remember the infamous Photoshopped image of First Lady Michelle Obama that had a racist image of a monkey superimposed over her face.
When you did a search on her name, this image would come up first in Google Images. Of course, that elicited quite an outcry from the media and the White House, and Google downgraded that image so that it didn't come up at the top. What we know from their business practices, and this includes Facebook and other social media companies, is that in the countries where they do business, when illegal content moves around their platforms and becomes visible, they take it down. They hire armies of content moderators to screen content.
Of course, there have been amazing books written about content moderation. What we know is that these companies are not just feeding back to the public what the public is doing. That's probably one of the greatest misconceptions: people think, "Well, we get these results because that's just what people are doing." What we know is in fact that some of the most egregious kinds of material have high engagement, and they circulate. Racist and sexist ideas are big business in America. It's one of the reasons why we come across those kinds of ideas.
The problem here, again, is that many people in the public, when they're working with search engines, really think that there are some experts or gatekeepers of knowledge. They think maybe there are librarians or people who are curating and making sure that the best, most truthful, credible information comes to the top, and nothing actually could be further from the truth when you're looking at certain types of social information.
Melissa Harris-Perry: It's so useful to hear you frame it this way. It's like the danger of reality TV versus scripted. If we're watching a sitcom, we know that it is fiction, but we can watch reality TV and believe it to be true. Believe it to be a documentary of some kind, not-- Documentaries are themselves edited, but we understand it to be just a representation of what is, rather than these profit-driven choices made by these large companies.
Dr. Safiya Noble: Yes, and I'll tell you, that's a great analogy, because in unscripted television and these kinds of reality TV scenarios, the most inflammatory, the most sexualized, the most stereotypical and derogatory kinds of unscripted content also sell. Especially, I think, for Black people, many of us find that our lives are much more boring and ordinary and mundane and regular than the way we see African-Americans represented in reality television. Part of that is because playing off of these kinds of tropes in our society, again, is incredibly lucrative.
People tune in to watch the train wrecks of it, but of course it also does the work of confirming for so many people who never engage with Black people that maybe this is an accurate portrayal. Those are the things I worry about in search engines: people do believe that a lot of the things they encounter are accurate, because when you look for banal, boring information, like, "Where's the closest Starbucks?" you're going to get accurate information. Those kinds of accuracies on the mundane reinforce your trust in things that might be incredibly subjective.
Melissa Harris-Perry: All right. I also love that you point out, one, that in the mundane we do get accuracy, and that when there is pressure we get change. How might regulation, whether from the FTC or Congress, or from some other source, get us to more accuracy and to change?
Dr. Safiya Noble: First of all, that we are even able to have these conversations about regulation puts a smile on my face so many mornings, because I can remember a decade ago when we talked about regulation. People shuddered; they just couldn't imagine it, the way in which that would be an encroachment upon the tech industry. But of course, the tech industry is one of the most powerful and, for the most part, unregulated industries in the country.
It has so much control and effect in our lives. Yes, absolutely. The Federal Trade Commission was the focus in my book of where the interventions could come from, because I think so many things we deal with on the commercial internet that we all are on are matters of consumer safety and safety for the public. I think we have to think about what monopoly and anti-trust approaches could do to diversify the field of play, so to speak, and to give the public more choices, but also, more importantly, to give consumers more protections. I think we need an office of digital rights in the Civil Rights Division of the Department of Justice. I think victims need to be able to have recourse beyond just pursuing civil claims. We don't have enough laws on the books yet to protect people. I think of groups that I'm a part of, like the Cyber Civil Rights Initiative, who've done the critical work of getting things like non-consensual pornography, or revenge porn, laws on the books.
In the majority of the states, that has come from relentless organizing, and that has made a huge difference in the lives of mostly women, who have been trolled with revenge porn and who've lost their livelihoods or been in harm's way. Laws are important, regulation is important, but I think there's another part that I always want to surface, which is that we have defunded so many of the democratic public institutions and counterweights to this environment of tech company control in our lives.
College is out of reach for so many people. It's not affordable. Our K-12 schools are crumbling and are being defunded. We don't have robust public health and public media systems.
When you have a sector that, for the most part, doesn't pay taxes and engages in corporate tax evasion to the nth degree, the loss of those resources not only weakens all of these other kinds of institutions that would strengthen our literacy and our resolve and our democratic fabric, but it also, I think, creates market conditions, when those institutions are weakened, for those companies to rush in with their tech solutions and to continue to grift off of the public. Those are some of the things that I think are really important to also have in the conversation.
Melissa Harris-Perry: Dr. Safiya Noble is a professor at UCLA and author of Algorithms of Oppression. If you think that what you heard was brilliant, it is perhaps because Dr. Noble is also a newly minted MacArthur fellow, colloquially understood as a genius. Thank you for joining us today, Dr. Noble.
Dr. Safiya Noble: Thank you.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.