Micah Speaks to Kyle Chayka About Filterworld
Micah Loewinger: Hey, it's Micah Loewinger. You're listening to the On The Media Podcast Extra. Before I landed a job at this show, I worked for a few years on and off at a couple of record stores around New York City. Some of my favorite albums to this day were recommended to me by my co-workers, men and women who I consider to be archivists. Not just of old formats like vinyl records, cassettes, and CDs, but of under-appreciated artists and niche genres. A knowledge of music history that can only come from a lifetime of obsessive listening, research, and curation.
Nowadays, I pay for Spotify. I try to learn about music off the app and then save it for later listening on Spotify, but sometimes I find myself just letting its recommendation algorithm queue up the next track and the next. It definitely works. Spotify has helped me discover great music, but it's never been as revelatory as a personal recommendation from a friend or an expert at a record store or an independent radio station. This feeling that I've traded convenience for something deeper is what made me want to read Filterworld: How Algorithms Flattened Culture by Kyle Chayka, a staff writer at The New Yorker. Chayka says apps like Spotify and TikTok are great at studying user behavior, but that we should be suspicious of the idea that they can really know your taste.
Kyle Chayka: The algorithm as your best friend, as the intimate knower of your innermost secrets, is definitely what Spotify, Facebook, X, and TikTok would love for it to be. You forget what you do on these platforms. You are not aware of every time you click into a Spotify track. You're not aware of when you favorite an album. On TikTok, you're absolutely not aware of every microsecond that you linger, every video you flip past, or what you pay attention to a tiny bit longer than something else. But it does know exactly what you're doing, and it doesn't forget, like that one time that you lingered too long on the shower tiling video. It's like, "You remember those shower tiling videos? Let me give you some more."
Micah Loewinger: Exactly, that happens all the time for me on Spotify, where it will recommend something to me based on patterns that I'm not perceiving. It would be good to define our terms a little bit. I know Spotify's algorithm is a trade secret. We don't know exactly how it works, but as you write in your book, there are clues in the literature on the development of recommendation algorithms that suggest how it likely works.
Kyle Chayka: Most recommendation algorithms are black boxes, not because they're impossible to figure out, but because the company itself does not want you to know how it works, because you might game it, because that might disrupt how it works, and that would ruin their product. A lot of them work along the same lines, essentially measuring a bunch of variables about the content that's on their platform. How many times people have clicked it, what the faves are, what the retweets are, what the time watched is, and then using that to figure out what to promote more and what to push off to one side or another.
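[Editor's note: a minimal sketch, in Python, of the kind of engagement-based ranking Chayka describes here. The signal names and weights are illustrative assumptions, not any platform's actual formula.]

    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        clicks: int
        faves: int
        reshares: int
        seconds_watched: float

    def engagement_score(post: Post) -> float:
        # Hypothetical weights; real platforms tune these constantly
        # and keep them secret, as Chayka notes above.
        return (1.0 * post.clicks
                + 2.0 * post.faves
                + 3.0 * post.reshares
                + 0.1 * post.seconds_watched)

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Promote what scores highest; push the rest off to one side.
        return sorted(posts, key=engagement_score, reverse=True)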
Micah Loewinger: You referenced this 1995 MIT Media Lab paper that described social information filtering. Can you describe that concept?
Kyle Chayka: Social information filtering is this thing where, let's say, the tastes of two users are compared. For example, there was an early recommendation engine called Ringo that worked with music recommendations, another mid-'90s thing. It asked all of its users to build lists of their tastes in music, to name a hundred bands that they liked. Then the recommendation algorithm compared the profiles of all of those people and determined which people were more like each other. Who has similar tastes to you? Then using that comparison, social information filtering says, "Since your taste is like this person's, and this person likes this band, but you don't yet, you may be likely to enjoy this band." You can see that scaled up in the digital platforms that we use now.
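[Editor's note: a minimal sketch, in Python, of social information filtering as the Ringo system framed it: compare users' taste profiles, find the most similar listeners, and recommend what they like that you haven't heard. The tiny dataset and the cosine-similarity choice are illustrative assumptions; the 1995 paper tested several similarity measures.]

    import math

    # Each profile maps band -> rating, standing in for Ringo's
    # user-submitted lists of musical tastes.
    ratings = {
        "you":   {"Can": 5, "Neu!": 4, "Faust": 5},
        "alice": {"Can": 5, "Neu!": 5, "Harmonia": 4},
        "bob":   {"ABBA": 5, "Boney M.": 4},
    }

    def similarity(a: dict, b: dict) -> float:
        # Cosine similarity over the bands both users rated.
        shared = set(a) & set(b)
        if not shared:
            return 0.0
        dot = sum(a[k] * b[k] for k in shared)
        norm_a = math.sqrt(sum(v * v for v in a.values()))
        norm_b = math.sqrt(sum(v * v for v in b.values()))
        return dot / (norm_a * norm_b)

    def recommend(user: str) -> list[str]:
        mine = ratings[user]
        scores = {}
        for other, theirs in ratings.items():
            if other == user:
                continue
            sim = similarity(mine, theirs)
            for band, rating in theirs.items():
                if band not in mine:
                    scores[band] = scores.get(band, 0.0) + sim * rating
        return sorted(scores, key=scores.get, reverse=True)

    # "Harmonia" ranks first: alice's taste overlaps yours, bob's doesn't.
    print(recommend("you"))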
Micah Loewinger: It's really grouping us with other users. That's the solution, because Spotify doesn't know what's good or bad.
Kyle Chayka: Right, that's a fundamental thing. Algorithmic recommendation is not about quality. There is no essential metric of quality. There is only attention, what is listened to more and what is listened to less. That's all the data it can take in. It can do thumbs up and thumbs down, but it can't be like, "Oh, Bach is better than Mozart."
Micah Loewinger: Spotify's recommendation algorithm is just one part of what you call Filterworld. It's the name of your book, but it's also a concept. Can you describe it?
Kyle Chayka: Yes, Filterworld for me was this single term to describe the entire ecosystem of algorithmic feeds that we exist in. When we're on the internet today, we are moving across all these different platforms, whether Facebook or TikTok or Instagram, that are all driven by algorithmic recommendations, that are constantly trying to guess what we might like and put the next piece of content in front of us based on what we've consumed before. I personally felt totally enclosed by this sphere of algorithms, and I couldn't find something or listen to something without facing that surveillance and recommendation of what I was doing.
Micah Loewinger: I want to dig into some examples of this feeling of being boxed in by algorithms at the same time as feeling that they provide us with the things that will fill our time and our hearts, TV shows, movies, albums. Let's talk about Netflix, for instance. When I open up the app on my TV or laptop, it feels like I'm being given a wide range of shows and movies tailored to me. What's really happening there?
Kyle Chayka: The homepage is supposed to be a thing that reflects your tastes and filters through things that you're going to like. More often, these categories are so broad and the labels are so vague that they don't actually promise personalization. There's like a top ten or there's a popular right now. Those shows are just what's convenient for Netflix to promote at a given time, what's popular with a certain segment of the users, and what they can most conveniently convince you to watch in a way. Netflix has this algorithmic system to change the thumbnail of a show.
Micah Loewinger: Yes, this is so creepy.
Kyle Chayka: When you go on Netflix, the images of every TV show and movie are tailored to your preferences.
Micah Loewinger: For instance, in 2018, there was a controversy where the film Love Actually was being promoted to a bunch of people. A pretty safe film to promote to a lot of people, very popular, but it turns out some people were being shown artwork prominently featuring the Black actor Chiwetel Ejiofor, who plays only a minor part in the film.
Kyle Chayka: It's so manipulative. If the Netflix algorithm knows that you watch a lot of movies with Black actors, then it's like, "I am going to present every movie as if it focuses on Black actors. In the case of Love Actually, which absolutely does not focus on Black actors, I will highlight one of the few scenes that has this man in it in an effort to get you to watch it." Not because you definitely like Love Actually, not because you're going to love Hugh Grant dancing through the halls of the government or whatever, but because it would be convenient for Netflix if you watched this movie.
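[Editor's note: a minimal sketch, in Python, of per-viewer artwork selection like the Netflix example above: score each candidate thumbnail against a profile built from the viewer's watch history and show the closest match. The tag-overlap scoring is an illustrative assumption; Netflix has described using contextual bandits for artwork selection, a more sophisticated version of the same idea.]

    # Candidate thumbnails for one film, each described by tags.
    artwork = {
        "ensemble_cast.jpg": {"romcom", "ensemble", "hugh_grant"},
        "ejiofor_scene.jpg": {"drama", "chiwetel_ejiofor"},
    }

    def pick_thumbnail(candidates: dict[str, set[str]],
                       viewer_tags: set[str]) -> str:
        # Choose the image whose tags overlap most with what the
        # viewer already watches, regardless of how representative
        # it is of the film itself.
        return max(candidates,
                   key=lambda img: len(candidates[img] & viewer_tags))

    viewer = {"drama", "chiwetel_ejiofor", "thriller"}
    print(pick_thumbnail(artwork, viewer))  # ejiofor_scene.jpg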
Micah Loewinger: This is the breakdown of the illusion of the recommendation algorithm, which claims to be shaping the viewing options for you, but it's really just convincing you that you should like something it wants to show you anyway.
Kyle Chayka: Right, that you might not actually like or that might not fall into your taste categories at all. There's another academic term, corrupt personalization, which I think is exactly what's going on with Netflix. Corrupt personalization is the image of personalization, the idea of personalization, without the actual reality of it.
Micah Loewinger: That's an egregious example of the bait and switch. I want to talk about another theme in your book, something slightly more pernicious: a recommendation algorithm like Spotify's that, inside maybe the largest library of legal music ever created, subtly encourages me to listen to the same stuff that I like over and over. How is that happening?
Kyle Chayka: There are a lot of knobs and variables that can change in these formulas, but Spotify feels pretty conservative a lot of the time, I think, if you put on an album and then let it go. Usually, within a few songs, it serves you up something that you listen to constantly, something it knows you are not going to turn off, in order to lull you into that hypnotic state of just listening to the infinite playlist and not thinking about it.
Micah Loewinger: This gets into what you want out of a listening experience or what you want out of a library of culture. My own leaps forward in music curiosity have come from listening to the radio. Big shock. I work for a public radio show and I really like radio, but I think of WFMU, the independent radio station in New Jersey, or WKCR, Columbia's radio station. I remember for the first time hearing the Indian music show, hearing a 30-minute raga, and then somebody explaining why it was interesting at the end. I had never encountered that kind of music before. It opened some avenues of inquiry. There is something, you argue in your book, that is lost when we take curation away from humans.
Kyle Chayka: Yes, human curation, and that idea of a DJ, a human person who has selected this raga, and even though it's 30 minutes, that person is like, "You are going to like this. It's important. You should listen to it with me, in a way. I will guide you into this culture. I will show you that it's important. I'll explain it to you after." That's such a different encounter with a piece of culture than what you get on Spotify or what you get in a YouTube recommendation. The job of human curators, like a DJ, a museum curator, or a librarian, is to build meaning through juxtaposition and then guide the consumer into it in a way that helps them broaden their own horizons, as you said, or brings them to a new place of taste or thought. We just don't get that from a machine.
Micah Loewinger: That said, I'm sure listeners right now are like, "But Discover Weekly has delivered some great stuff to me, or I keep a close eye on some of the high-profile Spotify playlists that are curated by humans." This is not a pure either/or, right?
Kyle Chayka: The internet is not the same thing as algorithms. There are many digital platforms that are not algorithmic. There are also ways of using Spotify that are not guided by algorithms. We can't blame algorithms for everything; they fulfill a really important function in sorting information. But I think we can take back some of our agency.
Micah Loewinger: There's this pernicious idea that with recommendation algorithms, we feel like we have agency when it's really being taken away. One famous example of this is TikTok's recommendation algorithm. The For You Page, TikTok's great innovation, you say, can have this numbing effect on you.
Kyle Chayka: The feed is so hypnotic, it slots one piece of content after the next, after the next. They're not too different, but they're a little different. They're ephemeral, they're over within a minute or two.
Micah Loewinger: You compared it to Brian Eno's 1978 album, Music for Airports. A lot of people consider it to be one of the first ambient music albums.
Kyle Chayka: He literally came up with the phrase ambient music, and what he meant by ambient was music that you could either ignore or pay attention to. Music for Airports you can just let fade into the background; it's completely ignorable. It's a nice wash of sound, but if you do pay attention to it, there is something there. You can think about his conceptual gestures. You can listen to the details of the synth washes or how things move in and out.
I think the problem with the ambient quality of algorithmic feeds is that they're too often just ignorable. It's like the "lo-fi chill hip-hop beats to study/relax to" problem: music that is designed to be ignored all the time. The idea that you'd never have to think about who made something, or go deeper into the musician who made a song or the artist who made a painting, and that you just treat the feed as the producer of the work, that is pernicious, because you almost forget the idea of the artist.
Micah Loewinger: In comparing past ways of consuming music, say, through the radio, to what you call Filterworld, we do run the risk of being overly nostalgic. The tastemakers of old, the radio DJs, the record store clerks, the critics, they had their own blind spots and biases. DJs of top 40 radio stations were swayed by money, pressure from labels, whatever they thought the public at large would respond to. That's not exactly for the pure love of music, right?
Kyle Chayka: The old algorithms were human gatekeepers who made decisions about what culture should be promoted and what shouldn't, magazine editors, record label executives, the DJs who might be influenced by payola. I don't think that's inherently good. I do think in the best examples, like an indie radio DJ who's not overseen by corporate overlords, that can create really beautiful moments of curation and the transmission of culture, but so can a YouTube recommendation. I've gotten really interesting stuff from YouTube recommendations that no person I know could have given me, and that I wouldn't have known to seek out.
Micah Loewinger: Give me an example of the algorithm serving up something that got around the calcified biases of the old gatekeepers.
Kyle Chayka: An example of something that I personally like is the Japanese genre of city pop, which was this music made mostly in the '70s and '80s. It's this very brilliant mix of R&B, big orchestras, propulsive beats, big, bold, crazy music, and it's really fantastic. It was hidden away for a long time. Japanese people were not listening to it much after the '80s. Then in the 2000s, some record DJs dug it up, and then it hit YouTube, where it just blew up because, for some reason, it worked for the recommendation engine.
A lot of people were listening to this music, they were liking it, they were engaging with it, they were seeking out more of it. City pop became associated with YouTube. In that way, the mathematical quality of it did circumvent a lot of human tastemakers. YouTube registered that this music was getting popular with an American audience long before a record label executive could do anything or even a radio DJ. It was a democratic revival of the genre of music online, which I think is really cool.
Micah Loewinger: But with the so-called democratization promised by social media comes amplification and all of the problems it introduces: algorithms picking things to go viral that otherwise might not have. You argue that any regulation of algorithms, which you explore in your book, should mandate greater transparency around what gets pushed into people's feeds. Tell me a little bit about why we should regulate algorithms and what you see as the potential avenues for that.
Kyle Chayka: Right now, there are essentially no rules about what an algorithmic feed can recommend to you or how it can interact with you. You can regulate what kinds of content get algorithmically recommended. You could say that problematic content that promotes violence or self-harm cannot be subject to algorithmic recommendation. If that was blanket illegal, as it may soon be in the European Union, then social networks would be much less likely to even touch that kind of material in their feeds. All of a sudden, you could only find that stuff if you opted into it. It would not get pushed out to more people.
There's regulation about what kinds of content can be recommended or promoted. There's regulation around transparency for algorithmic feeds, which means that we could see how something works and know what variables are being taken into account when something's promoted to us. And there's regulation that mandates you be able to opt out of recommendations.
Micah Loewinger: When you say regulation, you mean that you are seeing lawmakers, academics, and so forth, propose these ideas, but are they likely to pass and be implemented?
Kyle Chayka: In the European Union, they have passed the General Data Protection Regulation, which has caused that wave of popups that say, "Please let me give you cookies." The Digital Services Act more recently, which does mandate things like algorithmic transparency and opting out of feeds. In the US, we're way, way, way behind that. Some of these companies like Facebook are changing how their feeds work based on the European regulations, but in the US, we don't actually have any of those rules. The few efforts that have been made in government have just not gotten very far at all.
Micah Loewinger: In your conclusion, you acknowledged that the intersection between art and culture and technology has always been fraught. Cameras and radios sparked fear; so did the telephone. You even cite a writer who lamented what was lost when streetlights were introduced in Tokyo in the late 1800s and early 1900s. Are algorithms fundamentally upending how our world works, or is this part of a larger fear we have about change?
Kyle Chayka: We do always fear what technology does to culture. Culture is threatened by a thing like the camera, like recorded music, [laughs] like the radio, and then artists find a way to carry on and make great things, and then we adapt and reframe our ideas. No one's going to say that recorded music is a sin or that we should go back to only live music because that would be more authentic. There's no pure culture. But I do think the pendulum swings; we've gone so far into this algorithmic ecosystem that I think we desire to retreat from it a little bit. The same way that we had to make up regulations for seat belts and car safety, rather than letting people floor it down the road [laughs] with no safety checks in place.
The financial exchange that sustains art is so different now. Before, a musician would sell you their album and they would make money; now it's mediated by this huge algorithmic platform like Spotify, and they only make money based on certain metrics, based on streams. I think one way to retreat from that, and to go back a little ways, is just finding ways to directly support the voices that you like: a designer, or even a curator, or a DJ who makes cool playlists.
The best way we can ensure the survival of those kinds of relationships is just to pay them money. It's more expensive than a Spotify subscription. You're not going to get an infinity of music. But getting that infinity of music for $10 a month means that musicians have a really hard time making a living. Even though it's nice to be on the TikTok feed and see who you like to see, that's ultimately a hard way for your favorite creators of whatever to make a living.
Micah Loewinger: Kyle, thank you very much.
Kyle Chayka: Thank you for having me.
Micah Loewinger: Kyle Chayka is a staff writer for The New Yorker, covering technology and culture on the internet. His latest book is Filterworld: How Algorithms Flattened Culture. That's it for the On The Media Podcast Extra. Tune into the big show this weekend to hear my interview with MSNBC's Chris Hayes about his counterintuitive approach to covering Donald Trump. Subscribe to our subreddit, r/onthemedia, and follow us on Instagram and Threads to keep up with what we're reading throughout the week. Thanks for listening.
[music]
Copyright © 2024 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.