[music]
Melissa Harris-Perry: This is The Takeaway. I'm Melissa Harris-Perry.
This year marks four decades since the birthday of Dr. King was approved as a federal holiday, but celebration was not how the federal government responded to the work of Dr. King during his lifetime. From 1955 until his death, King was targeted by repeated invasive, covert surveillance operations by the FBI.
According to internal FBI documents, the goal of these operations was to "expose, disrupt, misdirect, discredit or otherwise neutralize" Dr. King and others in the movement for Black Liberation. Wiretapping was the primary technology of surveillance in the 1950s and '60s. Today, the technologies of surveillance are everywhere, and I mean everywhere.
Tawana Petty: Oh, yes. All they have to do is give you a vacuum cleaner.
Melissa Harris-Perry: Wait, wait, wait. [laughs] I'm sorry. The vacuum cleaner is a tool of surveillance? I talked with Tawana Petty, who serves as Director of Policy and Advocacy at the Algorithmic Justice League. She terrified me with this statement about vacuum cleaners.
Tawana Petty: Think about it. Okay. Let's just think of a smart home that has all of our luxury dreams. You have Alexa, and you want Alexa to keep time for you. Alexa is monitoring everything that's happening in your home. Then you have a vacuum cleaner that needs to understand the blueprint of your home in order to vacuum throughout each of your rooms.
You have a smart television that you want to be in tune with what's happening on television. You want it to be able to recognize your voice. At some point, there's so much surveillance in our homes that there's nothing left to the imagination for these tech companies, and there aren't really any real policies in place to prevent them from leveraging all of this data that they've extracted from you. They have to do very little now to surveil us because there's so much of it being volunteered.
I don't even want to say volunteered. I would say it's that conflation between consent and coercion where you really want access to this more easy way of being. In order to do that, there's a trade-off. A lot of times we're trading off, and we don't know what we're trading off for.
Melissa Harris-Perry: Are we mostly talking about cameras here?
Tawana Petty: Yes, we're talking about cameras, we're talking about drones, we're talking about surveillance traffic systems, we're talking about pretty much any mechanism attached to any level of power that can inform a form of social control or behavioral control on a community. Without the appropriate regulations and checks and balances in place, we could quickly move into that type of targeting for community members who are still actually fighting for racial equity.
If you don't have a racial justice lens, if you're not integrating racial equity within these systems, then you have a surveillance creep, and you start to recognize that it's targeting particular demographics. It's not just the digital realm. It's actually the policies and the principles and values that institutions hold, that individuals hold, that governmental systems hold that can make these systems very harmful.
It shifts you from this seeing one another to this watching one another. It's really a mindset shift that we have to have when we're starting to think of addressing societal issues.
[music]
Melissa Harris-Perry: Quick pause. More on The Takeaway on MLK Day right after this.
[music]
Melissa Harris-Perry: We're back and still with Tawana Petty, Director of Policy and Advocacy for the Algorithmic Justice League. We're talking about how police are making use of AI in surveillance.
Tawana Petty: When facial recognition or face surveillance was proposed in Detroit, as an example, when it was initially launched upon the community, they were using real-time tracking, which meant that a community member could be walking through the neighborhood and a law enforcement officer could leverage their cell phone, as an example, to track that person's every move and attach it to a facial recognition database.
It was kind of like being in a perpetual lineup all day throughout the city. You're in a lineup and potentially being matched to a crime. There was a lot of social justice advocacy and pushback, and protests that then forced our law enforcement officers and city government to put some policies in place that prevent them from doing that type of targeting.
Melissa Harris-Perry: Talk to me about the cost. Are cities suggesting that they're using these technologies because it lowers the cost of policing for them?
Tawana Petty: It's costing businesses as well as taxpayers money. One of the things that is really mind-blowing about this particular program is that these businesses are paying for policing, essentially. They become priority one when they buy into Project Greenlight. It's almost like coercion. If you want your business to be prioritized, if you want to get police attention, then you will pay this monthly fee to be treated as priority one when a crime happens in your establishment.
It creates a very contentious relationship between businesses who don't want to have that type of surveillance imposed upon the community members who enter their establishment and the businesses who do. Then if you're a community member who feels as though these businesses are being prioritized over you, then you're going to want to be a part of that program as well because you want to be prioritized. It creates this looping cycle of injustice, essentially, where if you don't buy into the system, then what type of service are you going to receive?
Melissa Harris-Perry: Is there a way for us to move into a future that is going to increasingly make use of AI, that is going to be more highly digitized and automated, but to do so with meaningful racial justice gains?
Tawana Petty: I will say, as an example, and it doesn't do enough, particularly around law enforcement use, but there is this conversation now with this Blueprint for an AI Bill of Rights that I do think is important that came out of the White House. There are five principles. You should not face discrimination by algorithms, and systems should be used and designed in an equitable way.
There's data privacy. You should be protected from abusive data practices via built-in protections, and you should have agency over how data about you is used. Notice and explanation. You should know that an automated system is being used and understand how and why it contributes to outcomes that impact you.
Finally, and to me, most important, human alternatives, consideration, and fallback. You should be able to opt out where appropriate and have access to a person who can quickly consider and remedy the problems you encounter. If many more tech companies and corporations and governmental agencies applied even those basic core five principles, we would have less harmful systems.
Melissa Harris-Perry: Tawana Petty is the Director of Policy and Advocacy at the Algorithmic Justice League. Tawana, thank you so much for your time.
Tawana Petty: Thank you for having me.
Copyright © 2023 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.