BRANDY ZADROZNY This is On the Media, I'm Brandy Zadrozny. Thanks to Micah's reporting, we all have a better sense of how journalists are navigating the use of hacked or stolen data in their reporting, but what about data that's been purchased? Last week, The Pillar, a Substack newsletter dedicated to Catholic news, used a mysterious data set to identify a high-ranking Catholic priest, a monsignor. His name has been widely reported elsewhere, but we're not going to say it here. They used the data to identify the priest as a user of the hookup app Grindr. Next, they plotted out his movements from his home, to work, to gay bars, and private residences. It was all made possible by data that Grindr itself collected. Data subsequently made available for legal purchase by a data broker, which somehow got into The Pillar's hands. The priest resigned from his post at the United States Conference of Catholic Bishops the day the story broke. I asked Sara Morrison, a data and privacy reporter at Recode at Vox, how this data came up for sale in the first place.
SARA MORRISON You have the apps on your phone, and a lot of times they'll ask you for your location in order to work, and that data will not just go to the app itself. A lot of times the app will have code in it that sends it to a bunch of other places. Facebook's one of them, Google's one of them. And then there's probably a bunch of other companies that you have never heard of, and you don't really know that this is happening. You think you're using an app and giving that app your data, but you're giving it to who knows who else. Or some of these companies might just buy it directly from the app developers. Either way, what they end up with is very granular data.
BRANDY ZADROZNY That data is collected under the pretense that it's anonymous, right? So how was The Pillar able to identify the monsignor from what they call, I think, commercially available records of app signal data?
SARA MORRISON The term that a lot of privacy experts like to use is not anonymized, but de-identified, because it can be re-identified depending on how it's stored. So if you just know a device is going to certain places and spending time there at night, it's not that hard to correlate it with a specific person.
BRANDY ZADROZNY Cross referencing, right? So if someone knows that I work at 30 Rock and knows where I live, well, there probably aren't so many people going to those two places consecutively, and boom, there I am.
SARA MORRISON Yeah, exactly.
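[To make the cross-referencing idea concrete, here is a minimal sketch in Python of how re-identification from "de-identified" location pings can work. The records, device IDs, field names, and coordinates are all hypothetical, and this is not the method The Pillar or any broker describes; the point is only that knowing two places a person frequents is often enough to single out their device in a data set.]

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

# Hypothetical "de-identified" broker records: (device_id, lat, lon, hour_of_day).
pings = [
    ("device_0042", 40.7590, -73.9845, 10),  # near 30 Rock, mid-morning
    ("device_0042", 40.7282, -73.9942, 23),  # near a home address, late night
    ("device_0091", 40.7590, -73.9845, 11),  # near 30 Rock only
]

# Two publicly known facts about the target: where they work and where they live.
WORK = (40.7590, -73.9845)  # 30 Rockefeller Plaza (approximate)
HOME = (40.7282, -73.9942)  # hypothetical home address

def near(place, ping, radius_km=0.2):
    """True if the ping falls within radius_km of the known place."""
    _, lat, lon, _ = ping
    return haversine_km(place[0], place[1], lat, lon) <= radius_km

# A device is a candidate if it shows up near work during the day AND near
# home late at night -- very few devices in any data set will do both.
at_work = {p[0] for p in pings if near(WORK, p) and 8 <= p[3] <= 18}
at_home = {p[0] for p in pings if near(HOME, p) and (p[3] >= 22 or p[3] <= 5)}

print(at_work & at_home)  # {'device_0042'} -- the "anonymous" ID is now a person
```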
BRANDY ZADROZNY OK, I'm going to ask you about the ethical questions that this is raising for journalists who use or want to use this kind of data. But first, you say that an ethical line was crossed way back when the data was initially collected. How so?
SARA MORRISON So I don't think the ethical line was crossed by the journalists. I think the issue is that this data exists in the first place, and you don't really know that your data is being collected this way or who it's going to. I think The Pillar said, well, you know, this guy consented to this. So, you know, it's fair game.
BRANDY ZADROZNY When he consented to it, was that in the terms of service, that thing that nobody reads?
SARA MORRISON Yeah, but they always say your data is anonymized, so I think it's very reasonable for people to assume that they're anonymous and it's just used to, like, market products to you, right?
BRANDY ZADROZNY Are there any rules around who brokers are allowed to share this data with?
SARA MORRISON You know, not really. Apple and Google have, like, app store rules that say you can't sell granular location data sourced from your apps, but Apple and Google are not the police. And actually, the police can sometimes buy location data and use it rather than getting a warrant. So it can go to a lot of people; government agencies buy this stuff for investigations. My point has been for a long time, and I think others' too, that we don't have laws governing how this data is collected, stored, or shared. So nobody gets to know where their data is going and who gets it, and that's just an environment for all kinds of abuse. This is a story that demonstrates that.
BRANDY ZADROZNY OK, so what kind of awful future does this new story signal, and what can we do to stop it?
SARA MORRISON I would say it more signals an awful present. These companies will say self-regulation is something we do. That's OK, whatever, but it's not enough. So I would say real laws. We've seen some: the European Union has a law, California has one, Virginia and Colorado have laws. If there's a bright spot to a story like this, I think it better illustrates to people how this data is collected and how it can be used. As a privacy reporter, one of the things I struggle with the most is trying to explain how this stuff works. If somebody steals something from your house, or somebody follows you around on the street, you see them. With this stuff, maybe you see an ad for a product that you just bought, and that is when you go, "oh, that's creepy." So I sort of hope an article like this shows people the extent of all of this and the power and invasiveness that these companies can have over you.
BRANDY ZADROZNY Sara, thank you so much.
SARA MORRISON Thanks for having me.
BRANDY ZADROZNY Sara Morrison covers privacy and personal data for Recode at Vox.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.