The Covid Conspiracy Boom on Facebook
Bob: To doctors and scientists, the novel coronavirus is one of the most complicated biological mysteries in human history. To an astonishing number of quacks, scoundrels, and fools, it is easily explained.
Male 1: Facebook says it has removed seven million posts about fake cures for the virus or prevention measures.
Female 1: That you can actually check yourself, that if you can hold your breath for 10 seconds, you don't have the coronavirus. That's absurd.
Male 2: A false Facebook post earlier this year claimed that chopped garlic with boiling water could cure coronavirus overnight.
Bob: Not that fake news on Facebook is anything new. In a 2020 survey of misinformation and disinformation during the last election cycle, Facebook spread more lies than both Google and Twitter, and it was abused by a variety of bad actors and their dupes to sabotage the election. Democracy here and around the world was a casualty of a business model that prized growth over the public good.
Now comes the catastrophic pandemic, and despite the company's promises to protect its users, the data show that nothing has changed. According to Avaaz, a global non-profit that works online to protect democracies from disinformation on social media, Facebook's algorithm has helped bad actors rack up a staggering 3.8 billion views for COVID conspiracy theories, phony health advice, and political mischief.
Fadi Quran is the campaign director at Avaaz. Fadi, welcome to On the Media.
Fadi: Thank you so much, Bob. It's a pleasure to be on the show with you.
Bob: 3.8 billion servings of bad information is terrible on the face of it, but how did that reach compare to the information delivered by authoritative sources like the World Health Organization, the CDC, and so on?
Fadi: What our report finds, which is terrifying, is that content from the top 10 websites spreading health misinformation had four times as many views as equivalent content from the 10 leading health authorities, such as the World Health Organization and the CDC. That would explain why so many people believe, for example, that wearing a mask is not a good idea, that COVID-19 is a hoax, or that vaccines are bad.
Bob: All right, so we have the hoax narrative and the claim that masks are dangerous. What other misinformation was popular on these various pages?
Fadi: The misinformation was very diverse, from claims that colloidal silver could cure people's infections, to claims that chlorine dioxide would cure the coronavirus, to more wide-scale conspiracies, for example, that Dr. Fauci and Bill Gates were working together to put chips into people's bodies using the coming COVID-19 vaccination effort. It's a wide range of health-related misinformation.
Bob: There's bullshit, and then there are bullshit peddlers. Who are they?
Fadi: The bullshit peddlers are, again, a wide range of diverse actors, and that is part of the problem. One of the biggest websites we found is called realfarmacy.com, "farmacy" with an F. Apart from spreading harmful health misinformation, this website also sometimes sells products and supplements that it claims would help cure some of these diseases.
We found other actors as well, for example, the website Global Research, which has previously been found spreading propaganda and content connected to disinformation campaigns of the Russian government, although it's not clear whether there's a direct link.
One of the stories they shared is the fake claim that the American Medical Association was encouraging doctors across the US to overcount COVID-19 deaths. That article alone had over 160 million views. We found that one-fourth of the super-spreaders we highlighted are connected to the far right, or spread content connected to it, while 4% of these actors were sharing content connected to the left or far left.
61% of these actors didn't have any clear political leaning; they were just spreading this health misinformation to millions and millions of users on Facebook.
Bob: You talked about super-spreaders. Are there a thousand of them? Are there three of them?
Fadi: The truth is, we do not know how many super-spreaders there are globally. In our report, we looked specifically at the US, UK, Italy, France, and Germany, and we identified 82 websites and about 42 Facebook pages that we called super-spreaders, but there are probably thousands, if not tens of thousands, of super-spreaders across the world sharing this type of content.
An earlier report we conducted in April, for example, found huge networks of misinformation in Brazil. We found misinformation being shared across India, and the same in the Middle East and North Africa. This is a global problem, but in this report we focused on the 82 websites and 42 pages we called our super-spreaders.
Bob: Have you been able to quantify the toll of the misinformation and the disinformation?
Fadi: We've worked with thousands of doctors, nurses, and health professionals to ask that exact question. A recent study shows that one piece of misinformation, the idea that concentrated alcohol in different forms could help kill the virus, has led to the deaths of about 800 people and the hospitalization of almost 6,000 more.
On Monday at the Democratic Convention, there was a speaker who told the story of her father, who had believed misinformation, had not taken the necessary precautions, and passed away from COVID-19.
Kristin Urquiza: He died alone in the ICU with a nurse holding his hand. My dad was a healthy 65-year-old. His only pre-existing condition was trusting Donald Trump, and for that, he paid with his life.
Fadi: We spoke with doctors on the front line in hospitals in New York, and they told us of patients who had believed misinformation and ingested some of these hoax cures, or who had not come to the hospital until it was very late and said they hadn't believed the virus existed.
Can we quantify the exact number of people who have been harmed by this misinformation? It's very hard, but I think we can definitely say that people have died, people have gotten the virus, and people continue to be harmed by the misinformation that is out there.
One thing I would like to add: this is not just about COVID-19. There are also health misinformation posts about fake cures for cancer, and there are dozens of stories from people we've connected with and spoken to who believed some of the fake cures they were targeted with on Facebook.
Bob: Almost needless to say, those who believe this stuff, sometimes as a political act, sometimes just out of personal negligence, run around without masks and do not practice social distancing, and having been infected by super-spread lies, they become spreaders themselves.
Fadi: The way I would frame it for people listening: oftentimes there's an assumption that only ignorant people fall for these lies, but consider one very well-designed piece of misinformation that spread during the pandemic. It claimed that if you could hold your breath for 10 seconds, it meant that you weren't infected by the virus.
Because this piece of misinformation was so well-designed, a lot of people in my circles, including university professors and nurses, saw it shared by a friend on Facebook and believed it. You can imagine what happens if somebody believes that story, holds their breath for 10 seconds, and decides to go visit a family member while carrying the virus: they help the virus spread even though they have no political incentive and do not want to harm the people they love.
Oftentimes when people think about these spreading lies, they assume it's just bad actors and people who are politically polarized, but in more cases than you can imagine, it's normal people, like the listeners, like you and me, Bob, who fall for this stuff and end up doing harm without even realizing it.
Bob: Meantime, Facebook has declared a stance on misinformation and claims to have instituted various protocols to wipe it from its site in order to keep people "safe and informed about coronavirus," and it has uttered this piety: "misleading health content is particularly bad for our community." Do you find any evidence that the company is doing anything to curb misinformation?
Fadi: Facebook is acting to fight misinformation. Some of the steps it has taken, such as giving free advertisements to the World Health Organization or creating the COVID-19 Information Center, are useful and commendable, but the truth is that it is not taking the key steps, redesigning its algorithm and providing transparency to all users by correcting the record, that could really end this problem. Those steps could decrease the reach of health misinformation by 80% to 90%.
They could also decrease the number of people who believe misinformation by 50%. The best way to put it is this: if these health misinformation super-spreaders are the Pablo Escobars, the ones producing all of this harmful content, then Facebook's algorithm, the core of how Facebook works, is the smuggler. It's what smuggles this bad content onto people's phones so that people see the misinformation.
The steps Facebook is taking are like building one drug treatment center in the neighborhood; they don't actually solve the core problem, which is the algorithm. In that sense, Facebook isn't acting on this issue or taking it as seriously as it claims.
Bob: I would argue that the core problem isn't the algorithm. The core problem is that the algorithm is the goose that lays the golden egg. Facebook could make changes to it, but in so doing it would cut into its own traffic, into the amount of time people spend on Facebook, and into its dollars, so it has decided not to make structural changes because they would impede its growth and revenue. Is that just a paranoid fantasy?
Fadi: No, I would say that you are 100% right. Essentially, because the algorithm is what helps Facebook keep people addicted and make money, Facebook does not move toward fixing it structurally. But I don't think that's the only reason. When we talk about misinformation more broadly, there's another reason: Facebook's executives and leaders are afraid of challenging certain political actors, particularly authoritarian regimes and actors that have used misinformation to come to power.
Instead of putting the health of society first, Facebook's leadership is, number one, putting its financial gains first, and number two, unwilling to challenge those bad actors because it fears the consequences, regulation or other steps that could be taken against it.
Bob: All right, Fadi, I don't think I'm quite angry and desperate enough yet, so I just want to discuss one other dimension of this: the data you have produced represent only half the problem, because your studies concern only public Facebook pages, and there's a whole universe of private pages as well. Can you tell me about them?
Fadi: Yes. Most of us know what public pages are: you can like them, and then you start getting content from these pages in your newsfeed. We found that 43% of the views going to these health misinformation websites were coming from Facebook pages, but then there is a big portion, and it looks to be growing, that comes from people's private profiles and, more dangerously, from the secret groups that Facebook now allows users to form.
These groups can have up to hundreds of thousands of members, but neither our investigation nor others can look inside them on Facebook. What we're beginning to notice, based on anecdotal evidence, is that bad actors are more and more using these secret groups, adding people to them and using them to spread misinformation, and also election-related disinformation.
After the Cambridge Analytica scandal, the Mueller report, and what happened in 2016, Mark Zuckerberg announced that the platform would do more to fight the problem of disinformation, but he also announced that Facebook was moving more toward these closed groups and encrypted messaging tools. Facebook also owns WhatsApp.
What we've been seeing is that as Facebook creates these closed spaces on its platforms, spaces that cannot be observed, investigated, or held accountable, it's creating room for these bad actors to again come out and influence our politics and people's health, and Facebook is not doing anything to mitigate the threat these secret groups have created.
Bob: I want to come back, finally, to a previous answer. We were discussing Facebook's conflict of interest and the algorithm it depends on. What specifically could it do to that algorithm if its principal interest were not its revenue growth but the wellbeing of the public? What specific steps could it take to eliminate or reduce this problem?
Fadi: That's the core question, and we propose two short-term solutions. The first is what we call detoxing the algorithm. What this would mean is, number one: let's say a fact-checker, the World Health Organization, or the CDC issues a clear correction establishing, for example, that wearing masks does help decrease the spread of the virus.
Facebook can then go to every person who was targeted on the platform and say, "Hey, user X, hey Bob, last week you saw this post claiming that masks would suffocate you. In fact, here's a correction from the World Health Organization. Here's a correction from the CDC." Academic studies show that if Facebook were just to train the algorithm to do that for independently fact-checked pieces of misinformation, it would decrease belief in disinformation by 50%.
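To make that proposal concrete, here is a minimal sketch in Python of what such a retroactive correction pass might look like. Everything in it, the Post record, the field names, the notify callback, is hypothetical; it illustrates the idea Fadi describes, not Facebook's actual data model or systems.

```python
# Hypothetical sketch of the "correct the record" proposal.
# None of these names reflect Facebook's real internals.
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    fact_checked_false: bool = False   # set by independent fact-checkers
    correction_url: str = ""           # e.g. a WHO or CDC debunk
    viewer_ids: set = field(default_factory=set)  # users whose feeds served it

def correct_the_record(posts, notify):
    """Retroactively notify every user whose feed served a debunked post."""
    for post in posts:
        if post.fact_checked_false and post.correction_url:
            for user_id in post.viewer_ids:
                notify(user_id,
                       f"A post you saw ({post.post_id}) was rated false by "
                       f"independent fact-checkers. See: {post.correction_url}")
```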
The other detox step concerns these systematic spreaders of misinformation that Facebook knows are abusing its system and sharing a lot of bad content: Facebook could redesign the algorithm to ensure that these bad actors are downgraded. Of course, you would want checks and balances, which is why we call for democratic regulation of the platform.
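A similarly hypothetical sketch of the downgrading half: each independently fact-checked strike against a page reduces how strongly the ranking algorithm amplifies its content, with a floor so that the penalty demotes rather than silently erases, one possible form of the checks and balances Fadi mentions. The function and its parameters are assumptions, not anything Avaaz or Facebook has specified.

```python
def downgraded_score(base_score: float, strikes: int,
                     penalty: float = 0.5, floor: float = 0.05) -> float:
    """Hypothetical ranking tweak: each fact-checked misinformation strike
    halves a page's amplification, down to a fixed floor so the penalty
    demotes content rather than deleting it outright."""
    return max(base_score * penalty ** strikes, floor)

# Example: a page with 3 strikes keeps only 1/8 of its normal reach.
assert downgraded_score(1.0, 3) == 0.125
```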
By taking those steps together, Facebook could begin disincentivizing bad actors from spreading misinformation, and the algorithm would begin to err more on the side of facts than on the side of amplifying bad content. In the long term, and this is where not only we but governments around the world are beginning to make demands, Facebook needs to be more transparent about how its algorithm works, so that researchers from across the world can come in and tweak the algorithm so that it does not continually give preference to sensationalist content, conspiracy theories, and bad actors.
That would require an audit, and then it would require reprogramming the algorithm. Although our hope for the last three or four years has been that public pressure, campaigning, and good people inside Facebook would lead to these steps being taken independently, it's clear now that we will need to push Congress, the European Commission, and other actors to force Facebook to move in that direction.
In the end, we, the normal people on their platforms, are their product, and we do have the ability to change their cost-benefit analysis.
Bob: Fadi, thank you very much.
Fadi: Thank you, Bob. It's been a pleasure speaking to you.
Bob: Fadi Quran is the campaign director at the online activism nonprofit Avaaz. Thanks for listening to this Podcast Extra of On the Media. If you'd like to support us, consider leaving a review on your preferred podcast app. Of course, you can listen to The Big Show this weekend.
Copyright © 2020 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.