Takeaways From the Twitter Takeover
[music]
Melissa Harris-Perry: I'm Melissa Harris-Perry, and we're glad that you're starting your election week with The Takeaway.
[music]
Melissa Harris-Perry: Now, back this summer, major social media platforms, including Twitter, pledged to ban and remove content that misleads people about how or when to vote, and they promised to promote accurate information about elections. Last week, the advocacy group Free Press released a comprehensive analysis of the election- and disinformation-related policies at Facebook, Instagram, TikTok, Twitter, and YouTube. It concluded that these sites have not lived up to their promises. Now, this failure to halt disinformation comes even as the tech sector has been laying off thousands of workers and instituting hiring freezes.
On Thursday, Lyft cut 13% of its staff, about 700 workers. Stripe, the payment processing company, laid off 1,000. That's about 14% of its staff. According to a report in the Wall Street Journal Sunday, Meta, the parent company of Facebook and Instagram, plans to make "large-scale layoffs" this week after its stock fell more than 70% since the beginning of the year. Then, of course, there's Twitter. On Friday, the company announced it was cutting half of its employees, about 3,700 workers. Among those fired were many employees involved in content moderation.
Some had reportedly already been locked out of access to moderation tools. With just one day left before the polls finally close on this midterm election season, all of that is a big concern. Conspiracies and falsehoods about voter fraud and poll security have been prevalent on social media during this season. It's been less than two weeks since Elon Musk bought Twitter for $44 billion, turning a publicly traded social media platform into a company under the control of just one person. Researchers from Montclair State University found that in the 12 hours immediately following the takeover, there was "an immediate visible and measurable spike in hate speech and racial slurs." Musk has even shown his own propensity for misinformation.
Reporter: Just days after taking over as the CEO of Twitter, Elon Musk tweeted to his millions and millions of followers a conspiracy theory about what happened here during that attack on Paul Pelosi.
Melissa Harris-Perry: With me now is Rashad Robinson, President of Color of Change. Rashad was among a small group of leaders who met with Elon Musk last week in an effort to address rising concerns. Rashad, thanks so much for joining The Takeaway.
Rashad Robinson: Always good to be with you.
Melissa Harris-Perry: Tell me a bit about this meeting.
Rashad Robinson: The meeting came about because the groups that met with him were the groups that led the $7 billion boycott of Facebook back in 2020, the Stop Hate for Profit campaign. Also, many of us have been working for years both at holding these companies accountable and at advancing and pushing policies that change the incentive structures. When we sat down to meet with Musk, it was very much focused on the upcoming election and dealing with some of the announcements and some of the rumors about potential changes.
We had about a 45-minute meeting that focused on three very clear asks: to not re-platform anyone before this upcoming 2022 election, meaning anyone who has been de-platformed for violating Twitter's policies, for inciting violence, et cetera; to not dismantle or change the election integrity work, and to keep that infrastructure in place through the end of the election and certification; and to be transparent, and have a transparent process, about not just the changes around re-platforming but any other policy changes, through this council that he's set up.
He actually agreed to all of those asks on that Zoom call, and as we left the call, we told Mr. Musk that it would be important for him to talk about this publicly, for it not to come from us. At about 1:30 the next morning, Elon Musk tweeted out about the meeting and listed the agreements, the things that we had agreed to in that meeting, tagging each of the leaders who were in that meeting, only to see him, over the course of the next 24 hours, start dismantling the very infrastructure that would make any of those promises a possibility in reality.
Melissa Harris-Perry: What does that say to you, that dismantling?
Rashad Robinson: In many ways, it shows that Mr. Musk just doesn't know what he's doing. It feels like someone who was a big fan of a sports team or maybe a Broadway show, and they decided what they would do if they were in charge. They started to tinker with the show or tinker with the team, deciding who's playing what positions and how they would change around the budget. With each new decision, it's almost like nothing has been game-planned. Nothing has been thought through. These are very complicated companies, and even at our worst moments with some of these companies, as we pushed them to do better, we were operating and engaging with people who were incredibly serious, well-studied, and who understood at least the technology, even if they didn't understand civil rights or human rights.
That's not what we're getting here. It does speak to both the challenges that we will be facing this election cycle and beyond, in terms of disinformation and misinformation. It also speaks to the larger challenges we have with these huge communications platforms. Self-regulated companies are unregulated companies. The fact of the matter is that our cars, for instance, are not safe, Melissa, because of the benevolence of the auto industry. They're safe because we have an infrastructure of accountability.
There are rules. That is part of the problem with Silicon Valley: because of the lack of rules, the technology that should be bringing us into the future can very well drag us into the past, because the people in charge of these platforms will time and time again choose their growth and their profit over everything else, from safety, integrity, and security to civil rights. They will choose their growth and profit, and that puts us all in harm's way.
Melissa Harris-Perry: That's helpful for me, because I suppose I've felt a little distraught at the notion that the misinformation problem, the civil rights problems of Twitter, originated with Elon Musk. They pretty clearly preexisted him, not only on that platform but on many others, as you've pointed out.
Rashad Robinson: Yes. This is not a problem that has happened just because of Mr. Musk. It's animated now because we have someone who is even more of a challenge than previous CEOs, who moves much quicker, who tweets about policy changes that maybe his staff hasn't even heard about yet, or after he has just fired the very people who would implement that policy change. We are dealing with a very tenuous situation, but I don't want anyone to think that it just started now. The fact of the matter is, this is why Color of Change has been so focused on the Black Tech Agenda, our policy platform for Congress, which we released a couple of weeks ago. We have been working across Congress to elevate these issues while also supporting and championing other policies.
I've testified before Congress. We've engaged lawmakers and the White House. We actually have to get to some set of rules that change the incentive structure and hold these leaders accountable, because we shouldn't have to go to billionaires begging them to protect our civil rights. Most of the time, when we get the clearest information out of these platforms about challenges with algorithms, changes in rules that put people in harm's way, misogyny, homophobia, racism that's been allowed to exist, we are getting those things from whistle-blowers.
Companies this size should not be allowed to be so secretive and private about so many of these issues that impact us. Other corporations don't get to act that way and engage that way. This has to begin to change if we're going to deal with information disorder, and if we're going to deal with all the ways in which these challenges on the platforms are impacting our democracy, our economy, and so many of the other ways in which we need to work together and live together.
Melissa Harris-Perry: My understanding is that the coalition, which is called Change the Terms, has laid out 15 specific measures that companies could enact to reduce hate speech and disinformation. I just want to zero in on one here for a moment, and that's about the algorithms, which you just mentioned. I think for those of us who aren't techies, to hear that is like, "What do you mean, fix the algorithms to stem hate?" In non-techie terms, what does it mean to do that?
Rashad Robinson: Freedom of speech is not freedom of reach and freedom of amplification. There have been a number of studies, including one from the University of Cambridge, that showed that the Twitter algorithm, for instance, was amplifying right-wing and conservative voices at a much higher level and frequency. From Frances Haugen, the Facebook whistleblower, we learned that Facebook changed some of its internal structures and algorithms to prioritize content on your page from people you actually don't know and from people who are saying incendiary things, because they came to the conclusion, based on their research, that that keeps you on the platform longer to see more ads.
You may be friends with a bunch of people or have family members on your page, but they are prioritizing you being part of conversations that outrage you, that upset you, because you'll be more likely to share and post and be engaged. They do that and then keep it secret, these algorithms that feed certain types of content to our young people, algorithms that have been exposed for how they impact the body image of young women and young men, and so many other issues. We believe that there's a need for both deep transparency and evaluation. These algorithms are part of their product.
This is part of their business model: what they amplify, what they lead you to, the fact that if you click on a white nationalist, you get fed more white nationalists. It was exposed recently by a colleague organization, Accountable Tech, that up until recently, if you went on Facebook and searched for white nationalist organizations like the KKK, you would be served up a list of Black churches. The very groups that have targeted Black churches, that's what you get served up if you're looking for that information. It's because these companies are moving so fast.
There is no accountability, there are no rules. When they get exposed, oftentimes they first deny that this ever existed. Then journalists and others will expose it, and then they will apologize and say they're working on it, but won't tell us how. Over and over again, we're expected to trust these people to engage, even with a deep understanding that they simply don't have the right people around the table, even if they wanted to. Mark Luckie, a former Facebook employee who left very famously and really exposed some of the challenges around diversity on their staff, said that there were more Black Lives Matter signs on their campus than there were Black people.
Melissa Harris-Perry: You've said that a self-regulated company is an unregulated company, and that you've been testifying before Congress. We are in the final day of voters deciding who will sit in that Congress for the next two to six years. What is your sense of the willingness of Congress to act?
Rashad Robinson: Just to be very clear, there is a willingness by Democratic members to engage on these issues. All of the legislation on Section 230 reform, some of which we really support, some of which we want to push further, deals with the section of the Communications Decency Act that allows these platforms to have levels of immunity for things that are placed on their platform, even ads, and even things that relate to their business model outside of freedom of speech. The only folks sponsoring this legislation are Democrats.
Melissa Harris-Perry: Rashad Robinson is President of Color of Change. Thanks so much for your time today, Rashad.
Rashad Robinson: Thanks for having me.
[music]
Melissa Harris-Perry: We should note, we reached out to all the tech companies mentioned. Meta responded with details from this summer on the kinds of misinformation they will remove. Twitter sent us a tweet that said in part, "While we said goodbye to incredibly talented friends and colleagues yesterday, our core moderation capabilities remain in place." We have a statement from a TikTok spokesperson that reads in part, "We continue to invest in our policy, safety, and security teams to counter election misinformation." Be sure to check out our website at thetakeaway.org where we will post more from their responses.
[music]
Copyright © 2022 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.