
Why misinformation is America's greatest election security threat

FILE - In this Oct. 6, 2020, file photo, a public service announcement from the FBI and the Department of Homeland Security cybersecurity agency. As President Donald Trump sows doubts about the election, an obscure government agency he created is working behind the scenes to inspire confidence in the vote amid unprecedented challenges. The Cybersecurity and Infrastructure Security Agency, which Trump signed into existence in 2018, is working with other parts of the government to safeguard an election in the middle of a pandemic. (AP Photo/Jon Elswick, File)

What’s the biggest threat to American elections, and to people’s trust in them?

Conspiracy theories.

That’s according to none other than Christopher Krebs, the man once in charge of the federal Cybersecurity and Infrastructure Security Agency.

Today, On Point: Conspiracy theories and the threat their spread poses to democracy with Chris Krebs.

Guests

Chris Krebs, partner at the Krebs Stamos Group. He served as director of the Cybersecurity and Infrastructure Security Agency in the DHS from November 2018 to November 2020. (@C_C_Krebs)

Jack Beatty, On Point news analyst. (@JackBeattyNPR)

Also Featured

Rashad Robinson, president of Color of Change. Co-chair of the Commission on Information Disorder.

Interview Highlights

On how conspiracy theories threaten elections

Chris Krebs: “Earlier this week, I believe it was over the weekend or maybe late last week, the former president issued a statement. … He challenged anyone, any takers to a debate on the credibility of the 2020 election. Despite having hundreds of takers, he then claimed that no one took him up on his offer, and so he rescinded his offer for a debate. But look, the point here more than anything is there are no consequences, effectively, for the former president to spew these lies.

“In fact, the incentive structures work the other way. He can fundraise. He can have these rallies and continue to amplify the lies, and activate his base. He’s got hundreds of millions of dollars in a political action committee war chest. So it actually works to his incentive, or to his interest, to continue driving these lies. So to your broader question, what is the role of technology?

“And I think there is a significant obligation across the social media platforms to do a better job of being transparent about how certain actors, the so-to-speak super-spreaders of disinformation, operate; to limit their reach, or at least provide insight to researchers and journalists on how their platforms are being abused at the expense of democracy, now.”

How concerned are you that these conspiracy theories get massively amplified into the 2022 midterms next year? Could this spill over into actual physical disorder?

Chris Krebs: “It’s already happened, right? January 6th was a physical manifestation of disinformation around the 2020 election. And in fact, we essentially at CISA, we predicted this. In 2019, we released an awareness campaign called the War on Pineapple, and the idea here was to educate the American people on how disinformation operations work. And it’s effectively a five-step process.

“But first, you start with identifying the issue you want to drive out. And here, it was election disinformation. The second was you start putting accounts into place and getting your amplifiers. And that happened through some of the president’s acolytes, aides and even his own family. And the third is you start hitting that on the platforms, Facebook, Twitter or elsewhere. Fourth, is you take it mainstream, and we saw that on Fox News. We saw that on Newsmax, OAN, Bannon’s War Room, all these fringe networks. And then fifth is you take it to the real world, and that’s what January 6th was. It was the activation of the amplification that had been taking place for almost, what, nine months at that point. So I fully expect to see that happen again.

“And I think we’re starting to see some of the seeds of that happen today with … the constant goalpost moving of — you mentioned the Arizona ‘fraudit’ or the fake audit that took place out there. You’re starting to see some of the pushers, the amplifiers start to talk about citizen canvasses, where they want volunteers to start going out and knocking on doors throughout the various states. And start talking to people about, Did you vote? Who would you vote for? And that’s where you’re going to start seeing that friction again in the real world, that’s ultimately going to lead to political violence, as I see it, in ‘22 at a minimum, but certainly ‘24.”

On how leaders can overcome the influence of misinformation

Chris Krebs: “There is no silver bullet, of course, or there’s no magic wand we can wave, but it’s through engagement. It’s through constant work. Again, this is not easy. It’s going to take others in industry, in the media community to call out Fox News and Tucker Carlson, right? We have to continue pushing that. We have to have business leaders, who actually have a higher degree of trust, based on public polling, than political leaders.

“We need business leaders to step up and say, You know what? We’re not going to advertise on programs like Lara Logan’s, which compared Fauci to Dr. Josef Mengele. I mean, it beggars belief that we’re still in this space, but that’s where it is. So we need more leadership in the business community. We need more leadership in the government space to actually say, Look, we understand this is a problem. It’s not just an intelligence community thing, it’s a broader societal issue.

“So what is our plan? How are we going to do it? All within the constraints — or rather the construct — of the First Amendment and protected speech? So let’s reduce harms, improve transparency in the social media platforms again and back to that point of increasing trust in institutions.”

On the importance of increasing ad transparency 

Chris Krebs: “Stepping back a little bit, the way I see where we are with the social media platforms right now, I’ve likened it to kind of a post-Enron moment, where we had a failure of the financial auditing and oversight system due to misreporting and lack of transparency. And how did they fix that? Sarbanes-Oxley required a set of auditing and transparency requirements for publicly traded companies. That’s kind of where I see us right now in the social media ecosystem, where a lack of transparency has led to this collapse in trust in institutions and to these public harms.

“And so what is the equivalent of a Sarbanes-Oxley for social media platforms? One that doesn’t necessarily say you have to allow this content but not that content. What it instead does [is] say, Look, you’ve got to let us understand how your processes work, what your content moderation policies are, how you’re enforcing them, and then opening that up to researchers and journalists in an authorized, protected way.

“Ad transparency is just one of those. Because we are seeing — and even just yesterday, Donie O’Sullivan with CNN talked about how Facebook accepted ads for … Red Voices, I believe is the company, that was promoting T-shirts and other paraphernalia that promoted either anti-COVID, or anti-election or whatever. It’s different when it’s not political speech from political action committees; when it’s from interest groups selling T-shirts, it doesn’t get the same kind of scrutiny.

“So what we’re trying to open up is more information about how programs and platforms are moderating their content themselves, which then can lead to more informed policy choices. Because right now we cannot make good policy decisions that would lead to any sort of regulation or legislation with meaningful change, because we don’t have enough information. And imperfect information leads to imperfect decisions. So in part, just with the transparency piece on ads, we’re trying to get a better sense of how these algorithms work behind promoting content.”

How are we going to get a system of accountability here?

Chris Krebs: “We have accountability every two or four years at the ballot box, or six years, depending on who your senator is. We are seeing some in the Republican Party, the Cheneys and the Kinzingers of the world … they are few and far between. But there are others. And so I think as we get further from the 2020 election, as we get through ‘22, you may see more of those. But that’s not good enough, right? That is absolutely not good enough.

“So again, we need the business community to step up. We need the civil society community to step up, call these people out. And we need to continue to put pressure on the platforms that give the liars, the big liars, the opportunity to spew their lies. But lastly, we do have the judiciary, we have the court system. It does take too long, but we do see lawsuits like Dominion Voting Systems’ against a number of these platforms, and they are going to take them on. They’re not going to settle. They are going to go for their $1.3 billion. Whether they win, I don’t know. But that is one of the mechanisms that is available.”

On how a federal approach could protect election integrity

Chris Krebs: “The comprehensive federal approach basically says, Hey, government, get your act together. There is no consistent understanding of the information disorder space, whether it’s a foreign threat or a domestic threat. So we need the federal government to come together just like with any other national security threat.

“Where they have a plan, and they’ve tasked organizations and agencies to lay out their work streams and how they’re going to approach the problem set — we don’t have that. We need that. And ultimately, where I think we’re going is we need to rethink the way we do government. Just like in 1939, when FDR reorganized the federal government, I think we need to rethink how government is organized to handle technology and digital risk issues, which could result in a digital agency.”

From The Reading List

Aspen Institute: “Commission on Information Disorder Final Report” — “America is in a crisis of trust and truth. Bad information has become as prevalent, persuasive, and persistent as good information, creating a chain reaction of harm.”

This article was originally published on WBUR.org.

Copyright 2021 NPR. To see more, visit https://www.npr.org.