Six months ago, intelligence officials informed the House Intelligence Committee that Russia was intervening to support President Trump’s reelection in 2020. They warned that Russia had updated its 2016 playbook with newer, less easily detectable tactics. Rather than impersonating Americans, Russian operatives are skirting social media platform rules against “inauthentic speech” by nudging Americans to repeat misinformation and hiring them to write for fake websites. They’ve shifted their servers from Russia to the United States because US intelligence agencies are barred from domestic surveillance. They’ve infiltrated Iran’s cyberwar department, perhaps to launch attacks made to seem like they originated in Tehran.
The chaos caused by the coronavirus pandemic and the activation of new adversaries like China have elevated the threat of election manipulation even higher. With uncertainty around the viability of in-person voting, questions about voting by mail, and calls to delay the election, there can be no doubt that foreign actors are looking to leverage the confusion and upheaval caused by COVID-19 and civil unrest to disrupt our democratic process. Director of National Intelligence John Ratcliffe’s recent decision to restrict in-person congressional briefings on foreign election interference highlights how politicized, and thus how dangerously vulnerable to foreign manipulation, the upcoming election really is. Just as our adversaries are launching a powerful and escalating attack on our democracy, we’re letting our guard down. We can and must be more vigilant.
On social media, fake news spreads faster, farther, deeper, and more broadly than truth. In 2018, my colleagues Soroush Vosoughi, Deb Roy, and I demonstrated this in the largest-ever longitudinal study of the spread of fake news online, published in the journal Science. We analyzed the diffusion of all the fact-checked true and false rumors that had ever spread on Twitter from its inception in 2006 to 2017. As I described in my book The Hype Machine, during the 2016 election, Russian fake news spread to 126 million people on Facebook and garnered 76 million likes, comments, and other reactions. It reached 20 million people on Instagram and was even more effective there, amassing 187 million likes, comments, and other reactions. Russia also reached millions of Twitter users with misinformation.
Was Russian interference sufficient to change the election result? We can’t rule it out. Research shows that the more we hear something, even if it’s false, and the more it aligns with what we already believe, the more likely we are to accept it as true. Although exposure to fake news was much less than exposure to real news and was concentrated among a select set of voters, it likely reached between 110 million and 130 million people. It didn’t need to affect everyone to tip the election, just under a hundred thousand persuadable voters in key swing states. Which is exactly who Russia targeted. And it didn’t need to change any vote choices; it just needed to affect voter turnout, which social media experiments have shown is not hard to do.
Social media platforms could greatly reduce election manipulation, research suggests, by expanding fact-checking and machine-learning efforts to identify fake news and by demonetizing manipulative content so it doesn’t earn ad revenue. Limiting reshares of false information, demoting false search results, and nudging users to be more reflective would also help. Applying crowdsourced labels of trustworthiness to posted content could be effective, but platforms must tread carefully there: “false news” warnings can imply that unlabeled content must be true, and they can reduce our overall confidence in real news.
With less than two months to Election Day, and even less time to the start of mail-in voting, it’s become obvious during this contentious election cycle that we can’t rely on the platforms, or on lawmakers, who don’t seem motivated to pass targeted legislation like the Foreign Influence Reporting in Elections Act, the SECURE Our Democracy Act, the Honest Ads Act, and the Voting System Cybersecurity Act. So what can we, as ordinary citizens, do to protect our democracy? A few simple steps could go a long way.
First, think before sharing. I often see social media content shared with preambles like this: “I’m not sure if this is true, but it’s interesting if it is.” Stop doing that. That’s exactly what manipulators are hoping you’ll do—they’ve designed the content to be “interesting if it’s true,” even though it isn’t.
Second, Google it. Usually the most egregious misinformation campaigns are easily debunked with a simple Google search. Before you believe or share what you read, spend just a few clicks checking it out. Most of the time, the top search results will tell you if the information is credible.
Third, be aware of the original source. Frequently, the most salacious fake news comes from known fake websites designed to look reputable. Look for unusual URLs or site names that seem legitimate but aren’t. Research shows that labeling information sources doesn’t increase people’s ability to discern fake from real news, but that’s because most fake news is so obviously fake that noting the source doesn’t add much discriminating information. If you’re unsure, though, the source is likely to tell you a lot about how trustworthy something is. Consider whether an odd source is being corroborated by other credible, mainstream news outlets.
Finally, check your emotional pulse. Our research shows fake news is salacious and attempts to elicit strong emotions like surprise, anger, and disgust. If the news you’re reading shocks you or makes you really angry, it could be a sign that it’s fake. Check legitimate sources before trusting it and conduct a smell test by looking for poor writing, words in all caps, grammatical errors, or shocking content or images.
No matter who you support in the upcoming election, when it comes to protecting our democracy, we’re all in this together. And right now, during one of our fragile democracy’s most vulnerable moments, it’s all hands on deck.
Sinan Aral is the director of the MIT Initiative on the Digital Economy. This essay was adapted from “The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health—and How We Must Adapt” by Sinan Aral. Copyright 2020 by Sinan Aral. Published by Currency, an imprint of Random House, a division of Penguin Random House LLC. All rights reserved. Reprinted with permission.