The United States Should Not Act Like It's the Only Country Facing Foreign Interference
“Right now, Russia’s security services and their proxies have geared up to repeat their interference in the 2020 election. We are running out of time to stop them.” This stark warning from former National Security Council official Fiona Hill—during impeachment hearings in the U.S. Congress—serves as a sharp reminder of the threat to democracy posed by foreign interference and disinformation. The United States has made much progress since the 2016 elections, when the scope and sophistication of disinformation flowing through social media platforms caught the country flat-footed—but this work is far from over.
Russia’s systematic attack on U.S. democracy in 2016 was unprecedented, but its playbook is not unique. The Kremlin has used information warfare to attack countries in Europe before and after Donald Trump and Hillary Clinton ran for the White House. It seized on Brexit in the United Kingdom, the Yellow Vests protests in France, the rise of far-right politics in Germany, and the Catalan independence conflict in Spain to amplify discord and divide societies in its favor.
Russia’s ongoing interference in U.S. affairs is just a small piece on a big chessboard. A key foreign policy goal of the Kremlin is to discredit, undermine, and embarrass what it sees as a liberal international order intent on keeping Russia down and out. By perpetuating disinformation online and bending reality in a way that confuses and divides, it shapes a narrative that can affect what happens in real life.
The U.S. government is still fumbling over how to put together an effective arsenal to guard against disinformation and other foreign influences. It should look abroad for lessons and partnerships. Many European countries have recognized the problem and are adopting creative approaches to solving it at the national and EU levels.
While many countries could serve as examples, the Baltic states, Sweden, and the EU as a whole offer a diverse slate of approaches shaped by different current and historical experiences with Russian interference.
The Baltic Model: Citizen Involvement
Estonia, Latvia, and Lithuania, which sit on NATO’s eastern flank, are acutely aware of the toll information warfare takes on daily life. Throughout the Cold War and to this day, they have been targets of Russian disinformation tactics—including the use of warped historical narratives and “hybrid trolls” that systematically push a specific message. These tiny Baltic countries have lessons for the United States.
Importantly, their citizens have taken active roles in finding and debunking disinformation. Estonia has created a Cyber Defense Unit, an army of volunteer specialists in information technology that serves as an extended response capability. The unit’s role is to “protect Estonia’s high-tech way of life, including protection of information infrastructure and supporting broader objectives of national defence.” After Ukraine’s EuroMaidan in 2014—when the Kremlin launched a massive influence operation that paved the way for the annexation of Crimea—a decentralized group of volunteers known as the Baltic Elves began to challenge disinformation and propaganda on social media as well as in the comments sections of online news articles, and to expose the trolls spreading it.
Sweden’s Comprehensive Take
While not as near Russia as the Baltic states, Sweden is located in a geopolitically important part of Europe—and, even more importantly, is not a NATO member. It has been a target of Russian information operations bent on undermining its democracy and spoiling any prospects of future NATO membership.
Sweden is taking a multi-faceted approach to the Russian disinformation problem—including redefining the way it classifies the threat and mobilizing the media. The 2017 national security strategy emphasized the necessity of identifying and neutralizing propaganda campaigns. In addition to the Swedish Civil Contingencies Agency (MSB), which is tasked with protecting citizens from disinformation and influence operations, last year the government announced plans to establish a new authority for the country’s “psychological defense.” According to the government, this authority will “aid the population's defense capacities during peace time and its capacity to resist during war, ensure that factual public information can be quickly and effectively communicated even under disruptive conditions, as well as identify, analyze and confront influencing operations.”
Sweden is also treating media as a critical stakeholder. Ahead of its 2018 elections, the Swedish Innovation Authority collaborated with local media companies to create a digital platform designed to prevent the spread of online misinformation and disinformation through automated fact checking and counteracting “filter bubbles.” The Minister of Digitalization also worked with Facebook to establish a “hotline” through which the MSB and any political parties could directly report instances of “problematic content” during the campaign.
This is a ripe area for sharing information and best practices. Given the global, borderless nature of the social media giants, better information sharing among the United States, the EU, and the platforms could help foster a much-needed open dialogue between the public and private sectors, and more efficiently eliminate disinformation.
Brussels’ Institutional Approach
The European Union—a body often bogged down by bureaucracy and internal politics—has shown leadership. Last year, it launched its Action Plan Against Disinformation, which increased funding for finding and exposing disinformation in real time with a “rapid alert system.” The EU also pledged to promote media literacy and launched a voluntary “Code of Practice on Disinformation” to which Facebook, Twitter, and Google agreed.
The European External Action Service’s EastStratCom Task Force, established in 2015, is charged with responding to Russian disinformation campaigns. The U.S. federal government and Congress can take a cue from this effort on how to expose and debunk fake news stories on easy-to-navigate, publicly accessible platforms. Media and a cottage industry of independent think tanks, academic institutions, foundations, and private companies already play a critical role by doing this on a daily basis. But the complex disinformation landscape warrants a government response, as the EU experience demonstrates.
At a moment when transatlantic relations are facing harsh realities, teaming up on information sharing and best practices could be an opportunity for common ground as the United States looks to 2020 and beyond. For one, as calls for better media literacy stateside persist, the country should look to the ingenuity in the Baltics where ordinary citizens have taken active roles in finding and debunking disinformation. And to the EU, where Russian tactics and disinformation are called out, labeled, and debunked on publicly accessible online platforms.
The United States was slow to realize the damage done by disinformation and has yet to demonstrate at the highest level that it appreciates the expansive ways this can degrade trust in democracy. Understanding the problem as a new class of security issue could be a bold step in the right direction and should catalyze fresh thinking across government and different sectors.
The United States and other democracies are not necessarily built to counter the state-run information machines of their adversaries tit for tat. But they all share an interest in combatting foreign interference and preserving democratic institutions. As the 2020 elections approach, the United States can take lessons from its partners across the Atlantic. Their adversaries are trying to undermine them. Democracies working together can limit their chances of success.
The views expressed in GMF publications and commentary are the views of the author alone.