Hamilton 68: A New Tool to Track Russian Disinformation on Twitter

August 02, 2017
by Laura Rosenberger and J.M. Berger
*New Brief: The Methodology of the Hamilton 68 Dashboard*


Since Russia’s interference in the 2016 U.S. election, many have warned that Putin will be back in 2018 and 2020.  But the reality is that Russian influence operations never left. As former Director of National Intelligence James Clapper recently stated, the Kremlin is already beginning to “prep the battlefield” for the 2018 elections. But what does this mean?

Russia’s activities continue on multiple fronts. One happening right under our noses and in plain sight is its continued information operations aimed at spreading propaganda and disinformation online. Indeed, Russia’s information operations in 2016 did not happen overnight; they were enabled by a foundation built over several years of activity in U.S. information space. Since the election, Russia’s efforts to shape what Americans think have continued. Americans deserve to know what messages Russian disinformation networks are pushing.

"In the Federalist Papers No. 68, Alexander Hamilton wrote of protecting America’s electoral process from foreign meddling. Today, we face foreign interference of a type Hamilton could scarcely have imagined."

The Hamilton 68 dashboard, launching today as part of the Alliance for Securing Democracy, provides a near real-time look at Russian propaganda and disinformation efforts online. The top of the page shows tweets from official Russian propaganda outlets in English, and a short post discussing the themes of the day. This is Russia’s overt messaging.

But these disinformation networks also include bots and trolls that synchronize to promote Russian messaging themes, including attack campaigns and the spreading of disinformation. Some of these accounts are directly controlled by Russia; others are users who, on their own initiative, reliably repeat and amplify Russian themes. Our analysis is based on 600 Twitter accounts linked to Russian influence activities online, and the lower section of the dashboard features charts that display topics, hashtags, and links currently promoted by this network.

The content this network tweets reflects Russian messaging priorities, but that does not mean every name or link you see on the dashboard is pro-Russian. The network sometimes amplifies stories that Russia likes, or people with like-minded views but no formal connection to Russia. Importantly, the network also tweets about stories and people that Russia seeks to discredit or attack.

We identified these accounts through three different methods. First, we tracked disinformation campaigns that synchronized with overt Russian propaganda outlets like Sputnik and RT (Russia Today). We analyzed the social networks of users who were promoting this disinformation to identify which users were centrally involved, and to remove users who tweeted disinformation casually, after encountering it online.
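
As a rough illustration of what this kind of network analysis can involve, the sketch below is our own simplification, not the dashboard's actual code: it builds a small retweet graph with the networkx library and keeps only accounts whose centrality and repeat sharing pass illustrative thresholds. The account names, sample data, and cutoff values are all invented for the example.

```python
import networkx as nx

# Illustrative input: (retweeter, original_author) pairs observed while a
# disinformation story spreads. Real data would come from the Twitter API.
retweets = [
    ("core_account_1", "sputnik_story"),
    ("core_account_2", "sputnik_story"),
    ("casual_user", "core_account_1"),
    ("core_account_1", "core_account_2"),
]

# Count how often each account amplified the campaign.
share_counts = {}
for retweeter, _ in retweets:
    share_counts[retweeter] = share_counts.get(retweeter, 0) + 1

# Build a directed graph: an edge points from the amplifier to the source.
graph = nx.DiGraph()
graph.add_edges_from(retweets)

# Degree centrality is one simple proxy for how embedded an account is in the
# amplification network; the 0.5 centrality and two-share cutoffs are
# arbitrary illustrative thresholds, not the project's actual criteria.
centrality = nx.degree_centrality(graph)
core_accounts = [
    account for account, score in centrality.items()
    if score >= 0.5 and share_counts.get(account, 0) >= 2
]
print(core_accounts)  # ['core_account_1'] for this toy data
```

Accounts that shared the story only once and sit at the edge of the graph, like the "casual_user" above, drop out of the candidate list.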

Second, we identified a group of users online who openly professed to be pro-Russian and tweeted primarily in support of Russian government policies and themes. We analyzed followers of these accounts to identify a large and interconnected social network that tweeted the same themes and content.
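
The sketch below gives a simplified sense of this second step, again as an assumption-laden illustration rather than the project's method: it counts how many of a handful of hypothetical seed accounts each user follows and flags users who follow several of them as candidates for the interconnected network. The seed names, follower lists, and threshold are invented.

```python
from collections import Counter

# Hypothetical seed accounts that openly tweet pro-Kremlin themes.
seed_accounts = ["seed_account_a", "seed_account_b", "seed_account_c"]

# Illustrative follower lists, as might be pulled from the Twitter API.
followers = {
    "seed_account_a": {"user_1", "user_2", "user_3"},
    "seed_account_b": {"user_2", "user_3", "user_4"},
    "seed_account_c": {"user_3", "user_5"},
}

# Count how many seeds each follower follows; accounts following several
# seeds are candidates for the interconnected network described above.
seed_follow_counts = Counter()
for seed in seed_accounts:
    seed_follow_counts.update(followers.get(seed, set()))

MIN_SHARED_SEEDS = 2  # arbitrary cutoff for this illustration
candidates = [u for u, n in seed_follow_counts.items() if n >= MIN_SHARED_SEEDS]
print(sorted(candidates))  # ['user_2', 'user_3']
```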

Third, we identified accounts that appear to use automation to boost the signal of other accounts linked to Russian influence operations. We assessed this group by looking for accounts that had disproportionately large numbers of interactions with other accounts (including sending and receiving retweets) along with a very high number of tweets per day. These accounts may be bots, meaning accounts run by computer code that tweets automatically based on pre-determined rules, or they may be cyborgs, meaning accounts whose automated activity is supplemented manually by a human user.
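
To make the shape of this test concrete, here is a minimal sketch under assumed thresholds. The project does not publish its exact cutoffs, so the numbers below are placeholders chosen only to illustrate the combination of high daily volume and heavy interaction.

```python
from dataclasses import dataclass

@dataclass
class AccountStats:
    handle: str
    tweets_per_day: float
    retweets_sent: int
    retweets_received: int

# Illustrative thresholds only; the dashboard's real criteria are not public.
MAX_HUMAN_TWEETS_PER_DAY = 72   # sustained output above this is suspicious
MIN_INTERACTIONS = 500          # retweets sent plus received

def looks_automated(stats: AccountStats) -> bool:
    """Flag accounts whose volume and interaction pattern suggest a bot or cyborg."""
    interactions = stats.retweets_sent + stats.retweets_received
    return (stats.tweets_per_day > MAX_HUMAN_TWEETS_PER_DAY
            and interactions > MIN_INTERACTIONS)

sample = AccountStats("example_amplifier", tweets_per_day=140,
                      retweets_sent=900, retweets_received=450)
print(looks_automated(sample))  # True for this invented example
```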

Samples from each of these three datasets were combined to populate the dashboard monitoring list, with an eye toward providing a representative snapshot of Russia’s English-language influence operations on Twitter. Within the network are numerous subgroups specializing in certain kinds of content, such as Ukraine, Syria, or far-right views. The content on the dashboard varies widely from day to day depending on what is going on in the news.
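
A toy sketch of that final assembly step might look like the following. Only the overall total of 600 accounts comes from our description above; the per-method split, the account names, and the bucket sizes are assumptions made for illustration.

```python
import random

# Hypothetical account lists produced by the three identification methods above.
campaign_accounts = [f"campaign_{i}" for i in range(400)]
openly_pro_russian = [f"overt_{i}" for i in range(300)]
automated_amplifiers = [f"amplifier_{i}" for i in range(250)]

random.seed(68)  # reproducible illustration

# Draw a sample from each bucket so every method is represented; this split
# is assumed for the sketch, not the dashboard's actual allocation.
monitoring_list = (
    random.sample(campaign_accounts, 250)
    + random.sample(openly_pro_russian, 200)
    + random.sample(automated_amplifiers, 150)
)
print(len(monitoring_list))  # 600
```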

Our objective in providing this dashboard is to help ordinary people, journalists, and other analysts identify Russian messaging themes and detect active disinformation or attack campaigns as soon as they begin. Exposing these messages will make information consumers more resilient, reduce the effectiveness of Russia’s attempts to influence Americans’ thinking, and thereby deter this activity in the future.

We are not telling you what to think, but we believe you should know when someone is trying to manipulate you. What you do with that information is up to you.

 

EXPLORE HAMILTON 68