Bret Schafer is a senior fellow for Media and Digital Disinformation at the Alliance for Securing Democracy, an initiative of the German Marshall Fund (GMF). Bret is the creator and manager of Hamilton 2.0, an online open-source dashboard tracking the outputs of Russian, Chinese, and Iranian state media outlets, diplomats, and government officials. As an expert in computational propaganda, state-backed information operations, and tech regulation, he has spoken at conferences around the globe and advised numerous governments and international organizations. His research has appeared in the New York Times, USA Today, the Wall Street Journal, and the Washington Post, and he has been interviewed on NPR, MSNBC, CNN, Al Jazeera, and the BBC. Prior to joining GMF, he spent more than ten years in the television and film industry, including stints at Cartoon Network and as a freelance writer for Warner Brothers. He also worked in Budapest as a radio host and in Berlin as a semi-professional baseball player in Germany's Bundesliga. He has a BS in communications with a major in radio/television/film from Northwestern University and a master's in public diplomacy from the University of Southern California, where he was the editor-in-chief of Public Diplomacy Magazine.

Media Mentions

If your ultimate objective is to reduce support for Ukraine, your inroad might be talking about how bad things are on the southern border. Their path to win this thing is to get the U.S. and the E.U. to stop sending weapons and aid to Ukraine.
Taking attention off Ukraine is only a good thing for Russia. The more the Western public is focused on Israel and Hamas, the less they're paying attention to the fact that Congress is about to not fund Ukraine's war effort. Shining a light on other places pulls attention away from Ukraine.
This is an exceptionally bizarre arrangement for a journalist to get paid covertly by an ally of his subject matter, but it fits in nicely with Russia’s efforts to influence the West through influencers. He was always close to Putin, and we were wondering, how was he managing this?
The pandemic was so incredibly disruptive to everyone. The intensity of feelings about COVID, I don’t think that’s going to go away. And any time something new comes along, it breathes new life into these grievances and frustrations, real or imagined.
Of the 100 most retweeted tweets about Ukraine posted by GOP candidates for the House since August, roughly 90% opposed continued support for Ukraine. Though much of that messaging plays to simple pocketbook concerns – essentially saying, ‘Why are we supporting Ukraine when Americans are struggling to pay their bills?’ – there is also a strain of anti-Ukrainian disinformation that colors some of their commentary.
If they lose, that just reaffirms beliefs that the whole thing is rigged. And if they win, you have people running elections who have pretty wild thoughts about how elections should be run.
We tend to think of election day as the peak event for disinformation. But for the past two election cycles, the most problematic narratives tend to take hold in the days after the election — especially if the vote counting stretches over a period of days/weeks.
People were looking for things to go wrong to prove their preconceived notions that the election was rigged. And there are always things that go wrong.
Absence of information is a very bad thing around elections, because it's very quickly filled by others, often with a very specific agenda.
We're seeing the injection of conspiratorial narratives into the conversation around security. And an election administrator who has bought into or elevated those who push wild conspiracy theories is fundamentally dangerous.
If you look at candidates who mention Ukraine the most often in the U.S. political ecosystem, it’s almost universally pro-Russian, anti-support for Ukraine.
Most owners of these platforms have had to remain neutral on issues related to politics and geopolitics. [Musk's] freewheeling style of communicating with authoritarians is certainly going to create challenges with how the platform is perceived.
Being an election denier inherently suggests that you are partisan in your leanings and have at least engaged with, on some level, conspiracy theories. And this is not just a problem of narrative. You’re seeing in some cases that decisions are being made, laws are being changed.
In the past when we've seen China weigh in on ostensibly domestic issues like police brutality, it's always had a China flavor to it. But if this new activity is indicative of a larger strategy, I think it would indicate a bit of a sea change.
What I am disheartened by is, if you look at the major influencers on both sides and you’re sitting on the other side of the (political) fence, you would consider them to be voices that are divisive and potentially toxic.
If you speak to people in Latin America, RT is viewed as just another media outlet to be read and trusted. It is hugely influential.
Russia's ability to promote its disinformation has gone unchecked in many parts of the world. Its audience [in Europe] may have dwindled since the war began. But that does not mean it's not finding an audience elsewhere.
They [Russia] lost the information war in the West in the first week and haven’t got it back. Really good communications and disinformation cannot overcome really bad policies.
Russia and China have long shared distrust and animosity toward the West. On Ukraine, it’s a level above that — just the extent to which they have parroted some pretty specific and in some cases pretty far-fetched claims from Russia.
With governments and tech platforms moving to censor or limit the spread of Russian propaganda, pro-Kremlin talking points are now being laundered through influencers and proxies, including Chinese officials and state media outlets that obviously do not face the same restrictions that have been placed on Russian state media outlets. This has allowed the Kremlin to effectively skirt bans meant to limit the spread of Russian propaganda.
As long as Russian state media continues to be either banned, downranked or impacted in some way, they're going to want to fill that messaging gap. The best way to do that, to control the narrative, is through their diplomatic accounts.
We often see a two-way flow of conspiratorial narratives moving from the right-wing American information ecosystem to the Kremlin and back again, in a way that creates a feedback loop that reinforces and bolsters messaging from both groups.
It’s a pretty massive messaging apparatus that Russia controls — whether it’s official embassy accounts, bot or troll accounts, or anti-Western influencers — they have many ways to circumvent platform bans.
The question is how much the far-right figures are going to impact the broader media discussion, or push their party. It serves them, and Russia, to muddy the waters and confuse Americans.
[Russia is] trying to get dirty information into the online ecosystem and hope it is picked up by websites and individuals with larger reach.
In a way, this will be a test for American social media companies if things really escalate.
[Fort Detrick] has often played this sort of central role in conspiracy theories. But this one, of course, has tried to connect the origins of COVID-19 to the lab by essentially saying that the outbreak jumped from Fort Detrick to Wuhan brought over by members of the U.S. military.
Whether or not anyone is buying into lobster or Fort Detrick being the source of Covid, it’s at least having the effect of muddying the truth and confusing people.
There's clearly a huge demand for what Russia is selling here among Germans. And that, I think, across the West is deeply concerning.