Trust Issues Becoming the Norm

For decades, if not longer, intelligence agencies worldwide have worried about disinformation, whether from adversaries or their own efforts to influence others.

It was persistent, and at times pitched, though it was not often on the minds of everyday people.

In 2022, however, that seems to have changed.

“In this age of misinformation — of ‘fake news,’ conspiracy theories, Twitter trolls and deepfakes — gaslighting has emerged as a word for our time,” the Merriam-Webster English language dictionary announced in November, naming it the official word of the year.

Online searches for the word, which means “the act or practice of grossly misleading someone especially for one’s own advantage,” jumped by 1,740% over the course of the year, the dictionary said, noting consistent interest in the word and modern efforts at deception.

In the United States, such concerns consistently dominated the public discourse, starting with President Joe Biden’s appeal to defend democracy on the first anniversary of the January 6, 2021, attack on the U.S. Capitol.

“Are we going to be a nation that lives not by the light of the truth but in the shadow of lies?” he asked.

“We cannot allow ourselves to be that kind of nation,” Biden said. “The way forward is to recognize the truth and to live by it.”

The U.S. president’s comments came just weeks after U.S. Homeland Security officials warned of ongoing — and more volatile — efforts by foreign intelligence services and terrorist organizations to seed the country with disinformation.

And those concerns grew as Russia prepared for its invasion of Ukraine.

Russia – Ukraine

“We’re seeing Russian state media spouting off now about alleged activities in eastern Ukraine,” U.S. Defense Secretary Lloyd Austin told reporters in late January, as 100,000 Russian troops took up positions along Ukraine’s border.

“This is straight out of the Russian playbook,” Austin said of the Russian disinformation efforts. “And they’re not fooling us.”

By mid-February, senior U.S. Homeland Security officials were warning that Moscow had fine-tuned its disinformation operations, “trying to lay the blame for the Ukraine crisis and the potential escalation in that conflict at the feet of the U.S.”

Russian disinformation efforts took another turn in the days following the invasion, according to U.S. defense officials, with the Kremlin publicizing false reports about the widespread surrender of Ukrainian troops to erode Ukrainian morale and resistance.

Russian-government affiliated news outlets also sought to use the war in Ukraine to boost Moscow’s standing in Africa, amplifying accounts in late February and early March of Africans and other people of color being subjected to racism as they sought to evacuate.

Other Russian disinformation campaigns focused on claims the U.S. was running biological weapons labs in Ukraine and on efforts to undermine Western support for Ukraine by targeting countries perceived as weak links.

Russia’s disinformation efforts and influence campaigns, however, did not go unanswered.

Even before the first Russian troops crossed the border into Ukraine in February, U.S. intelligence officials made a decision to fight disinformation with evidence and facts, taking the unprecedented step to declassify assessments to share with allies and even the public.

“The work that we’ve done, and it’s not without risk as an intelligence community to declassify information, has been very effective,” CIA Director William Burns told lawmakers in early March.

“We hopefully can provide some credible voice of what is actually happening,” added U.S. Director of National Intelligence Avril Haines. “That’s both for the domestic population, but that’s also for the international audience.”

Other Western countries and allies of Ukraine followed with pushback of their own.

In early March, the European Union banned broadcasts and websites affiliated with Russian state-funded media outlets.

Ukraine also ran its own counter-disinformation efforts, targeting audiences in Russia and Belarus, hoping to sow doubts and erode support for Moscow’s invasion.

“I’m not realistic about changing their minds,” Heorhii Tykhy, with Ukraine’s Foreign Ministry, said during a virtual forum in March, though he added that overall, Kyiv was “winning this information war and winning it massively.”

U.S. domestic fears

At the same time that U.S. intelligence agencies and their Western allies were focused on Russia’s disinformation efforts surrounding Ukraine, U.S. Homeland Security efforts were focused on disinformation at home.

In June, the Department of Homeland Security (DHS) reissued a National Terrorism Advisory System (NTAS) Bulletin, citing the pervasive disinformation environment, much of it originating in the U.S., as a key concern.

“It’s really the convergence of that myth and disinformation with the current events that creates those conditions that we’re concerned about in terms of mobilization to violence,” a senior DHS official said at the time.

Some of those fears had already manifested a month earlier when 18-year-old Payton Gendron, who consumed online conspiracy theories, shot and killed 10 Black people at a grocery store in Buffalo, New York.

Other acts of violence across the U.S., such as the shooting at a gay nightclub in Colorado in November, an attack against the husband of U.S. House Speaker Nancy Pelosi and a rash of threats against religious institutions, also had links of various sorts to the online disinformation environment.

“One of the things we’ve seen with violent extremist ideologies is that they often commingle or cross over,” a second senior DHS official said this past November. “It just contributes to an environment where individuals … might grab on to those narratives in a way that motivates and animates their violent or potentially violent activity.”

U.S. elections

Some of the most targeted disinformation efforts, though, centered on the U.S. midterm elections in November.

“We are concerned malicious cyber actors could seek to spread or amplify false or exaggerated claims of compromise to election infrastructure,” a senior FBI official said in early October.

Other senior U.S. officials warned that adversaries like Russia, China and Iran would seize upon false narratives, originating in the U.S., questioning the integrity of the electoral process, and seek to amplify them.

Researchers also found evidence that Russia and China, in particular, had resurrected dormant social media accounts as part of intensified disinformation campaigns to spread doubts about the U.S. election.

And as the election neared, top U.S. officials called the threat of foreign influence operations and disinformation sparking violence a “significant concern.”

In the end, fears of potential violence never materialized into actual incidents, though U.S. officials did find themselves pushing back against domestic and largely partisan efforts to take scattered malfunctions and cast them as evidence of a larger conspiracy.

A report by the cybersecurity firm Mandiant concluded that while some efforts by Russia, China and Iran targeted specific contests, they were mostly “limited to moderate in scale.”

A number of experts warn, however, that the threat of election disinformation is here to stay.

Future disinformation threats

“Narratives like the Big Lie have become systemic,” Graham Brookie, senior director of the Digital Forensic Research Lab at the Atlantic Council, said about former President Donald Trump’s disproven claims that the 2020 U.S. presidential election was stolen from him.

“[There is] not a huge amount of audience growth on that narrative, but for the audiences and communities that are engaged in and believe in that narrative, their engagement has gone up and become more hardened,” Brookie told VOA.

Some U.S. lawmakers are likewise warning the threats are not dissipating.

“After each election cycle, social media platforms like Meta often alter or roll back certain misinformation policies, because they are temporary and specific to the election season,” Democratic Representative Adam Schiff, chair of the House Intelligence Committee, and Democratic Senator Sheldon Whitehouse wrote in a letter to the social media giant December 14.

“Doing so in this current environment, in which election disinformation continuously erodes trust in the integrity of the voting process, would be a tragic mistake,” they added. “Meta must commit to strong election misinformation policies year-round, as we are still witnessing falsehoods about voting and the prior elections spreading on your platform.”

Other lawmakers are looking at social media apps from China and Russia, calling for some, such as TikTok, to be banned in the U.S.

“TikTok is digital fentanyl that’s addicting Americans, collecting troves of their data, and censoring their news,” Republican Representative Mike Gallagher said in a statement regarding a bill designed to block such apps.

“This isn’t about creative videos,” Republican Senator Marco Rubio, vice chair of the Senate Intelligence Committee, said in a statement regarding TikTok.

“This is about an app that is collecting data on tens of millions of American children and adults every day. We know it’s used to manipulate feeds and influence elections,” he said. “We know it answers to the People’s Republic of China.”

Source: Voice of America