May 29, 2024

Understanding Russian Disinformation and How the Joint Force Can Address It

Michael J. Kelley
©Michael J. Kelley 2024

ABSTRACT: Russia will dominate information warfare if the United States does not treat disinformation as central to Russian strategy. This article examines the vital role disinformation has played in post–Cold War Russian strategy, including in the current Russia-Ukraine War, and, in a departure from previous scholarship, it observes that US defense leaders are aware of Russian disinformation but have failed to assess its impact or sufficiently negate Russian influence. The article also reviews current US efforts and suggests proactive ways to counter Russia’s disinformation strategy.

Keywords: Russia, Ukraine, information operations, disinformation, information literacy

 

Following the 2022 Ukraine invasion, BBC reporters launched a yearlong investigation into Yala News after it posted a story alleging an American plot to conduct biological warfare against Russia by releasing infected birds that would fly into Russian territory. The investigation was prompted by a remarkably similar story broadcast by Russian state media two hours earlier. The investigation revealed that Yala’s most popular stories mirrored sources owned by or affiliated with the Russian government, including false accounts of Ukrainian President Volodymyr Zelensky giving a drunken speech and Ukrainian soldiers fleeing the front lines, both of which appeared on Yala shortly after originating from official Russian sources. Belén Carrasco Rodríguez, one of the BBC’s contacts, coined the term “information laundering” to describe the phenomenon of false narratives gaining credibility through repetition by multiple sources. In the United States, most observers likely dismiss stories about bio-weaponized birds. Can the same be said about information consumers outside the West?1

Alongside traditional elements of national power such as military and economic operations, Russia works to control perceptions among targeted audiences to shape the environment to its benefit. While every country manages narratives to some extent, few emphasize information operations to the degree Russia does. The United States could have taken advantage of this knowledge when Russian interference in the 2016 US presidential election surfaced. Instead, partisan squabbling over which side Russia preferred to win muted the response. Subsequent fighting over “fake news” in media, political parties, and across American kitchen tables has provided Russian disinformation practitioners with cover as they ply their craft.

Lagging behind its Russian adversary, the United States has been slow to realize that the information element can be crucial to outcomes, with the US Army only recently releasing a doctrinal publication on the topic. Perhaps this document represents a dawning realization in the defense establishment that lethality alone is unlikely to be decisive. The new publication begins by refreshing the lexicon for information operations, including disinformation, a key aspect of Russia’s information operations campaigns.

Disinformation is incomplete, incorrect, or out of context information deliberately used to influence audiences. Disinformation creates narratives that can spread quickly and instill an array of emotions and behaviors among groups, ranging from disinterest to violence. Relevant actors employ disinformation to shape public opinion, attract partners, weaken alliances, sow discord among populations, and deceive forces. Disinformation has a malicious intent.2

While news stories like the one about the infected birds may seem ridiculous, especially to Western audiences, repeating such narratives to vulnerable audiences erodes global trust in the United States. For the United States to succeed in the information domain, it must recognize that disinformation is central to Russian strategy and address it proactively. This article begins by reviewing disinformation’s central role in Russian strategy from the Cold War to the Russia-Ukraine War. It then examines the steps the United States has taken to counter Russian disinformation and recommends ways the US defense community can meet the challenge Russia poses in the information domain.

The Russian Approach to Disinformation

Not to be confused with Western debates surrounding the term, Russian doctrine embraces hybrid warfare, which it defines as “a strategic-level effort to shape the governance and geostrategic orientation of a target state in which all actions, up to and including the use of conventional military forces in regional conflicts, are subordinate to an information campaign.” A 2016 RAND Corporation report described the contemporary Russian approach to information operations as a “firehose of falsehood” due to the number and diversity of communication channels and Russia’s willingness to broadcast partial truths and complete fiction. Contrary to typical communication strategies, Russian information campaigns demonstrate no commitment to consistency among narratives. Instead, they focus on the volume and repetition of many themes, seeking acceptance from familiarity with the message.3

Disinformation has been a pillar of Russian strategy since the Cold War, with significant time and treasure spent on disinformation relative to other forms of espionage. Journalist Mark Hollingsworth devotes a chapter to the KGB’s use of disinformation in his book Agents of Influence: How the KGB Subverted Western Democracies. He cites a former KGB disinformation chief who claimed the KGB spent around 25 percent of its budget on traditional espionage, spending the remainder on “a slow process focusing on what we called ideological subversion or active measures.”4

During the Cold War, the Soviet Union effectively employed disinformation to undermine American legitimacy around the world. In one campaign, the Soviets convinced many that the AIDS epidemic was born in a US lab and released by the FBI to target minority communities. Russia uses disinformation to damage trust in the US government abroad and to exploit the American fascination with conspiracy theories, sowing doubt regarding the September 11 attacks on the World Trade Center and Pentagon, the COVID-19 pandemic, election security, and other events. Russia has also repeatedly used disinformation campaigns to legitimize military operations. Its actions in the Republic of Georgia and Moldova’s Transdniestria region exemplify a common theme: protecting ethnic Russians from various questionable threats.5

While Russia broadcasts facts that suit its purposes, it also uses disinformation in its reflexive control strategies. Reflexive control theory was born in the Soviet era; its aim is to convince another actor to make decisions against its own interests. Russia accomplishes this by manipulating the information environment so that its adversary perceives Russia’s desired outcomes as the best choices available. Russia often employs this method to mask or obscure the true risks and rewards of an adversary’s courses of action. Since reflexive control is designed to influence decisions, disinformation can temporarily or durably alter the perceptions that drive a given decision. Because facts do not constrain its information campaigns, Russia can manufacture anything necessary to yield the desired decision outcomes.6

Russia uses the full spectrum of disinformation tactics to shape the environment to its benefit. Through direct seizure of control or forced shutdowns of outlets that oppose the government’s actions, the Russian government exerts near-total control of media within its borders. Russian outlets such as Sputnik and Russia Today own or influence proxy outlets around the world not explicitly branded as Russian. These outlets range from newspapers to blogs to YouTube news channels. Each repeats pro-Russia talking points, with each medium using its unique style to appeal to specific, targeted audiences. Russian information warriors create false social media personas and think tanks to sow discord among adversaries, as they did before the 2016 US presidential election. These fabricated sources often strike a chord with nonmainstream Western media outlets, which echo the messages to further their agendas and unintentionally promote Russian causes.7

Western audiences are unfamiliar with the media environment inside Russia. The control and coordination across channels are alien to those accustomed to a free press. Iuliia Iashchenko of Sapienza University of Rome has written about Russian media:

The [Russian] state-run national TV has a few very important primetime political shows that serve one purpose: to spread disinformation among Russian citizens to support the main ideological track of state propaganda. The most popular channel is Russia 1, with shows like “Duel” (Vladimir Solovyov), “60 Minutes” (Olga Skabeyeva), “Evening with Vladimir Solovyov” (Vladimir Solovyov), the same ideological content is also prominent on Channel 1, with shows like “Time Will Show” (Artem Sheynin), and many others. Usually, the level of subtlety of the disinformation is much higher for international audiences. Meanwhile, in Russia, in these shows, the creators present pure ideological propaganda without any proof, and with a heavy usage of simple or vulgar speech. These shows never end up in the global feed mostly because they are too open and would ruin the political goals of the Russian government.8

Controlling all information spaces within its borders and holding significant influence over other outlets allow Russia to exert outsized influence over its neighbors. Many countries once part of the Russian Empire or the Soviet Union have seen the benefits of turning westward, something Putin’s regime strongly opposes. Russia has used various methods centered on information warfare to halt this trend.

Russia Uses Disinformation for Its “Special Military Operation” in Ukraine

Russia’s first salvo in the Ukraine invasion was a disinformation operation. As Russia built up combat power along the border, Russian spokesmen announced it was a large-scale training exercise and insisted that troops and ships would soon return to normal activities. Russian officials kept up this line, objecting more stridently as Western officials pointed out inconsistencies between their statements and activities on the ground. As the world knows, Russia’s statements proved false; nonetheless, its messaging prevented Ukraine and the West from responding to the buildup until Russia launched the assault. This situation is a prime example of reflexive control laden with disinformation. Russia understood its adversaries’ decision processes and self-image and realized it could lie long enough to freeze their willingness to move beyond words, allowing it to seize the operational initiative. After all, it would seem unreasonable for the US European Command or NATO to respond aggressively to a training exercise. Fortunately, that disinformation initiative was insufficient to consolidate the rapid military victory Russia expected. Western intelligence sharing allowed Ukraine to steel itself at the last minute while preparing the world to see through Russia’s narratives regarding the invasion.9

Consistent with the “firehose of falsehood” approach, Russia initially explained its actions through several competing narratives, including reuniting ethnic Russians with the motherland, maneuvering to counter NATO expansion, and fighting to excise the Ukrainian government’s “Nazi” influences. Putin even tried to deny Ukraine’s very existence, claiming, “Ukraine never had a tradition of genuine statehood.” None of these narratives survives scrutiny; all represent Russian disinformation intended to legitimize blatant efforts to expand territory and delegitimize opponents. The last significant NATO expansion occurred in 2004, hardly a credible reason for Russia to panic nearly two decades later.10

In 2004, Bulgaria, Romania, Slovakia, Slovenia, and the former Soviet republics Estonia, Latvia, and Lithuania joined NATO. Since then, only six new countries have joined, including Finland and Sweden, which joined only after, and because, Russia invaded Ukraine. At the other end of the reasonableness spectrum, Russia has signed several agreements affirming Ukrainian sovereignty, which undercut Putin’s denial of Ukraine’s right to exist and his fanciful claim that Ukraine was never a legitimate state. Still, NATO has repeatedly had to debunk claims that the Alliance is trying to threaten and encircle Russia, lest other countries take these assertions at face value.11

Russia has used disinformation to attack the global coalition aligned in support of Ukraine by undermining that support within those countries and sowing disunity among them. One independent analysis of Russian messaging determined that French, German, Polish, and Turkish audiences were heavily targeted through at least five specific themes, which aligned with those shown in an alleged Russian document published by Ukrainian intelligence services. The themes addressing the costs of Ukrainian aid and dealing with refugees could be dismissed as competing narratives. Still, one theme listed in the document is clearly disinformation:

Update information about neo-Nazis in Europe, make a comparison with Ukraine in order to show the European community how Nazism is born and ask why they ban Nazis in their countries, but support them in Ukraine? For these purposes, it is advisable to use BBC documentaries about neo-Nazis and Nazis.

This analysis lists dozens of information operations tied to official and proxy Russian sources that do everything from claiming imminent NATO aggression against Russia to discrediting Western media. Seeking to weaponize the US military’s credibility, Russia was also likely behind a 2023 leak of US military documents that had been modified to make Russian casualty figures look better than the originals showed.12

Although many Russian disinformation narratives may seem unlikely or even ridiculous to Western countries and allies, they may seem more plausible to audiences in countries already predisposed to dislike and distrust the West. The disinformation about bio-weaponized birds was not aimed at the West and is unlikely to affect Western public opinion. It is one of hundreds of Russian-origin fake news stories designed to sow distrust of Russia’s adversaries. When water falls on a rock, erosion occurs one drop at a time; the changes are imperceptible at first, but over time the rock is completely reshaped. Similarly, in spaces where the West is less likely to detect and contest its narratives, Russia employs disinformation like the bird story more aggressively. Where these narratives take root, American combatant commanders will likely find more resistance to their actions in their respective regions.

Russian support has propped up the Alexander Lukashenko regime in Belarus to control messaging and the electoral process. Lukashenko expresses his appreciation for these efforts by making Belarus a vassal state that serves Russian interests. Russian military forces have occupied portions of the Republic of Georgia for several years to buffer the Russian border and have sufficiently managed the information surrounding the operation to prevent worldwide outrage. In 2014, Russia used a combination of special operations forces and information operations to manipulate a referendum that forced Ukrainian citizens in Crimea to cede control of the strategically important Black Sea peninsula to Russia. Having effectively seized Crimea, Russia did not need a referendum to hold power there. Nonetheless, the façade of legitimacy represented an information operation targeting those who may have considered intervening to eject Russia by force or other means. Before and after the 2022 Ukraine invasion, Russia undertook similar operations within the Ukrainian Donbas and Luhansk regions along the border with Russia.13

Given the potency of Russian information operations, observers may expect a certain amount of “cognitive dissonance” within the population should Ukraine recover control of these places. The liberation of Russian-held territory is, therefore, more than a kinetic concern. Winning the fight will be insufficient; Russia understands that a population must also be convinced of its liberation, as demonstrated in its approach to Crimea, Donbas, Luhansk, and Transdniestria. Russia’s ability to wield information to achieve its strategic goals poses challenges for regional governance. Both inside and outside Russia, information manipulation challenges many vital aspects of governance, including transparency, accountability, inclusion, public participation, and the rule of law.14

Does Disinformation Work?

It may be that the American national security establishment has failed to focus on the disinformation problem because its effects are unquantifiable. Despite ample examples of Russian actions, cause and effect in the information environment are not always clear. Disinformation does not exist in a vacuum; it is one variable in the information environment that influences and is influenced by culture, previous events, bias and opinion, misinformation, and the observed truth. The same channels that deliver disinformation deliver all these factors, making it very difficult to associate a singular cause with an effect. Most disinformation campaigns are not designed to yield precise results but to create chaos, doubt, mistrust, and confusion. Clearly, Russia places great emphasis on shaping the environment in this manner. With the past as prologue, Russia will conduct information operations using disinformation in concert with more conventional power in future military operations, as it has attempted in Ukraine, with mixed results.

One way to assess effectiveness is to see whether disinformation enters public discourse. For many years, Russia claimed biological research facilities in Ukraine were centers for biological weapons production. In 2022, a viral Twitter (now X) hashtag promoting these claims led to a US Senate hearing. The claims were subsequently repeated on Tucker Carlson’s popular Fox News show, which used selectively clipped testimony to bolster them. The following day, Russian state television cited Carlson’s program. Following an explosion that damaged a gas pipeline in the Baltic Sea, Russian officials quickly lashed out at the United States, publicly accusing it of sabotage. Carlson also picked up that story, again bolstering Russia’s sensationalized version. Carlson had tremendous reach as the host of America’s most popular infotainment program. His Kremlin-friendly approach led to Putin’s first Western interview since the Russia-Ukraine War began, despite multiple previous requests from other journalists worldwide. Carlson’s programming offers a stark example of Russian disinformation creeping into American mainstream media, but there are also more subtle ones.15

Recent analysis revealed Russian promotion of anti-immigration themes in the United States to weaken support for Ukraine assistance funding. These efforts are evident in Sputnik and Russia Today broadcasts, on X (formerly Twitter), and on other Russian-affiliated social media sites. Russian efforts in the Western Hemisphere are not limited to the United States. A recently declassified US intelligence product demonstrated how Russia deliberately fed stories and talking points to media outlets across Latin America, a charge Russia quickly denied. To borrow the BBC’s term, each example represents a form of information laundering. Although challenging to quantify, Russian disinformation efforts seem to be sowing a measure of chaos and distrust in the United States at home and abroad. The Joint Force has a role to play in addressing this problem.16

Considerations for the Joint Force

The Joint Force has already addressed information as a critical feature of future conflict by acknowledging, in written doctrine, the value of information operations. In 2022, the US Department of Defense published Information in Joint Operations (Joint Publication 3-04). The Army followed with Information (Army Doctrine Publication 3-13) the next year. (The Navy is still developing its information operations community, while the Marine Corps and Air Force have their own supporting doctrine.) The Army publication cites the Joint document early on, stating, “[t]he essence of informational power is the ability to exert one’s will through the projection, exploitation, denial, and preservation of information in pursuit of objectives.” The development of specific doctrines means the Department of Defense and the services intend for commanders to use information in their plans and operations.17

The Army’s Information publication lists reasons Joint commanders should apply “informational power.” The first item, “[t]o operate in situations where the use of destructive or disruptive physical force is not authorized or is not an appropriate course of action,” deserves the greatest emphasis. Apart from US Central Command, with its continuing strikes in Syria and Yemen, no US combatant commander is currently using destructive physical force. Most commanders, most of the time, are not engaged in combat operations, leaving information, along with cyber and space efforts, among the few tools available for daily operations. As such, combatant commanders should allocate more of their organizations’ daily focus to the information environment. With that additional focus, combatant commands will be more likely to detect and understand disinformation campaigns directed toward the United States and its allies and to disarm them more rapidly and effectively.18

American information warriors can counter disinformation in real time by deliberately and aggressively employing truthful information. Critically, they must act rapidly to counter disinformation and even anticipate it, “pre-bunking” expected narratives. In the age of 24-hour news cycles and memes that seize attention and fade quickly, velocity matters. The current processes for preparing, clearing, and approving information releases are woefully slow and unresponsive, ceding the initiative to bad actors on the information battlefield. At the same time, combatant commanders must coordinate these efforts with other commands and interagency organizations that have the authority to act within the information space. This approach will require inviting those commands and organizations into the process at a level high enough for information releases to happen with appropriate rapidity.

The US intelligence community recognizes the need for change in this arena. While remaining mindful of protecting sensitive sources and methods, it has challenged previous paradigms by declassifying and releasing classified information to counter disinformation rapidly. When Russia initiated its invasion of Ukraine with lies about its reasons for massing combat power along the border, the United States and the United Kingdom partnered to share sensitive intelligence, including intercepted communications, demonstrating that Russia was misleading observers about its intentions. Since then, the United States has continued this approach on a limited basis, including providing a warning that Russia intended to employ chemical weapons in Ukraine.19

These releases are not without risk. Astute intelligence services like Russia’s can use them to find and close their own vulnerabilities to collection. For some in the intelligence community, this concept goes against long-held preferences to keep intelligence buried deep, lest the collection method get “burned” and become unusable. Still, the rapid declassification and release of select intelligence has been effective enough that commanders and leaders across the US government should continue to use this approach judiciously. It fits well with the idea that sharing the truth benefits the United States and the West. Note the word “judiciously” here. The intelligence community must continue to weigh risks to sources and methods and the secondary effects of publicizing the information. Nevertheless, by changing the intelligence community’s culture to value this approach, the United States can find additional success in the information contest.

Other US government efforts to counter disinformation have yielded mixed results. On April 24, 2022, the Biden-Harris administration launched an official “Disinformation Governance Board” within the Department of Homeland Security.20 Political opposition forced it to disband after 25 days, on May 18, 2022. The Foreign Malign Influence Center, approved by Congress and placed under the Office of the Director of National Intelligence, came to life in September 2022 and has so far been spared partisan flames.21 Its principal mission is coordination and analysis across the interagency and intelligence community, but it relies on other agencies to intervene against foreign actions.

In 2017, the FBI set up its Foreign Influence Task Force. Unlike the Foreign Malign Influence Center, the task force can direct and conduct investigations leading to criminal convictions that disrupt disinformation and other influence operations. In 2019, Maria Butina was sentenced to 18 months in prison for operating as an unregistered agent of Russia. Butina used Russian connections to access and influence American politicians associated with the National Rifle Association until an FBI investigation undid her efforts. Agents of influence like Butina represent more precisely directed information operations than those designed to create chaos. Joint Force authorities must collect and share information about these matters to support counter-disinformation efforts.22

Proposals to coordinate the pre-bunking of disinformation across US government agencies have limitations. This approach relies on using the truth to repair or prevent damage wrought by disinformation. Unfortunately, research indicates that challenging false narratives with the truth is largely ineffective. To overcome this limitation, the target audiences of disinformation campaigns must be better prepared to resist its effects. To this end, a new program supporting information literacy should be launched. Combatant commanders are charged with promoting various strategic programs within their areas of responsibility, and an information literacy program would be an important tool in their arsenals. The Association of College and Research Libraries defines information literacy as “the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning.” Information literacy is a growing area of interest within academia, given the increasing prevalence of faulty information sources, including certain AI-generated information.23

Combatant commanders are well positioned to champion this effort. The Joint Force has various options for delivering such an effort, either unilaterally or in partnership with other agencies, such as the US Agency for International Development. Alongside psychological operations professionals, US Army Civil Affairs personnel represent one potential source of forces to execute such a campaign alongside foreign partners. Charged with promoting the civic and social conditions necessary for Army operations, civil affairs practitioners work with foreign partners and populations on governance, infrastructure, and civil population concerns. They could be trained to roll out information literacy packages that inoculate foreign partners against infusions of disinformation.

In constructing this program, the United States should avoid the hubris of acting alone and should collaborate with allies and partners experienced in countering Russian disinformation. By combining best practices with the amplification uniquely available to the United States, this campaign may succeed over time. If telling authentic stories about the West’s conduct is a powerful tool of attraction, people must be able to absorb and accept those narratives. Partners prepared to evaluate information sources, challenge narratives, differentiate fact from fiction, and approach information with appropriate skepticism are less likely to fall for Russian fake news designed to dissuade them from working with the United States.

Conclusion

Russian disinformation is here to stay. The Soviet Union and Russia have relied on hybrid warfare and reflexive control throughout modern history. Disinformation is a critical tool for those strategies. Now that the need to join the information fight has been officially acknowledged, operational commanders must implement the Joint Force’s doctrine in the ongoing, daily information contest. Regular, timely, and accurate intelligence and information releases can blunt the effects of Russian operations and improve conditions for the United States worldwide. A well-designed global information literacy program would support this campaign.

This proposal may raise the ire of those who believe the US military’s only purpose is to fight and win kinetic wars. The Joint Force, however, is involved in many other activities to advance America’s strategic interests. The recent publication of information operations doctrine across the Joint Force reflects the realization that information is a critical tool for commanders. From this perspective, an information literacy campaign can be seen as battlefield preparation. Correctly done, this campaign will better prepare allies, partners, and others to accept truthful Western narratives and reject disinformation disseminated by adversaries.

 

Michael J. Kelley
Colonel Michael J. Kelley (US Army Reserve) is a military governance specialist. He is a distinguished graduate of the US Army War College and the Advanced Strategic Art Program.

Endnotes

  1. Hannah Gelbart, “The UK Company Spreading Russian Fake News to Millions,” BBC (website), April 4, 2023, https://www.bbc.com/news/world-65150030. Return to text.
  2. Quote from Headquarters, Department of the Army (HQDA), Information, Army Doctrine Publication 3-13 (Washington, DC: HQDA, November 2023), https://armypubs.army.mil/epubs/DR_pubs/DR_a/ARN39736-ADP_3-13-000-WEB-1.pdf. See also Randi Stenson, “Army Publishes First Doctrinal Manual Dedicated to Information,” U.S. Army (website), November 27, 2023, https://www.army.mil/article/271932/army_publishes_first_doctrinal_manual_dedicated_to_information. Return to text.
  3. Jack Watling, Oleksandr V. Danylyuk, and Nick Reynolds, “Preliminary Lessons from Russia’s Unconventional Operations during the Russo-Ukrainian War, February 2022–February 2023,” RUSI (website), March 29, 2023, https://rusi.org/explore-our-research/publications/special-resources/preliminary-lessons-russias-unconventional-operations-during-russo-ukrainian-war-february-2022; Keir Giles, “Russian Cyber and Information Warfare in Practice,” Chatham House (website), December 14, 2023, https://www.chathamhouse.org/2023/12/russian-cyber-and-information-warfare-practice; Mason Clark, Russian Hybrid Warfare, Military Learning and the Future of War series (Washington, DC: Institute for the Study of War, 2020), 11, http://www.understandingwar.org/sites/default/files/Russian%20Hybrid%20Warfare%20ISW%20Report%202020.pdf; and Christopher Paul and Miriam Matthews, The Russian “Firehose of Falsehood” Propaganda Model: Why It Might Work and Options to Counter It (Santa Monica, CA: RAND Corporation, 2016), https://www.rand.org/pubs/perspectives/PE198.html. Return to text.
  4. Mark Hollingsworth, Agents of Influence: How the KGB Subverted Western Democracies (London: Oneworld Publications, 2023). Return to text.
  5. Ilya Yablokov, “Russian Disinformation Finds Fertile Ground in the West,” Nature Human Behaviour 6, no. 6 (June 2022): 766–67, https://doi.org/10.1038/s41562-022-01399-3; and Iuliia Iashchenko, “Russian Disinformation in Europe: Justifying Violence and Spreading Propaganda,” Aspenia Online (website), December 14, 2023, https://aspeniaonline.it/russian-disinformation-in-europe-justifying-violence-and-spreading-propaganda. Return to text.
  6. John Merriam, “One Move Ahead—Diagnosing and Countering Russian Reflexive Control,” Journal of Slavic Military Studies 36, no. 1 (January 2023): 1–27, https://doi.org/10.1080/13518046.2023.2201113. Return to text.
  7. Paul Ratner, “The Past – The Primer on Russia’s ‘Active Measures,’ Its Information Warfare Strategy,” Big Think (website), April 7, 2017, https://bigthink.com/the-past/the-primer-on-russias-active-measures; Maya Vinokour, “Argument – Russia’s Media Is Now Totally in Putin’s Hands,” Foreign Policy (website), April 5, 2022, https://foreignpolicy.com/2022/04/05/russia-media-independence-putin; U.S. Department of State (DoS) Global Engagement Center (GEC), Kremlin-Funded Media: RT and Sputnik’s Role in Russia’s Disinformation and Propaganda Ecosystem, GEC Special Report (Washington, DC: DoS GEC, January 2022), https://www.state.gov/report-rt-and-sputniks-role-in-russias-disinformation-and-propaganda-ecosystem; and United States Attorney’s Office for the District of Columbia, “Indictment: United States of America v. Internet Research Agency, LLC, et al.” (United States District Court, District of Columbia, February 16, 2018), https://www.justice.gov/d9/fieldable-panel-panes/basic-panes/attachments/2018/02/16/internet_research_agency_indictment.pdf. Return to text.
  8. Iuliia Iashchenko, “Understanding Russian Disinformation Strategies inside and outside the Country,” Aspenia Online (website), July 4, 2023, https://aspeniaonline.it/understanding-russian-disinformation-strategies-inside-and-outside-the-country. Return to text.
  9. Alex Horton et al., “Launch of Russian Military Drills Stokes Fears of Preparations for Attack on Ukraine, as Diplomatic Sparring Continues,” Washington Post (website), February 10, 2022, https://www.washingtonpost.com/world/2022/02/10/ukraine-russia-putin-nato-belarus. Return to text.
  10. Sandra Knispel, “Fact-Checking Putin’s Claims That Ukraine and Russia Are ‘One People,’ ” News Center, University of Rochester (website), March 3, 2022, https://www.rochester.edu/newscenter/ukraine-history-fact-checking-putin-513812; Reuters, “Extracts from Putin’s Speech on Ukraine,” Reuters (website), February 21, 2022, https://www.reuters.com/world/europe/extracts-putins-speech-ukraine-2022-02-21; and NATO, “NATO Member Countries,” NATO (website), June 8, 2023, https://www.nato.int/cps/en/natohq/topics_52044.htm. Return to text.
  11. Reuters, “What Are the Minsk Agreements on the Ukraine Conflict?,” Reuters (website), February 21, 2022, https://www.reuters.com/world/europe/what-are-minsk-agreements-ukraine-conflict-2022-02-21; and NATO, “Setting the Record Straight: De-Bunking Russian Disinformation on NATO,” NATO (website), January 12, 2024, https://www.nato.int/cps/en/natohq/115204.htm. Return to text.
  12. Insikt Group, “Russian Information Operations Aim to Divide the Western Coalition on Ukraine,” Recorded Future (blog), July 7, 2022, https://www.recordedfuture.com/blog/russian-information-operations-divide-western-coalition-ukraine; and Phil Stewart, “Russia Likely behind U.S. Military Document Leak, U.S. Officials Say,” Reuters (website), April 7, 2023, https://www.reuters.com/world/us/russia-likely-behind-us-military-document-leak-us-officials-say-2023-04-07. Return to text.
  13. Antony J. Blinken, “Marking Fifteen Years since Russia’s Invasion and Occupation of Georgia” (press statement), DoS (website), August 7, 2023, https://www.state.gov/marking-fifteen-years-since-russias-invasion-and-occupation-of-georgia; and Pavel Polityuk, “Russia Holds Annexation Votes; Ukraine Says Residents Coerced,” Reuters (website), September 24, 2022, https://www.reuters.com/world/europe/ukraine-marches-farther-into-liberated-lands-separatist-calls-urgent-referendum-2022-09-19. Return to text.
  14. Giles, “Russian Cyber.” Return to text.
  15. DoS GEC, “The Kremlin’s Never-Ending Attempt to Spread Disinformation about Biological Weapons” (Washington, DC: DoS GEC, March 2023), https://www.state.gov/the-kremlins-never-ending-attempt-to-spread-disinformation-about-biological-weapons; Glenn Kessler, “How the Right Embraced Russian Disinformation about ‘U.S. Bioweapons Labs’ in Ukraine,” Washington Post (website), March 12, 2022, https://www.washingtonpost.com/politics/2022/03/11/how-right-embraced-russian-disinformation-about-us-bioweapons-labs-ukraine; David Klepper and Associated Press (AP), “Tucker Carlson Repeats Baseless Russian Talking Point That U.S. Was behind Gas Pipeline Sabotage,” Fortune Europe (website), October 1, 2022, https://fortune.com/europe/2022/10/01/tucker-carlson-russian-propaganda-pipeline-sabotage-fox-news; Dominick Mastrangelo, “Tucker Carlson Most Popular Individual Americans Follow for News: Survey,” Hill (website), May 24, 2023, https://thehill.com/homenews/media/4018546-tucker-carlson-most-popular-individual-americans-follow-for-news-survey; Gabrielle Settles, “Western Media Outlets Have Tried to Interview Putin, Contrary to Carlson Claim – Fact Check,” Yahoo! News (website), February 8, 2024, https://news.yahoo.com/western-media-outlets-tried-interview-232654575.html; and Yulia Talmazan and Phil Hesel, “Former Fox News Host Tucker Carlson Interviewed Russian President Vladimir Putin,” NBC News (website), February 8, 2024, https://www.nbcnews.com/news/world/tucker-carlson-interviewed-russian-president-vladimir-putin-rcna137420. Return to text.
  16. David Klepper, “Russian Disinformation Is about Immigration. The Real Aim Is to Undercut Ukraine Aid,” AP (website), March 1, 2024, https://apnews.com/article/russia-election-trump-immigration-disinformation-tiktok-youtube-ce518c6cd101048f896025179ef19997; and Jonathan Landay, “US Says Russia Funds Latin America-Wide Anti-Ukraine Disinformation Drive,” Reuters (website), November 8, 2023, https://www.reuters.com/world/us-says-russia-funds-latin-america-wide-anti-ukraine-disinformation-drive-2023-11-07. Return to text.
  17. Quote from HQDA, Information, 1-4. See also Mark Pomerleau, “DOD Publishes Revised Doctrine on Information,” DefenseScoop (website), October 7, 2022, https://defensescoop.com/2022/10/07/dod-publishes-revised-doctrine-on-information; and Bart D’Angelo, “Navy Information Operations: Time for a New Unrestricted Line Community,” Proceedings 149, no. 6 (June 2023), https://www.usni.org/magazines/proceedings/2023/june/navy-information-operations-time-new-unrestricted-line-community. Return to text.
  18. HQDA, Information, 1-5. Return to text.
  19. Joshua C. Huminski, “Russia, Ukraine, and the Future Use of Strategic Intelligence,” Prism 10, no. 3 (September 2023): 9–25, https://ndupress.ndu.edu/Media/News/News-Article-View/Article/3511951/russia-ukraine-and-the-future-use-of-strategic-intelligence; and Ken Dilanian et al., “In a Break with the Past, U.S. Is Using Intel to Fight an Info War with Russia, Even When the Intel Isn’t Rock Solid,” NBC News (website), April 6, 2022, https://www.nbcnews.com/politics/national-security/us-using-declassified-intel-fight-info-war-russia-even-intel-isnt-rock-rcna23014. Return to text.
  20. Geneva Sands, “DHS Shuts Down Disinformation Board Months after Its Efforts Were Paused,” CNN (website), August 25, 2022, https://www.cnn.com/2022/08/24/politics/dhs-disinformation-board-shut-down/index.html. Return to text.
  21. Office of the Director of National Intelligence (ODNI), “Foreign Malign Influence Center,” ODNI (website), n.d., accessed January 24, 2024, https://www.odni.gov/index.php/ncsc-what-we-do/340-about/organization/foreign-malign-influence-center. Return to text.
  22. FBI, “What We Investigate – Counterintelligence: Combating Foreign Influence,” FBI (website), n.d., accessed January 24, 2024, https://www.fbi.gov/investigate/counterintelligence/foreign-influence; and Ashraf Khalil and Chad Day, “Russian Maria Butina Gets 18 Months for Being Kremlin Agent,” AP (website), April 26, 2019, https://apnews.com/article/e997050bdd6b4932bce54db9555c49f4. Return to text.
  23. Association of College & Research Libraries (ACRL), Framework for Information Literacy for Higher Education (Chicago: ACRL, 2015), 26, https://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/framework1.pdf, as quoted in Ohio State University, “Information Literacy: Concepts and Teaching Strategies,” Teaching and Learning Resource Center (website), n.d., accessed January 24, 2024, https://teaching.resources.osu.edu/teaching-topics/information-literacy-concepts. See also Man-pui Sally Chan et al., “Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation,” Psychological Science 28, no. 11 (September 2017): https://doi.org/10.1177/0956797617714579. Return to text.
 

Disclaimer: Articles, reviews and replies, and book reviews published in Parameters are unofficial expressions of opinion. The views and opinions expressed in Parameters are those of the authors and are not necessarily those of the Department of Defense, the Department of the Army, the US Army War College, or any other agency of the US government. The appearance of external hyperlinks does not constitute endorsement by the Department of Defense of the linked websites or the information, products, or services contained therein. The Department of Defense does not exercise any editorial, security, or other control over the information you may find at these locations.