Sept. 16, 2025

Disinformation as Ground-Shifting in Great-Power Competition

Sorin Adam Matei
©2025 Sorin Adam Matei

ABSTRACT: Disinformation, distinct from misinformation, replaces accepted principles of objectivity and verifiability with novelty, framing, authority, self-reference, and conformity to create a new “truth paradigm.” This article introduces a novel definition and framework for understanding disinformation as a strategic tool in great-power competition. It includes a review of case studies, such as Russian disinformation campaigns during the Russia-Ukraine War, and analyzes cognitive biases and social behaviors that facilitate the spread of disinformation. Policy and military practitioners will find actionable insights into countering disinformation, including its sociopsychological mechanisms and proposed targeted counterstrategies to protect the integrity of information flows in defense and security contexts.

Keywords: disinformation, great-power competition, ground-shifting, misinformation, spiral of silence

 

Are disinformation and misinformation different? Current research about disinformation and military information operations doctrine distinguishes disinformation from misinformation based on harmful intentions and effects. This article broadens this distinction, proposing that disinformation has another, more crucial, feature: “ground-shifting,” which is its capacity to redefine reality. This perspective enhances the current understanding of disinformation and offers new strategies for countering it, which the article presents in detailed recommendations.1

Redefining disinformation complements conventional definitions, which emphasize intent while neglecting how the deception is realized. For instance, Information, Army Doctrine Publication (ADP) 3-13, defines disinformation as “incomplete, incorrect, or out-of-context information deliberately used to influence audiences” and to mislead with malicious intent. Conversely, it defines misinformation as “unintentional incorrect information from any source” disseminated due to ignorance or a mistaken belief that the information is correct. The Government Accountability Office (GAO) report issued in response to a congressional inquiry, Foreign Disinformation: Defining and Detecting Threats, notes similar distinctions. Disinformation differs from misinformation by producing “false or misleading information deliberately created or spread with the intent to deceive or mislead.” Neither document sufficiently explores the mechanisms by which deception and misdirection operate, though.2

This article fills that gap and reviews broader scientific literature to clarify and explain what disinformation does differently by analyzing how it changes target audiences’ perceptions of reality. It expands the GAO report’s definition, proposing that disinformation is false or misleading information deliberately created or spread to deceive or mislead by altering the perception of reality. This article explores the reality-changing aspect of disinformation, which, for reasons presented below, is called “ground-shifting,” to offer new strategies for countering it.

The definition, description, and recommendations for disarming disinformation also expand upon the analysis of Meghan Fitzpatrick et al. (from the Parameters Spring 2022 issue) of how disinformation employs emotionally charged techniques within our minds to create “alternate realities.” Nevertheless, psychology might not fully explain the impact of disinformation. Insights from the social sciences and communication studies can help practitioners understand better how disinformation influences its targets.3

This article uncovers the social and psychological processes through which disinformation creates alternate realities. It explores how disinformation uses framing and novelty-seeking strategies, transforming unusual perceptions of alternate realities into commonly accepted views. It tackles the core features of disinformation and its consequences, providing a clearer picture of disinformation’s complex nature, which operates on multiple levels. The definition of disinformation should not be limited to its outcomes, as clarified below.

Disinformation: The Deeper Story

It is important to revise how practitioners think about disinformation operations with defense and security implications because disinformation is commonly treated as an extreme or weaponized form of misinformation. Seen through the lens of weaponization, the distinction between dis- and misinformation becomes moot, since it is one of degree, not kind. Occasionally, the distinction notes the deliberate nature of disinformation. While necessary, such distinctions might be insufficient to capture the hidden power of disinformation because they omit the possibility that disinformation acts at a deeper level, disrupting cognitive processes that misinformation does not touch. Understanding disinformation more deeply has significant implications for imagining means to counter it, especially in great-power competitions with defense implications.4

When taken to the extreme, disinformation replaces reality with a subjective interpretation of facts by advancing its new criteria for judging what is real and what is not. When most effective, disinformation engages in ground-shifting by replacing accepted principles for judging public claims—objectivity and verifiability—with novelty, framing, normalizing incredible claims by authority or relying on individual interpretations (self-reference), and conformity created by spirals of silence. Normalization refers to making the unusual and incredible accepted and credible, while self-reference means using one’s interpretation of reality rather than reality based on objective criteria. The spiral of silence idea describes the tendency of individuals who perceive themselves as a minority, even if they are not, to remain silent in public discussions. This self-censoring may transform vocal minorities into emerging majorities.5

The claims made by Russian propaganda operators during the Russia-Ukraine War provide an example of disinformation. For instance, Russian propagandists claim the Ukrainian authorities and military fabricated the evidence for the Bucha massacres. The Ministries of Foreign Affairs and Defence of the Russian Federation claimed the “video footage from Bucha are another production by the Kiev regime for the Western media.” The Russian propaganda campaigns relied on their shock value to reverse what was known about the massacre. Russian propagandists framed sequences of the videos to suggest the bodies of the victims were actors, and they manipulated video of victims lying in the street to make the bodies appear to move, as if the scene had been staged. Other Ministry of Foreign Affairs of the Russian Federation communiques created perceived authority based on appeals to a global silent majority. The documents state the “special military operation” and its killings give voice to the world’s silent majority as a form of “civil disobedience.” The mix of cliches and contradictions—military operations are not civil—is intentional and conflates disparate concepts, such as “majority” and “civil disobedience.”6

Russian foreign influence operations used to good effect the claim that though 141 United Nations (UN) members condemned Russia’s attack on Ukraine, countries like China—representing more people than the European Union (EU)—refused to impose sanctions and continued to do brisk business with Russia. While this Russian assertion may overlook China’s internal motivations and actions, it creates the illusion of a majority by combining allies with differing agendas to try to sway parts of Western public opinion.7

Given the complexity of the methods employed in the disinformation campaigns above, it is essential to expand the definition of disinformation as it relates to security and defense situations to add novelty, framing, normalization by appeal to authority, and conformity created by spirals of silence and consent. These additions support the conclusion that disinformation might, in certain situations, shift the ground of perceived reality. This enhanced understanding of disinformation can help counter its effects in military or great-power competition. The two main goals of the article go beyond the expanded definition of disinformation. First, the article discusses disinformation’s reliance on false perceptions and deliberate distortions and how malicious great-power actors use it. Second, the article highlights practical opportunities presented by the new definition and theory of disinformation for countering it.

The Mechanisms of Disinformation

Disinformation campaigns rewrite reality by using new truth validation criteria, thereby engaging in ground-shifting. The information campaign conducted by Chinese official propaganda operations to deflect suspicions—some rooted in hard factual evidence—that the Wuhan Institute of Virology synthetically produced and accidentally released COVID-19 is an excellent example of an information operation that engages in ground-shifting. Instead of providing evidence that these suspicions are unfounded by identifying the animal that served as an intermediate host for the pandemic pathogen—a scientific goal yet to be accomplished—the Ministry of Foreign Affairs of the People’s Republic of China, through the voice of its spokesperson, Zhao Lijian, launched a media counteroffensive that shifted the conversation. He affirmed the virus was produced in the United States at a bioweapons lab at Fort Detrick, Maryland, and that the illness was brought to China during the Wuhan military games in October 2019, before the COVID-19 outbreak. The Global Times, a publication specializing in tabloid-style stories and controlled at a distance by the Chinese government for plausible deniability, amplified the story.8

Nevertheless, no hard evidence has been found to connect any American-funded or conducted research with COVID-19 viruses at US military facilities. No American participant attending the Wuhan games was ill while at the games. The Global Times used unrelated facts, including a maintenance order meant to address perceived shortcomings of the Fort Detrick wastewater system and an unrelated outbreak of pulmonary illness produced by vaping, to stitch together a story that had only audacity to validate its claims. Its appeal relied on the shock value of the counterattack, which Zhao Lijian executed in a series of tweets arguing that the disease was brought to China by US military personnel. Later, Zhao amplified the counterattack, promoting the idea that the American-funded bioweapon laboratories in Ukraine and Georgia could be used in future conflicts.9

The evidence invoked by Zhao or the Global Times consisted of reframed and reinterpreted known but unrelated facts, false authority-based evidence provided by Chinese or Russian media, and the testimony of anonymous experts who referred to the original claims in a circular fashion. These techniques demonstrate how disinformation builds claims on novelty-seeking, fact framing (reinterpretation), appeal to authority, and manipulating the perception of what constitutes the majority opinion to create conformity.10

The intrinsic characteristics of disinformation discussed above are significant because they generate different effects from those created by misinformation. These effects are:

  1. Disinformation operations create stories that overcome barriers imposed by logic and practicality with extraordinary claims whose appeal builds on the shock value of their novelty. Misinformation distorts facts and reality with ulterior and practical motives but always retains some relationship with related and verifiable facts.

  2. Disinformation creates a new perception of truth and new methods for validating it, including replacing objective criteria with subjective interpretation, appeal to authority, and conformity. Misinformation distorts information, but not the methods for validating it.

  3. Disinformation’s alternate version of reality refuses reason; it actively converts the audience to a new set of values and attitudes reinforced by appeals to conformity and selective sourcing based on subjective interpretation derived from personal and at times superficial observations. Misinformation operations do not expect to restructure the audience’s expectations and beliefs, even if their agents might wish they could. Changing some perceptions is sufficient for misinformation agents.

Essentially, at a deeper level, disinformation shifts the ground truth; it requires new premises and rules of evidence. Disinformation refuses a common ground for defining facts and carrying arguments. It is an anti-rhetorical attempt to disarm the adversary, denying the opposing side the right to argue against the reality-creating proposition put forth by the disinformation agent. Disinformation is, because of this pre-emptive technique, an important component of information operations.

As mentioned, disinformation aims to shift the ground under its audience’s feet by exploiting its cognitive and social biases for novelty seeking, framing, normalization by appeal to authority, and dissemination by conformism. The remaining sections will define and discuss these mechanisms and propose means to counter them.

Novelty Seeking

Humans are wired to seek out novelty. When processing information, we pay attention to what deviates from what we already know or contradicts our assumptions. Disinformation exploits this cognitive bias by offering content that dramatically differs from usual information flows. A study of Twitter (now X) information cascades revealed that false information spreads further and faster than accurate information when it departs significantly from what users already know. Novelty bias encouraged individuals to share content that shocked or surprised them.11

For example, Russian propaganda conducted a high-novelty disinformation campaign during the war against Ukraine by claiming that the United States had been conducting biological warfare research in Ukraine to deploy weapons of mass destruction against Russia. This extraordinary and unexpected news turned a previously known fact—the support offered by the US Biological Threat Reduction program to Ukraine, Russia, and other former Soviet nations—into a non-falsifiable, self-supporting claim. Turning one real piece of information into a completely different piece of news is a classical disinformation technique.12

In this case, while it is true that the United States funded laboratories in Ukraine, Kazakhstan, and other successor states of the Soviet Union, including Russia, it did so under the Biological or Nuclear Warfare Threat Reduction Program. This program aimed to keep the labs and scientists employed within a legal and peacebuilding framework. The support offered by the United States and other partners, such as the EU or UN-affiliated organizations, aimed to reduce or eliminate nuclear, biological, and chemical weapons. Absent a support program for the gradual decrease in the number and lethality of weapons of mass destruction in the former Soviet states, the labs and their scientists might have been lured to work for countries interested in building nuclear, chemical, or biological weapons capabilities, such as Iran, Libya, North Korea, or other countries nursing regional or global grievances. Despite being a former beneficiary of the program, Russia, supported by Chinese propaganda operations, reframed the program around the shocking claim that the United States was, in fact, conducting biological warfare research for its benefit in Ukraine. The story had no basis, yet it was hard to counter because the Russian side counted on the fact that the unusual nature of the claim served as its validation.13

This incident revives propaganda techniques first used in the Korean War. At the time, the Chinese, North Korean, and Russian operators went as far as inoculating prisoners with deadly diseases and burying them in areas temporarily occupied by the United States and recaptured by China and North Korea. The prisoners were disinterred in “fact-finding” campaigns “proving” American use of biological warfare. The Chinese and Soviet propaganda operators used an American covert fact-finding health mission behind enemy lines as another opportunity for disinformation. The mission, led by General Crawford F. Sams, a leading health expert and military physician, was sent in March 1951 to investigate the possibility of a plague outbreak behind the frontlines. If the mission found evidence of plague, the UN forces intended to develop a vaccine and administer it on a large scale to its troops and the civilian population. Fortunately, the mission recognized the instances of “black death,” as referred to by the locals, as a form of smallpox.14

The Chinese used the covert nature of US activities to claim that American health investigators had been dispatched to spread, not identify, the plague in question. Then, as now, the goal was to create a shock-value media event of such a magnitude that it would travel far and fast on account of its novelty alone, especially in nonaligned countries whose populations had limited access to education or nonlocal media. Notably, now, as in the past, Chinese official channels amplify disinformation directed at the United States by its allies.15

Counterstrategy

Disinformation often presents itself as a “revelation,” requiring faith rather than evidence. It is essential to reveal the faith-based nature of disinformation and address the emotional and social needs that drive individuals to accept it on faith.

Framing

Human minds do not deal with raw facts; they comprehend and interpret everything through language. Semantic choices are the most fundamental framing method. Framing is, thus, a way of shaping and giving meaning to facts by using specific verbal nuances, which influence how we see the world. Because of this, framing is central to how disinformation reshapes reality. More importantly, actions with moral significance are strongly influenced by the language we use to portray them. Framing thrives on language’s capacity to transform and explain the world when new situations appear. Framing is not a harmful process by itself; it is a necessary cognitive technology. We frame through metaphor whenever we encounter a new phenomenon. For example, a truly epic attempt to solve a difficult situation in one fell swoop, with massive investment and daring, is labeled with the metaphor “moonshot.”16

As a method of choice in international disinformation operations, framing uses ill-intended metaphors to change what is seen as real, how it is seen, and why. It is a patient game that requires attention to nuance. Framing campaigns are also difficult to deconstruct; refusing to engage their premises and ideas at all is the best way to reject them.17

An example of framing in international relations is Russia’s designation of the Ukrainian territories it has seized since 2014 with the handy metaphor of “people’s republics.” The Donetsk and Luhansk “people’s republics” were never more than metaphorical political entities. The term “republic” covers and legitimizes an entirely different reality: neo-feudal fiefdoms controlled by local and Russian operators backed by Russian Federation military power. Although, over time, a form of popular consent emerged due to the basic need to keep services going, the political reality was and is that the so-called republics were created by the intervention of former secret service officers like Igor Girkin and were run by local politicians closely aligned with Moscow, some of whom became fully integrated with the Russian political machine. For example, the leaders of the Donetsk and Luhansk political entities became members of United Russia, Vladimir Putin’s political party, in December 2021, before the Russian invasion of Ukraine.18

Framing can be more sophisticated, as was the case with the complex rationalizations of child deaths in the Middle East conflicts. The deaths of two boys killed in the conflict following the Hamas attack on Israel on October 7, 2023, one Palestinian and the other Israeli, were used as examples of how the other side tried to hide “reality.” In each case, the pictures of the dead bodies were recontextualized to suggest the scenes had been staged and acted. Although the two situations were not interchangeable, disinformation was used by social media actors in each case to reverse the meaning of verifiable facts and to justify morally challenging or indefensible actions.19

Counterstrategy

Combat disinformation by reframing conversations around verifiable facts. Use language that emphasizes truth-validation and disrupts the metaphors that disinformation relies on. Bringing conversations back to objective reality is essential, as is using terms that undermine false frames. According to UN General Assembly Resolution ES-11-1, the war against Ukraine is a war of aggression and should be presented as such. Specific incidents of civilian deaths should be presented in context, highlighting the moral cause and consequence of each death but also including an honest representation of the intentional or unintentional cause of the deaths.20

Normalization

Disinformation aims to normalize previously unacceptable ideas. To this end, it exploits the audience’s tendency for selective exposure, narrowing the information acquisition circle. The strategic placement of information in multiple media sources creates, in social media parlance, “rabbit hole effects” or, in scientific terms, “echo chambers.” Overreliance on opinion leaders who agree with one’s position also contributes to normalization.21

The moral interpretation of facts and acceptance of what seemed marginal and unusual as normal occur through a “two-step flow” of communication and selective exposure. We are more likely to accept the “new normal” descriptions or evaluations of human actions or concepts when we live in an increasingly insular world in which opinion leaders adopt, wittingly or not, the marginal and unusual ideas peddled as disinformation.22

The cascading process by which information moves in “two steps,” from media to influencers and from them to consumers, was first proposed in the decades following World War II to explain how people make electoral decisions and adopt new ideas. Studying how people chose candidates and how physicians adopted antibiotics, sociologist Elihu Katz—following in the footsteps of Paul Lazarsfeld, Bernard Berelson, and Hazel Gaudet—noticed the importance of the advice or attitudes of individuals that the target audience members considered “opinion leaders.” These were people whose opinions mattered not due to their official qualifications but to the frequency with which their opinions were solicited. In other words, these influencers, before the term took on its social media connotation, became opinion leaders through increased exposure to mass media.23

The intermediary role of influencers in projecting broader media messages has been replicated many times in the context of mass and social media. Multiple studies showed that viral information tends to be disseminated by a small group of individuals, following a Pareto distribution in which a small percentage of users is responsible for a majority (more than 50 percent) of communication and its effects.24
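To make the Pareto-style concentration claim concrete, the short Python sketch below computes what share of all messages the most active users produce. The activity counts are purely hypothetical and are not drawn from the cited studies.

# Minimal illustration of a Pareto-style concentration check on posting activity.
# The activity counts below are hypothetical and serve only to show the calculation.

def concentration(messages_per_user, top_fraction=0.2):
    """Share of all messages produced by the most active `top_fraction` of users."""
    counts = sorted(messages_per_user, reverse=True)
    top_n = max(1, int(len(counts) * top_fraction))
    return sum(counts[:top_n]) / sum(counts)

# Ten hypothetical users: a few heavy posters, many occasional ones.
activity = [120, 80, 15, 9, 7, 5, 4, 3, 2, 1]
print(f"Top 20% of users produced {concentration(activity):.0%} of messages")
# -> Top 20% of users produced 81% of messages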

A study analyzing posts from the early years of the medium also found that viral messages can be started even by influencers with small followings, opening the possibility that anyone can become an influencer. This theoretical perspective can be identified in practice in the Chinese social media campaigns in Taiwan, which focus on “small influencers” who share specific characteristics agreed upon by the communist government in Beijing.25

Influence, however, becomes even stronger when users selectively expose themselves to the sources of information that opinion leaders point toward. Selective exposure reinforces unusual and marginal ideas and pushes them to the center. People seek out and share content that matches their ideological affiliations. A Facebook study showed that conservatives and liberals surround themselves online with friends who share the same content they do, and their chances of being exposed to crosscutting party content are 20 percent for liberals and 35 percent for conservatives. More importantly, selective exposure seems to affect human choice more than algorithmically induced filter bubbles.26

The surprising increase in support for Russia among populist parties in the West after the start of the war against Ukraine is in part explainable by partisan echo chamber effects. Western populists embraced Russia more out of domestic opposition to their internal political enemies’ support for Ukraine than genuine concern for Putin’s war. New media conversations played an important role in narrowing the circle of knowledge about the significance of the war against Ukraine as they acted as shortcuts for information seeking. Opinion leaders often amplify marginal ideas until they become mainstream.27

Counterstrategy

Disarming the two-step flow and selective exposure is one of the most challenging tasks in countering disinformation—but not a hopeless one. Since selective exposure requires validating one’s opinions via interaction with influencers, who, according to the studies mentioned above, are often quite ordinary message carriers, a successful method is to encourage the emergence of local influencers. An effective method of growing local influencers could be “seeding and weeding.” This metaphor suggests that encouraging consistent and constant flows of micro-information (seeding), spread by local opinion carriers who express opinions that counter the ideas proposed by disinformation campaigners, may be an effective way to counter disinformation carriers (weeding) at the micro level where they operate. In international affairs, the public targeted by disinformation campaigns should be made aware of grassroots dissenting voices. Peer leaders are essential for countering malignant information campaigns. Seeding and growing local voices is a core requirement for creating resilient communities, which depend on local communication infrastructures to generate social capital and tackle external challenges. Countering normalization requires desaturating the field of the ideas promoted by the disinformation campaigners.28

Consolidating Disinformation by Conformity Seeking

Disinformation takes root and spreads through what communication research calls the “spiral of silence” and consent effect, whereby individuals, fearing social isolation, align with perceived majority opinions. Disinformation campaigns are particularly adept at creating the illusion that their views represent the majority, even when they do not. Illusory majorities foster a perception of isolation among the unconverted, who will align with the emerging majority out of a natural tendency to conform. Alexis de Tocqueville was the first author to identify the danger of conformity in public conversations. He talked about the danger of a “tyranny of the majority,” remarking that, “Dreading isolation more than the stigma of heresy, [many] professed to share the sentiments of the majority.”29

Elisabeth Noelle-Neumann built on this insight 120 years later by observing that mass media coverage of specific topics, even if not popular, creates pressure to conform out of the desire not to be left behind: “According to the social psychological mechanism here called ‘the spiral of silence,’ the mass media have to be seen as creating public opinion: they provide the environmental pressure to which people respond with alacrity, or with acquiescence, or with silence.” Her subtler point is that vocal public opinion minorities can become majorities by creating the perception that their position is about to become popular. Perceptions of being on the losing side of a debate, even if the side is still in the majority, make the majority members cautious about expressing their opinions in public. As more and more people become silent, perceived majorities become real ones.30

Noelle-Neumann explored the spiral of silence process in Germany during the 1970s, a time of high political polarization and terrorism. She observed that in conditions of high polarization, even perceptions that a particular opinion is on the increase can lead to alignment with what appears to be the winning side. Alignment with the prevailing public opinion drift was also noticeable during the COVID-19 crisis when even a perception that vaccines might not be as effective as initially thought led to opposition to them. Russian and Chinese disinformation campaigns effectively use claims of majority support to validate their actions. Russia, for example, often portrays itself as representing a global silent majority in its justifications for military aggression.31

The People’s Republic of China has repeatedly and astutely accompanied its claims related to the South China Sea with a successful information operation that converted a minority into an emerging majority supporting China’s occupation of the region. Asked in 2023 whether they sided with China or the United States in the South China Sea crisis induced by China’s occupation and militarization of the region, only 39 percent of the Southeast Asian public sided with China, even though China annexed large swaths of maritime territory, including in the exclusive economic zones (EEZs) of nations like Indonesia and Malaysia. In 2024, 51 percent of Southeast Asian public opinion indicated a preference for China if there were a conflict with the United States over this matter.32

The alignment is not, however, a matter of solid alliance, since in 2024, 50 percent of respondents indicated that they distrusted China, while only 38 percent distrusted the United States. Southeast Asian public opinion is potentially reacting to the US response to the multiple global crises it faced, interpreted either as a failure to respond or as an unwise use of force to prop up failing causes, specifically the war in Gaza. Yet, regardless of the deeper causes, Chinese propaganda succeeded in turning a minority into a majority through assiduous information campaigns that repeated the mantra of the silent majority.33

Counterstrategy

Counter the spiral of silence by revealing the actual size and nature of the opposition to disinformation. Publicize accurate surveys and data that show the actual majority opinion, undermining the false narrative of widespread support for disinformation.

Counter-Disinformation Strategic Policy Recommendations

Extreme disinformation distorts its target subjects’ perceptions of reality by utilizing core psychological and social mechanisms: novelty, framing, normalization, and conformity through spirals of silence. Each mechanism exploits specific vulnerabilities that undermine resistance to disinformation. Open societies and defense organizations can safeguard their information integrity only by comprehending and addressing the factors that permit disinformation to thrive. Based on the expanded definition of disinformation, its underlying mechanisms, and the tactical countermeasures outlined above, policymakers, military leaders, and national security experts focused on countering disinformation in great-power competition need to develop more specific and comprehensive methods of identifying and countering disinformation that engages in ground-shifting. In what follows, I propose recommendations for specific methods and strategies for countering disinformation, directly derived from the proposition that extreme disinformation engages in ground-shifting.

The recommendations partially overlap with existing practices and strategies. The propositions are more specific and grounded in an expanded theory of disinformation than the typical injunctions found in the information operations doctrines or governmental guidance. For instance, a review of relevant information operations doctrines, including Information, Army Doctrine Publication 3-13, Information in Joint Operations, Joint Publication (JP) 3-04, and Military Information Support Operations, JP 3-13.2, reveals that their main recommendations are at an abstract level. For example, ADP 3-13 emphasizes the importance of being truthful, timely, and legal when countering information operations. While data literacy is also noted, it remains a general statement. Continuing this trend with high-level advice, JP 3-04 recommends timely identification and public discussion of misinformation and disinformation to preempt and debunk adversarial narratives. The document states that counter-disinformation “activities [should] pre-empt, undermine, or counter adversary and enemy use of narratives, especially those that convey misinformation or disinformation. [T]he goal is to provide accurate and useful information to relevant actors in a timely manner to increase its credibility and relevance.” The only more specific recommendations are the use of non-specific computerized monitoring of disinformation and wargaming information operations.34

Expanding the review of common recommendations for combating disinformation in the GAO report Foreign Disinformation: Defining and Detecting Threats shows that governmental efforts to detect and counter disinformation focus on broad objectives and simple actions, like identifying and publicizing instances of disinformation. While the report mentions methods such as media monitoring, network analysis, and synthetic content detection, it fails to specify the distinct signatures of disinformation these methods might discover or effective strategies to combat it.

The Carnegie Endowment offers a more detailed analysis of effective anti-disinformation practices for maintaining international peace. It highlights vital counter-disinformation strategies, including supporting local journalism, fact-checking, labeling social media content, and adjusting recommendation algorithms. These practical strategies are grounded in observing their outcomes, marking a positive step forward. Nonetheless, the report lacks the necessary and sufficient criteria for proposing methods of identifying disinformation and new counter-disinformation techniques or measuring their effectiveness.

The recommendations below build on existing ideas and practices, providing more specific suggestions. This article justifies each through the theoretical framework discussed earlier and proposes specific implementation methods explaining how the recommendations expand upon and refine existing doctrine and policy on defense-related information operations. Ultimately, the recommendations complement standard practices while introducing a new level of specificity and theoretical backing.

Provide More Specific Definitions and Descriptions of Disinformation Mechanisms

While disinformation definitions are provided in current practice, the defining characteristics, including and especially ground-shifting, are not mentioned in the current information operations doctrine or governmental practices. Add the ground-shifting dimension to any method or metric for identifying disinformation.

Triage information operations, separating disinformation from misinformation, starting from the premise that disinformation engages in ground-shifting. Recognize that disinformation is a qualitatively different form of falsehood, actively creating a new version of reality rather than distorting existing facts. Keep in mind that while disinformation might have a harder time catching on, it is more difficult to dislodge once established. In medical terms, disinformation has low virality and high lethality: while misinformation is more like a severe case of the flu, disinformation is like Ebola. Criteria and metrics for disinformation should include measures of reality-perception shifting and departure from accepted reality. Create and use threat matrices centered on the disinformation mechanisms described above: novelty seeking, framing, normalization, and dissemination through conformity. Use advanced language models to detect these mechanisms automatically and assign a probability of departure from reality and effectiveness to specific disinformation narratives.
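One way such a threat matrix might be operationalized is sketched below in Python. The mechanism scores, field names, and triage threshold are hypothetical placeholders; in practice, the scores would come from analysts or an automated classifier, and the cutoff would be calibrated against validated cases.

# A minimal threat-matrix sketch scoring a narrative on the four mechanisms discussed
# above. All values and thresholds are illustrative, not operational standards.
from dataclasses import dataclass

MECHANISMS = ("novelty", "framing", "normalization", "conformity")

@dataclass
class NarrativeAssessment:
    narrative_id: str
    scores: dict  # mechanism name -> estimated probability in [0, 1]

    def ground_shift_index(self) -> float:
        # Average the four mechanism scores into a single departure-from-reality index.
        return sum(self.scores.get(m, 0.0) for m in MECHANISMS) / len(MECHANISMS)

    def triage(self, threshold: float = 0.6) -> str:
        # Above the threshold, treat the narrative as ground-shifting disinformation.
        return "disinformation" if self.ground_shift_index() >= threshold else "misinformation"

# Hypothetical assessment of a single narrative.
claim = NarrativeAssessment(
    narrative_id="biolabs-claim",
    scores={"novelty": 0.9, "framing": 0.8, "normalization": 0.6, "conformity": 0.7},
)
print(claim.narrative_id, claim.triage())  # -> biolabs-claim disinformation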

Disrupt Novelty Seeking

Expose hype and faith-based claims. Enhance generic guidelines for identifying and publicizing disinformation. Include in the relevant publications a proactive approach to immediate response strategies to address extraordinary claims made by disinformation campaigns. Employ the “naked emperor” technique to counter disinformation’s extraordinary and faith-based characteristics. Without extraordinary evidence, these claims typically undermine themselves.

Map out potential extraordinary claims by closely monitoring social and traditional media. Prepare immediate response templates that expose the true nature of disinformation: an effort to construct reality solely through shocking claims. Avoid engaging in a piecemeal counterargument of these claims, as this can inadvertently lend legitimacy to the scant evidence presented by propagandists. For instance, the doctored Bucha videos should be accurately labeled as fabrications rather than dissected into arguments about their validity. Adopting this approach will strengthen our capacity to combat disinformation effectively and ensure a clear, unified stance in our communication efforts.

Preempt novelty. Expand the generic recommendations provided by documents such as ADP 3-13 for the timely and truthful release of information to provide truthful, well-contextualized information about controversial events proactively and anticipatorily, reducing the space for disinformation to introduce shocking claims later on. The accusations regarding US-funded biolabs in Ukraine, for example, could have been countered by more frequent and open publicity around the threat reduction activities in Eastern Europe, or by winding down support before the conflict, since Ukraine no longer presented a proliferation threat.

Run audits of potential exposure to novelty-seeking stories by identifying presence, events, and situations that could put US troops, interests, or international activities in an exploitable situation. Increase the degree to which operational security is maintained and ensure that sensitive operations are not exposed to open-source investigations while preemptively informing the public about activities and presence that could be framed maliciously.

Counter-Framing

Reframe objectively. Framing is not currently recognized as a specific method of disinformation. Doctrine documents should generalize and institutionalize framing assessment in disinformation analysis. Identify the most likely vectors for framing offensives, such as storylines, sensitive topics or individuals, and tropes the adversaries could weaponize. When framing attacks occur, use neutral, fact-based language to redirect conversations toward verifiable truths. Use counter-framing instead of disassembling the enemy frames. For instance, call actions like the invasion of Ukraine by their legal and moral definition (for example, aggression) but do not engage in supplementary explanations as to why the Russian phrase “special military operation” is invalid.

Expose framed narratives. Identify specific narratives distorted by framing and out-of-context use and launch counternarratives that present the same facts without the adversarial frames. Do not reference the framed narratives directly, as this feeds the adversary influence operations. The Chinese claims that Fort Detrick engages in biological weapons research should be countered by a blitz media campaign highlighting the actual mission of the United States Army Medical Research Institute of Infectious Diseases, which is to prevent biological attacks against the United States. It should highlight the role of Fort Detrick and other defensive labs in countering the anthrax attacks post 9/11 and in preventing the spread of diseases such as Ebola in Africa. Release stories and fact sheets that remind the world that President Richard Nixon renounced US research on offensive biological weapons. Expose Chinese and Russian violations of international agreements related to biological weapons research and proliferation using selectively released classified information.

Combat Normalization

Reduce selective exposure and echo chambers. None of the doctrine documents or relevant literature mentioned above emphasizes the role of selective exposure, subjective interpretation of reality, and echo chambers as necessary mechanisms for making the alternate realities created by disinformation accepted, normalized narratives. It is necessary to name and address them as core elements of disinformation processes. Echo chambers appear when individuals selectively expose themselves to information that confirms their biases. Each adversarial information campaign activates multiple sources of information and storylines, directly and indirectly. Foster diverse perspectives that disagree with your adversary. Complete alignment with your position is not necessary if the combined effect of these diverse perspectives weakens adversary claims. Empower credible local influencers to counteract biased narratives within target communities. Iranian anti-Western narratives, for example, can often be countered by stimulating Middle Eastern voices that see the danger a Shia apocalyptic war poses to the stability and well-being of the Middle East as a whole.

Monitor federated sources. Identify and neutralize networks by exposing them and conducting media campaigns that reveal the real identities of the agents behind coordinated disinformation across platforms. Revealing the sources of disinformation may disrupt their capacity to create echo chambers and act in the shadows.

Undermine Conformity Induced by the Spiral of Silence

Publicize real majorities. Use accurate data and opinion surveys to challenge the perception that the narratives promoted by disinformation represent a dominant viewpoint. For example, public opinion in Southeast Asia is deeply divided about China’s role in the region. Disarm claims that the locals see the “far enemy” (the United States) as the real culprit in the crisis by activating those local voices that fear China more, especially in Indonesia, Malaysia, and Vietnam.

Encourage open dialogue. Promote open, transparent discussions, including opinions you do not directly agree with, by which dissent against disinformation is visible and supported.

Employ Technological and Strategic Tools

AI detection. Use artificial intelligence to identify disinformation patterns and trace their sources. With the advent of large language models (LLMs), content summarization and theme detection have become much easier and more precise. Using AI to generate counternarratives is another avenue to reduce the impact of adversarial storylines.
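As a minimal illustration of automated pattern detection, the Python sketch below uses simple token-overlap (Jaccard) similarity rather than a language model to flag near-duplicate posts that may indicate a coordinated narrative. The sample posts and the similarity threshold are hypothetical.

# Flag pairs of posts with heavily overlapping wording; such clusters merit source tracing.

def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity between two texts, from 0 (disjoint) to 1 (identical)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if (ta or tb) else 0.0

posts = [
    "US biolabs in Ukraine are preparing weapons of mass destruction",
    "weapons of mass destruction are being prepared in US biolabs in Ukraine",
    "local elections were held peacefully across the region yesterday",
]

for i in range(len(posts)):
    for j in range(i + 1, len(posts)):
        if jaccard(posts[i], posts[j]) > 0.5:
            print(f"possible coordinated pair: posts {i} and {j}")
# -> possible coordinated pair: posts 0 and 1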

Rapid response teams. Establish units addressing emerging disinformation narratives in real time, using threat matrices and AI-based social media monitoring tools.

Information resilience training. Train personnel and allied forces to recognize and counter disinformation at tactical and strategic levels. Provide response templates for the most frequently encountered situations, both defensive and offensive. Offensive plans should be conceived within the limits of US law and ethics. Regardless, they should not shy away from calling the adversary’s disinformation campaigns what they are: blatant attempts to gain an advantage by changing the ground on which the conversation occurs.

Strengthen Public and Partner Engagement

Learn from the locals and build trust. Establish partnerships with media organizations and civil society in third countries targeted by common adversaries to create a robust information environment resistant to manipulation. The selection of allies should be based on common strategic interests. The partners should have just as much at stake as the United States. Organize listening and learning sessions with the partners to learn from them about sensitive topics and vulnerable publics. Create joint operations that advance both your goals and theirs, even if they are only partially convergent. Enhance and share threat matrices, social media monitoring, listening platforms, and AI disinformation discovery tools with the partners. Western nonprofit assistance teams in Ukraine currently employ some of these recommendations. The model can be expanded to other regions, especially Africa, the Middle East, and Southeast Asia.

Educational campaigns. Conduct public awareness initiatives that teach critical thinking and digital literacy, reducing susceptibility to disinformation.

Leverage International Frameworks

Collaborate with trusted international organizations rooted in common values and ideas to create unified standards for identifying and countering disinformation by creating and sharing threat and threat-reduction matrices. An example to be emulated in the space of disinformation is the Krach Institute Trusted Tech Standard (xGTT). The xGTT Standard specifies the criteria for countries, companies, and organizations to be labeled as purveyors of trusted technology based on an evaluation across three key technology pillars: development, deployment, and governance. A similar approach could be used for certifying trusted content or for identifying and labeling untrusted disinformation.35
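As a rough illustration of how a pillar-based certification approach could translate to content sources, the Python sketch below scores a source across three hypothetical pillars and certifies it only if every pillar clears a minimum bar. The pillar names, scores, and threshold are illustrative and are not part of the xGTT Standard.

# Hypothetical pillar-based certification for content sources, loosely modeled on the
# three-pillar evaluation idea; pillar names and thresholds are illustrative only.

PILLARS = ("provenance", "editorial_governance", "track_record")

def certify(source_scores: dict, minimum: float = 0.7) -> bool:
    """Certify a source only if every pillar meets the minimum score."""
    return all(source_scores.get(p, 0.0) >= minimum for p in PILLARS)

outlet = {"provenance": 0.9, "editorial_governance": 0.8, "track_record": 0.75}
print("trusted" if certify(outlet) else "not certified")  # -> trusted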

Build Trust Mechanisms between Counter-Disinformation Sources and Target Audiences

Create trust bridges with target audiences. Mistrust in governments, armed forces engaged in conflict, intelligence organizations, and other official sources is high across the entire world. Before deploying counter-disinformation messages, the sources should create trust bridges with target audiences. These should include identifying and partnering with multiple channels trusted by the audiences affected by disinformation, checking the validity of counter-disinformation messages, including raw data and third-party observations, and engaging in dialogue with the target audience. Timeliness and openness of disclosure remain important methods and values in counter-disinformation strategies.

The recommendations provided create a suite of disinformation diagnosis, intervention, and assessment procedures that start from theory-derived principles and go beyond directly intuitive ideas. The goal is to expand counter-disinformation by recognizing one of disinformation’s most malicious potential effects: replacing factually verifiable perceptions of reality with subjective ideas that rely on the emotional needs of social conformity and validation. The recommendations disarm these mechanisms through appropriate measures that address disinformation’s characteristics and the sociopsychological factors that facilitate it. By understanding and countering disinformation through these strategies, military and policy planners can safeguard the integrity of their operations and influence campaigns and reinforce resilience in the broader information ecosystem.

Conclusion

This article redefines disinformation not merely as false information, but as a “ground-shifting” technique that may fundamentally alter our perception of reality. This technique taps sociopsychological mechanisms that exploit socio-cognitive biases. Disinformation builds on novelty-seeking to introduce shocking, faith-based claims that bypass logic. It employs linguistic framing through metaphors that reinterpret facts and legitimize alternative realities. Disinformation also thrives on normalization through selective exposure and echo chambers, which funnels individuals into segregated information cocoons that amplify ground-shifting. Finally, the spiral of silence and consent creates illusory majorities, pressuring individuals who fear social ostracism, and consolidates disinformation.

If we accept that disinformation is a “ground-shifting” phenomenon, counter-disinformation strategies should also be reconceptualized and redesigned. We must move beyond simply debunking falsehoods to actively exposing the underlying mechanisms that redefine truth itself. Future work should focus on developing precise metrics for detecting ground-shifting, preempting novelty-seeking narratives, disrupting echo chambers by fostering diverse local voices, and publicly challenging fabricated majorities.

Future research should also empirically validate the effectiveness of the proposed counter-disinformation strategies, particularly through experimental and real-life observational research. Further interdisciplinary studies are also crucial to deepen the understanding of how sociopsychological factors interact with influence operations that utilize bots and AI-generated content to facilitate ground-shifting. Moreover, exploring the long-term societal impacts of ground-shifting disinformation should guide interventions aimed at strengthening information resilience.

 
 

Sorin Adam Matei
Dr. Sorin Adam Matei is the associate dean of research and graduate education in the College of Liberal Arts at Purdue University, a professor of communication in the Brian Lamb School for Communication at Purdue University, and the director of the FORCES (4S – Strategy, Security, and Social Systems) Initiative dedicated to defense technology studies. He holds a PhD from the University of Southern California with a focus on human-computer interaction, a master of arts degree in international relations from Tufts University, and a bachelor of arts degree in history and philosophy from Bucharest University. His primary research interests are interdisciplinary, including human-technology and AI interaction with defense applications.

 
 

Disclaimer: Articles, reviews and replies, review essays, and book reviews published in Parameters are unofficial expressions of opinion. The views and opinions expressed in Parameters are those of the authors and are not necessarily those of the Department of War, the Department of the Army, the US Army War College, or any other agency of the US government. The appearance of external hyperlinks does not constitute endorsement by the Department of War of the linked websites or the information, products, or services contained therein. The Department of War does not exercise any editorial, security, or other control over the information readers may find at these locations.

 
 

Endnotes

  1. Headquarters, Department of the Army (HQDA), Information, Army Doctrine Publication (ADP) 3-13 (HQDA, November 2023), 1–10; Joint Chiefs of Staff (JCS), Military Information Support Operations, Joint Publication (JP) 3-13.2 (JCS, 2024); and JCS, Information in Joint Forces, JP 3-04 (JCS, 2022). Return to text.
  2. US Government Accountability Office (GAO), Foreign Disinformation: Defining and Detecting Threats, GAO-24-107600 (GAO, September 26, 2024), 8, https://www.gao.gov/assets/gao-24-107600.pdf. Return to text.
  3. All sentences that refer to disinformation or misinformation as subjects are stylistic conventions meant to make the reading more fluid. In each instance, disinformation refers, of course, to campaigns or operations and should not be interpreted that disinformation as a concept has its own volition or agency. Meghan Fitzpatrick et al., “Information Warfare: Lessons in Inoculation to Disinformation,” Parameters 52, no. 1 (Spring 2022), https://press.armywarcollege.edu/parameters/vol52/iss1/9/; and Elena Broda and Jesper Strömbäck, “Misinformation, Disinformation, and Fake News: Lessons from an Interdisciplinary, Systematic Literature Review,” Annals of the International Communication Association 48, no. 2 (June 2024). Return to text.
  4. “Foreign Influence Operations and Disinformation,” Cybersecurity & Infrastructure Agency (CISA), 2024; Benjamin A. Lyons, “Misinformation, Disinformation, and Online Propaganda,” in Social Media and Democracy, ed. Nathaniel Persily and Joshua A. Tucker (Cambridge University Press, 2020); and Samuel Spies, “Defining ‘Disinformation’ ” (lecture at “Battling Disinformation with Realistic Solutions in a Modern Society,” National Security Program, Foreign Policy Research Institute, June 2020). Return to text.
  5. Dietram A. Scheufele, “Spiral of Silence Theory” in The SAGE Handbook of Public Opinion Research, ed. Wolfgang Donsbach and Michael W. Traugott (Sage, 2008); and Elisabeth Noelle-Neumann, The Spiral of Silence: Public Opinion—Our Social Skin (University of Chicago Press, 1993). Return to text.
  6. Amanda Seitz and Arijeta Lajka, “Amid Horror in Bucha, Russia Relies on Propaganda and Disinformation,” PBS News, April 6, 2022, https://www.pbs.org/newshour/world/amid-horror-in-bucha-russia-relies-on-propaganda-and-disinformation; Robert Mackey, “Russian TV Is Filled with Images of Bucha’s Dead, Stamped with the Word ‘Fake,’ ” The Intercept, April 12, 2022, https://theintercept.com/2022/04/12/bucha-massacre-russia-tv-fake-ukraine-war/; Minoborony Rossii (Минобороны России), “Russian Defence Ministry Denies Accusations of Kiev Regime of Allegedly Killing Civilians in Bucha, Kiev Region,” Facebook, April 3, 2024, https://www.facebook.com/mod.mil.rus/posts/3197015560541178; Arijeta Lajka and Sophia Tulp, “Video Does Not Show Staged Bodies in Bucha,” AP News, April 4, 2022, https://apnews.com/article/fact-checking-847134637960; and Sergei Karaganov, “Russia’s Policy toward World Majority,” Russian Federation: Council on Foreign and Defence Policy (2023), https://www.mid.ru/upload/medialibrary/c98/cjmfdf73760bme0y99zqllj51zzllrvs/Russia%E2%80%99s%20Policy.pdf. Return to text.
  7. “Russia Can Count on Support from Many Developing Countries,” Economist Intelligence Unit, March 30, 2022, https://www.eiu.com/n/russia-can-count-on-support-from-many-developing-countries/. Return to text.
  8. Alina Chan and Matt Ridley, Viral: The Search for the Origin of COVID-19 (Fourth Estate, 2021); Steven Lee Myers, “China Spins Tale That the U.S. Army Started the Coronavirus Epidemic,” The New York Times, updated July 7, 2021, https://www.nytimes.com/2020/03/13/world/asia/coronavirus-china-conspiracy-theory.html; Fan Lingzhi et al., “Suspect No. 1: Why Fort Detrick Lab Should Be Investigated for Global COVID-19 Origins Tracing,” Global Times, June 28, 2021, https://www.globaltimes.cn/page/202106/1227219.shtml; and “China’s Global Times Plays a Peculiar Role: It Is Unfashionable in China to Take the Fiery Tabloid Seriously,” The Economist, September 20, 2018, https://www.economist.com/china/2018/09/20/chinas-global-times-plays-a-peculiar-role. Return to text.
  9. Myers, “China Spins Tale”; “Online Petition for Fort Detrick Probe Draws 20m Signatures; China Urges US to Open UNC Lab, Disclose Military Games Patients,” Global Times, July 30, 2021, https://www.globaltimes.cn/page/202107/1230123.shtml; Lijian Zhao 赵立坚 [@zlj517], “2/2 CDC Was Caught on the Spot. When Did Patient Zero Begin in US? How Many People Are Infected? What Are the Names of the Hospitals? It Might Be US Army Who Brought the Epidemic to Wuhan. Be Transparent! Make Public Your Data! US Owe Us an Explanation!”; and Lijian Zhao, “Foreign Ministry Spokesperson Zhao Lijian’s Regular Press Conference,” March 16, 2022, http://gy.china-embassy.gov.cn/eng//fyrth/202203/t20220316_10652302.htm. Return to text.
  10. “Online Petition for Fort Detrick.” Return to text.
  11. D. E. Berlyne, “Novelty, Complexity, and Hedonic Value,” Perception & Psychophysics 8 (September 1970); Laurent Itti and Pierre Baldi, “Bayesian Surprise Attracts Human Attention,” Vision Research 49, no. 10 (June 2009); Sorin Adam Matei and Lucas Hunter, “Data Storytelling Is Not Storytelling with Data: A Framework for Storytelling in Science Communication and Data Journalism,” The Information Society: An International Journal 37, no. 5 (August 2021); and Soroush Vosoughi et al., “Automatic Detection and Verification of Rumors on Twitter,” ACM Transactions on Knowledge Discovery from Data 11, no. 4 (November 2017). Return to text.
  12. “The Ministry of Defense Has Uncovered a Network of Biolabs in Ukraine That Were Working on Orders from the Pentagon,” Topwar.ru, March 7, 2022. https://topwar.ru/193207-minoborony-vskrylo-na-ukraine-set-biolaboratorij-rabotavshih-po-zakazu-pentagona.html; David R. Franz et al., The Biological Threat Reduction Program of the Department of Defense: From Foreign Assistance to Sustainable Partnerships (The National Academies Press, 2007), https://nap.nationalacademies.org/read/12005/chapter/1#ix. Return to text.
  13. Olga Robinson et al., “Ukraine War: Fact-Checking Russia’s Biological Weapons Claims,” BBC, March 15, 2022; Franz et al., Biological Threat Reduction; and Robinson et al., “Ukraine War.” Return to text.
  14. Conrad C. Crane, “Korean War Biological Warfare Allegations against the United States: A Playbook for the Current Crisis in Ukraine” (US Army War College Press, 2022), https://press.armywarcollege.edu/articles_editorials/458/; Robinson et al., “Ukraine War”; Conrad C. Crane, “Chemical and Biological Warfare during the Korean War: Rhetoric and Reality,” Asian Perspective 25, no. 3 (2001); and Jonathan D. Clemente, “Medical Intelligence Mission to North Korea, March 1951,” The Intelligencer: Journal of U.S. Intelligence Studies 28, no. 1 (Winter-Spring 2023). Return to text.
  15. Clemente, “Medical Intelligence”; and Zhao, “Foreign Ministry.” Return to text.
  16. Anton Angwald and Charlotte Wagnsson, “Disinformation and Strategic Frames: Introducing the Concept of a Strategic Epistemology towards Media,” Media, Culture & Society 46, no. 7 (2024), https://doi.org/10.1177/01634437241265045; Dietram A. Scheufele, “Agenda-Setting, Priming, and Framing Revisited: Another Look at Cognitive Effects of Political Communication,” Mass Communication and Society 3, no. 2-3 (2000), https://doi.org/10.1207/S15327825MCS0323_07; David H. Weaver, “Thoughts on Agenda Setting, Framing, and Priming,” Journal of Communication 57, no. 1 (March 2007), https://academic.oup.com/joc/article-abstract/57/1/9/4102624?redirectedFrom=fulltext; George Lakoff, The Political Mind: Why You Can’t Understand 21st-Century Politics with an 18th-Century Brain (Viking, 2008); and Gideon Keren, ed., Perspectives on Framing (Routledge, 2011). Return to text.
  17. Paul H. Thibodeau and Lera Boroditsky, “Metaphors We Think With: The Role of Metaphor in Reasoning,” PLoS ONE 6, no. 2 (February 2011), https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0016782; Maria Hellman, “Narrative Analysis and Framing Analysis of Disinformation,” in Security, Disinformation and Harmful Narratives: RT and Sputnik News Coverage about Sweden, https://link.springer.com/chapter/10.1007/978-3-031-58747-4_4; and Dennis Chong and James N. Druckman, “Counterframing Effects,” The Journal of Politics 75, no. 1 (January 2013), https://www.journals.uchicago.edu/doi/abs/10.1017/S0022381612000837. Return to text.
  18. Tomasz Piechal, “The War Republics in the Donbas One Year after the Outbreak of the Conflict” (OSW Centre for Eastern Studies, 2015); and Nataliia Kasianenko, “Internal Legitimacy and Governance in the Absence of Recognition: The Cases of the Donetsk and Luhansk ‘People’s Republics,’ ” Ideology and Politics 1, no. 12 (2019), https://facpub.library.fresnostate.edu/items/show/99. Return to text.
  19. Marianna Spring, “Omer and Omar: How Two 4-Year-Olds Were Killed and Social Media Denied It,” BBC, October 25, 2023, https://www.bbc.com/news/world-middle-east-67206277; and Dietram A. Scheufele and David Tewksbury, “Framing, Agenda Setting, and Priming: The Evolution of Three Media Effects Models,” Journal of Communication 57, no. 1 (2007): 9–20, https://doi.org/10.1111/j.0021-9916.2007.00326.x. Return to text.
  20. United Nations (UN), Aggression against Ukraine, G.A. Res. ES-11-1 (UN, March 2, 2022), https://www.securitycouncilreport.org/un-documents/document/a-res-es-11-1.php. Return to text.
  21. Carlos Diaz Ruiz and Tomas Nilsson, “Disinformation and Echo Chambers: How Disinformation Circulates on Social Media through Identity-Driven Controversies,” Journal of Public Policy & Marketing 42, no. 1 (May 2022), https://doi.org/10.1177/07439156221103852. Return to text.
  22. Elihu Katz, “The Two-Step Flow of Communication: An Up-to-Date Report on a Hypothesis,” Public Opinion Quarterly 21, no. 1 (Spring 1957): 61–68. Return to text.
  23. Paul F. Lazarsfeld et al., The People’s Choice: How the Voter Makes Up His Mind in a Presidential Campaign (Duell, Sloan and Pearce, 1944); Katz, “Two-Step Flow of Communication,” 61–68; Sorin Adam Matei, “Is Two-Step Flow Theory Still Relevant for Social Media Research?,” matei.org, July 30, 2010, https://matei.org/ithink/2010/07/30/is-stop-step-flow-theory-still-relevant-for-social-media-research/; and Gabriel Weimann, “Communication, Two-Step Flow of Communication,” International Encyclopedia of the Social & Behavioral Sciences, 2nd ed. (Elsevier, 2015), 291–93. Return to text.
  24. Sorin Adam Matei and Robert J. Bruno, “Pareto’s 80/20 Law and Social Differentiation: A Social Entropy Perspective,” Public Relations Review 41, no. 2 (June 2015), https://www.sciencedirect.com/science/article/abs/pii/S0363811114001787. Return to text.
  25. Eytan Bakshy et al., “Everyone’s an Influencer: Quantifying Influence on Twitter,” WSDM ’11: Proceedings of the Fourth ACM International Conference on Web Search and Data Mining (February 2011); and Tzu-Chieh Hung and Tzu-Wei Hung, “How China’s Cognitive Warfare Works,” Journal of Global Security Studies 7, no. 4 (December 2022), https://academic.oup.com/jogss/article/7/4/ogac016/6647447. Return to text.
  26. Silvia Knobloch-Westerwick, Choice and Preference in Media Use (Routledge, 2015); Dominic Spohr, “Fake News and Ideological Polarization: Filter Bubbles and Selective Exposure on Social Media,” Business Information Review 34, no. 3 (August 2017); Eytan Bakshy et al., “Exposure to Ideologically Diverse News and Opinion on Facebook,” Science 348, no. 6239 (May 2015); and Andrew Guess et al., “Less Than You Think: Prevalence and Predictors of Fake News Dissemination on Facebook,” Science Advances 5, no. 1 (January 2019), https://www.science.org/doi/10.1126/sciadv.aau4586. Return to text.
  27. William G. Nomikos et al., “American Social Media Users Have Ideological Differences of Opinion about the War in Ukraine,” Humanities and Social Sciences Communications 12, no. 1 (February 2, 2025): 125, https://doi.org/10.1057/s41599-024-04304-7; Yvonne Wingett Sanchez and Abigail Hauslohner, “Top Republican Warns Pro-Russia Messages Are Echoed ‘on the House Floor,’ ” The Washington Post, updated April 7, 2024, https://www.washingtonpost.com/politics/2024/04/07/russian-propaganda-republicans-congress/; Michael Dimock and Richard Wike, “America Is Exceptional in the Nature of Its Political Divide,” Pew Research Center, November 13, 2020, https://www.pewresearch.org/short-reads/2020/11/13/america-is-exceptional-in-the-nature-of-its-political-divide/; Stefan Borg, “ ‘A Battle for the Soul of This Nation’: How Domestic Polarization Affects US Foreign Policy in Post-Trump America,” International Journal 79, no. 1 (February 2024), https://doi.org/10.1177/00207020241232986; and Weimann, “Communication.” Return to text.
  28. Sorin Matei and Sandra J. Ball-Rokeach, “The Internet in the Communication Infrastructure of Ethnically-Marked Neighborhoods: Meso or Macro-Linkage?” Journal of Communication 53, no. 4 (2003): 642–57. Return to text.
  29. Elisabeth Noelle-Neumann, “The Spiral of Silence: A Theory of Public Opinion,” Journal of Communication 24, no. 2 (1974): 43–51, https://doi.org/10.1111/j.1460-2466.1974.tb00367.x; Scheufele, “Spiral of Silence Theory”; and Alexis de Tocqueville, The Old Regime and the French Revolution, trans. Stuart Gilbert (Anchor, 1983), 155. Return to text.
  30. Noelle-Neumann, “Spiral of Silence,” 51; and Elisabeth Noelle-Neumann, The Spiral of Silence: Public Opinion – Our Social Skin, 2nd ed. (The University of Chicago Press, 1993). Return to text.
  31. Jianhui Liu et al., “Understanding the Role of Risk Preferences and Perceptions in Vaccination Decisions and Post-Vaccination Behaviors among U.S. Households,” Scientific Reports 14 (February 2024); and Karaganov, “Russia’s Policy toward World Majority.” Return to text.
  32. Il Hyun Cho and Seo-Hyun Park, “The Rise of China and Varying Sentiments in Southeast Asia toward Great Powers,” Strategic Studies Quarterly 7, no. 2 (Summer 2013); Hung and Hung, “China’s Cognitive Warfare”; Kirk Spitzer, “The South China Sea: From Bad to Worse,” TIME, July 15, 2012, https://nation.time.com/2012/07/15/the-south-china-sea-from-bad-to-worse/; Xi Jinping, “Let the Sense of Community of Common Destiny Take Deep Root in Neighboring Countries,” China’s Diplomacy in the New Era, October 24, 2013, http://en.chinadiplomacy.org.cn/2021-01/27/content_77158881.shtml; Sorin Adam Matei and Matt Ellis, “China’s South China Sea Strategy Is All about Taiwan,” The National Interest, September 17, 2021, https://nationalinterest.org/blog/buzz/chinas-south-china-sea-strategy-all-about-taiwan-193943; and Sharon Seah et al., The State of Southeast Asia 2024 Survey Report (ASEAN Studies Center, 2024). Return to text.
  33. Seah et al., Southeast Asia 2024. Return to text.
  34. HQDA, Information, ADP 3-13 (HQDA, July 2019), 5-9, https://armypubs.army.mil/epubs/DR_pubs/DR_a/ARN39736-ADP_3-13-000-WEB-1.pdf; JCS, Information in Joint Forces, JP 3-04 (JCS, 2022), IV-26; HQDA, Information; JCS, Information Support Operations; and JCS, Information in Joint Forces. Return to text.
  35. “Krach Institute for Tech Diplomacy at Purdue Announces Initiative to Establish the Global Trusted Tech (xGTT) Standard,” Global Tech Security Commission, May 12, 2024, https://techdiplomacy.org/globaltechsecuritycommission/news/krach-institute-for-tech-diplomacy-at-purdue-announces-initiative-to-establish-the-global-trusted-tech-xgtt-standard/. Return to text.