International warfare has evolved significantly alongside technology, and it occupies a unique place today because so much of it is waged by proxy or through affiliated countries. In this regard, cognitive warfare has become a keystone of the modern military, and COSMOS director Dr. Nitin Agarwal’s research on mitigating and countering it, which he presented in November 2023, was recently published by NATO.
Cognitive warfare is a primary concern of NATO today because it is being transformed by AI. Yet cognitive warfare has existed for as long as technology and war have intersected. NATO’s Science & Technology Organization Human Factors and Medicine exploratory team 356 (STO-TR-HFM-ET-356), titled “Mitigating and Responding to Cognitive Warfare,” used the example of the famed fighter pilot and military strategist John Boyd. In his study of military history and from his experience as a pilot, Boyd observed that successful war tactics went beyond simple strength and force: tactics like deception, speed, and fluidity of action could turn a conflict just as decisively. In his own words, these were the tactics that caused an enemy to “unravel before the fight.” Success in conflict stems from the ability to do each of these things, and research must keep pace with how cognitive warfare develops alongside new technology.
Such warfare has progressed significantly in our technological age, now that the Internet and social media connect the peoples of different countries while also opening those countries to cyber attacks. NATO has recognized for several years the importance of research on this form of cognitive warfare and has accordingly sought to define it. In the ET-356 report, cognitive warfare is warfare that “gives rise to the adversaries’ ability to shape human cognition, perception, sensemaking, situational awareness, and decision making” and that “aims at disrupting relationships and targets human vulnerabilities, such as trust and cognitive bias.” In particular, the report identifies the developing technologies of Artificial Intelligence (AI), Machine Learning (ML), and Information Communication Technologies (ICT) as key to the cognitive warfare of the new era.
Dr. Agarwal’s NATO-published research, titled “Developing Socio-Computational Approaches to Mitigate Socio-Cognitive Security Threats in a Multi-Platform Multimedia-Rich Information Environment,” discusses the growing influence of social media weaponization on military operations and the socio-cognitive security threats that follow. The research was published alongside the other work presented in November 2023 at the NATO Science & Technology Organization Symposium on Mitigating and Responding to Cognitive Warfare (STO-MP-HFM-361), held in Madrid, Spain.
In his research, Dr. Agarwal acknowledges the rapid evolution of information technology and the corresponding increase in sophisticated online influence operations. These operations exploit multimedia-rich online information environments (OIEs) like social media platforms to manipulate public perception, spread misinformation, and undermine social trust—necessitating advanced socio-computational frameworks that can keep pace with these evolving threats. Traditional methods of addressing misinformation and online influence, which often rely heavily on text-based analysis, are no longer sufficient. Dr. Agarwal argues for a comprehensive, multi-modal approach that includes the analysis of video, audio, real-time broadcasting, and interactive chat environments to effectively monitor, detect, and counteract malicious influence operations. Understanding the dynamics of information actors—including both producers and consumers of information, and the processes of information campaigns—is crucial for developing effective countermeasures against cognitive threats.
Additionally, the research examines the limitations of existing frameworks and the need for new methodologies that can address the unique challenges posed by multimedia platforms. Many existing frameworks are primarily text-based and do not adequately account for the dynamic nature of multimedia platforms; they often overlook the complex interactions between different types of media and the real-time character of many online influence operations. Incorporating computational and analytical tools capable of handling the complexity and richness of data from various media types is, the research suggests, essential for enhancing the effectiveness of countermeasures.
He presses for a more comprehensive approach that includes the characterization of multimedia content types such as video, audio, and real-time broadcasts, emphasizing the importance of considering the dynamics of information actors and the processes of information campaigns. His new framework includes a taxonomy of content types and analysis types that are essential for monitoring, tracking, detecting, and mitigating emerging threats in the information environment.
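To make the idea of such a taxonomy concrete, the sketch below shows one way a monitoring system might encode content types and the analyses applied to each. The specific categories, names, and mappings here are illustrative assumptions for exposition, not the taxonomy published in the paper.

```python
from enum import Enum, auto

class ContentType(Enum):
    """Illustrative content types found in multimedia-rich OIEs."""
    TEXT = auto()
    IMAGE = auto()
    VIDEO = auto()
    AUDIO = auto()
    LIVE_BROADCAST = auto()
    INTERACTIVE_CHAT = auto()

class AnalysisType(Enum):
    """Illustrative analysis stages for emerging-threat workflows."""
    MONITORING = auto()
    TRACKING = auto()
    DETECTION = auto()
    MITIGATION = auto()

# Hypothetical mapping: which analyses a pipeline might run per content type.
PIPELINE = {
    ContentType.TEXT: [AnalysisType.MONITORING, AnalysisType.DETECTION],
    ContentType.VIDEO: [AnalysisType.MONITORING, AnalysisType.TRACKING,
                        AnalysisType.DETECTION],
    ContentType.LIVE_BROADCAST: [AnalysisType.MONITORING, AnalysisType.TRACKING],
    ContentType.INTERACTIVE_CHAT: [AnalysisType.MONITORING, AnalysisType.MITIGATION],
}

if __name__ == "__main__":
    for content, analyses in PIPELINE.items():
        print(content.name, "->", [a.name for a in analyses])
```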
Dr. Agarwal continues by introducing a collective action (CA) theory–based framework to characterize coordinated cognitive attacks. He argues that disinformation campaigns are increasingly a collective phenomenon. To address this, he has developed a CA framework that combines several theories, including social identity theory, deindividuation theory, information manipulation theory, motivated reasoning theory, resource mobilization theory, and social movement spillover theory, among others.
The CA framework also integrates the Deviant Cyber Flash Mob (DCFM) model and focal-structure analysis (FSA). The DCFM model measures a mob’s utility, interest, control, and power, while the FSA identifies key units within social networks that are most powerful in mobilizing and coordinating cognitive attacks. This combination allows analysts to identify and target adversarial campaigns effectively.
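As a rough illustration of how a DCFM-style assessment might be operationalized, the sketch below combines the four measured dimensions into a single triage score. The weights, scales, and aggregation are hypothetical placeholders, not the model’s published formulation; in practice each dimension would be estimated from observed network and content data.

```python
from dataclasses import dataclass

@dataclass
class MobAssessment:
    """Illustrative DCFM-style dimensions, each normalized to [0, 1]."""
    utility: float   # expected payoff to participants of acting together
    interest: float  # topical engagement and attention around the mob
    control: float   # degree of central coordination over participants
    power: float     # capacity to reach and mobilize a wider audience

def mob_score(m: MobAssessment,
              weights=(0.25, 0.25, 0.25, 0.25)) -> float:
    """Weighted aggregate of the four dimensions (weights are hypothetical)."""
    dims = (m.utility, m.interest, m.control, m.power)
    return sum(w * d for w, d in zip(weights, dims))

# Example: a hypothetical coordinated campaign scored for triage.
campaign = MobAssessment(utility=0.7, interest=0.9, control=0.6, power=0.8)
print(f"DCFM-style score: {mob_score(campaign):.2f}")  # -> 0.75
```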
He explains how the CA framework examines the processes of collective identity formation, mobilization, and network organization, and identifies the factors that influence these processes, grouping them into cognitive, relational, and environmental categories. By understanding these underlying factors, the framework provides a richer understanding of the dynamics of collective action in multimedia OIEs. He also emphasizes the importance of social network analysis in understanding the connections and ties between participants in collective actions. Social media platforms offer various affordances, such as posting, resharing, replying, and mentioning, which lead to different types of interdependencies among individuals. The CA framework helps identify key individuals who have the power to mobilize crowds and lead collective actions.
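The kind of interdependency structure described above can be sketched computationally. The toy example below is my own illustration rather than the framework’s actual implementation: it builds a directed interaction graph from hypothetical reshare, reply, and mention events and ranks accounts by weighted PageRank as a crude proxy for mobilizing power.

```python
import networkx as nx

# Hypothetical interaction events: (actor, target, affordance).
events = [
    ("alice", "carol", "reshare"), ("bob", "carol", "reshare"),
    ("dave", "carol", "mention"), ("erin", "carol", "reply"),
    ("carol", "frank", "reply"),  ("alice", "bob", "mention"),
]

# Weight each affordance differently; these weights are illustrative only.
AFFORDANCE_WEIGHT = {"reshare": 3.0, "mention": 2.0, "reply": 1.0}

G = nx.DiGraph()
for actor, target, affordance in events:
    w = AFFORDANCE_WEIGHT[affordance]
    if G.has_edge(actor, target):
        G[actor][target]["weight"] += w
    else:
        G.add_edge(actor, target, weight=w)

# Rank accounts by weighted PageRank as a rough proxy for mobilizing power.
ranking = nx.pagerank(G, weight="weight")
for account, score in sorted(ranking.items(), key=lambda kv: -kv[1]):
    print(f"{account:6s} {score:.3f}")
```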
Dr. Agarwal’s research also focuses on population characterization and impact assessment. He discusses approaches for characterizing populations as susceptible, exposed, infected, or skeptic (SEIZ) to cognitive threats like toxicity or dis/misinformation, allowing decision-makers to target the right groups for effective mitigation of cognitive attacks. These approaches leverage the Diffusion of Innovation theory and epidemiological theories to understand how information spreads and influences different segments of the population. The SEIZ model specifically characterizes information actors and consumers, showing how toxicity spreads like a contagion, polarizes communities, and leads to a breakdown of discourse. This model provides indicators of cognitive attacks and ways to measure their impact. By understanding the dynamics of information spread and the susceptibility of different population segments, decision-makers can develop targeted interventions to mitigate the effects of cognitive attacks.
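For intuition, the sketch below runs a simple compartmental simulation in the spirit of the SEIZ characterization, tracking susceptible, exposed, infected, and skeptic counts over time. The transition equations and parameter values are illustrative assumptions for exposition, not the calibrated model from the research.

```python
def seiz_step(S, E, I, Z, N, beta, b, p, l, rho, eps, dt):
    """One explicit-Euler step of an SEIZ-style compartmental model.

    beta: S-I contact rate, b: S-Z contact rate, p: prob. a contacted
    susceptible adopts immediately, l: prob. a susceptible turns skeptic,
    rho: E-I contact rate, eps: rate at which exposed users adopt on their own.
    All functional forms and values here are illustrative placeholders.
    """
    si, sz, ei = S * I / N, S * Z / N, E * I / N
    dS = -beta * si - b * sz
    dE = (1 - p) * beta * si + (1 - l) * b * sz - rho * ei - eps * E
    dI = p * beta * si + rho * ei + eps * E
    dZ = l * b * sz
    return S + dS * dt, E + dE * dt, I + dI * dt, Z + dZ * dt

# Hypothetical population: mostly susceptible, a few spreaders and skeptics.
N = 10_000.0
S, E, I, Z = 9_800.0, 0.0, 100.0, 100.0
params = dict(beta=0.3, b=0.1, p=0.2, l=0.4, rho=0.2, eps=0.05)

for day in range(61):
    if day % 10 == 0:
        print(f"day {day:2d}: S={S:7.0f} E={E:7.0f} I={I:7.0f} Z={Z:7.0f}")
    S, E, I, Z = seiz_step(S, E, I, Z, N, dt=1.0, **params)
```

A run like this shows the “infected” compartment growing at the expense of the susceptible pool while skeptics dampen the spread, which is the kind of contagion-style indicator the article describes for measuring the impact of a cognitive attack.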
Finally, Dr. Agarwal discusses real-world applications and tools developed from the research. He presents several software tools built by COSMOS, including BlogTracker, VideoTracker, and the COVID-19 Misinformation Tracker, which have been recognized as top solutions in various challenges and competitions. These tools are designed to monitor, track, and mitigate the effects of misinformation and cognitive attacks in real time, and he emphasizes the importance of transitioning research-driven models into usable software that can be deployed in real-world scenarios. The COVID-19 Misinformation Tracker, for example, was deployed in partnership with the Arkansas Office of the Attorney General to educate the public about false claims in misinformation campaigns, and it was recognized by the World Health Organization as a key technological innovation in addressing the COVID-19 pandemic.