Old Propaganda Tricks Get a New Digital Upgrade
A look into the past may soon seem like a look into the future, according to Illinois Tech researchers who recently explored how psychological operations have evolved and how artificial intelligence is changing them.
“Weaponising the Mind: AI, Cyberspace, and the Future of Psychological Operations,” written by Illinois Tech Associate Professor of Information Technology and Management Maurice Dawson and students Cedric Nartey (ITM/M.A.S. CYF 4th Year) and Abdul Hadi Khan (M.A.S. CYF 2nd Year), was published in the Journal of Military Studies.
The paper traces the evolution of psychological warfare from World War II to present-day disinformation campaigns, and it highlights how adversaries can exploit democratic transparency to undermine public trust. The research team also explores how AI has added to the psychological warfare toolkit and changed how psychological operations work across physical, digital, and emotional platforms.
“I’m especially interested in cyberpsychology and cyber warfare and how influence operations evolve alongside communication tools,” Nartey says. “AI is part of that evolution, but the core issue is psychological manipulation at scale. That broader intersection of human behavior, security, and strategy is something I hope to explore further.”
The team examined how propaganda has morphed from posters, state-run broadcasts, and educational programs into modern techniques such as social media campaigns and co-opted news outlets, which make it more difficult for targets to recognize the content as propaganda. The team also reviewed how AI tools such as deepfakes and AI-generated memes are becoming more sophisticated and more difficult to detect.
“The biggest challenge was avoiding technological determinism, or the idea that AI alone is reshaping everything,” Nartey says. “Influence operations have existed for centuries. I addressed this by grounding my perspective in historical case studies and then analyzing how modern tools modify, rather than reinvent, those strategies. It helped keep the analysis measured instead of dramatic.”
The team also argues that democracies are often at a disadvantage compared with more authoritarian regimes, as democracies tend to be constrained by legal norms and civic values, such as free speech. As a result, the team argues that the battle for influence in cyberspace may be won by those with the most data and the best algorithms rather than by those with the truth.
Navigating this terrain requires technical innovation as well as a renewed commitment to defending cognitive sovereignty in the age of algorithmic persuasion.
“Whether it’s AI-assisted or manually crafted, psy-ops succeed by exploiting emotion, bias, and information overload,” Nartey says. “Effective defense requires strong critical thinking, institutional transparency, technical monitoring, and public resilience. It may seem like a new beast, but it’s the same old tricks.”
Image: (From left to right) Cedric Nartey, Abdul Hadi Khan, and Maurice Dawson