Extremist groups are now using artificial intelligence to create convincing voice clones, helping them spread propaganda more effectively across social media platforms. Experts warn the development significantly expands their reach and influence.
Key Takeaways
- Extremist groups use AI voice cloning to mimic prominent figures.
- AI tools enable seamless, contextually accurate translation of extremist content.
- Neo-Nazis have used AI to create English versions of historical speeches, gaining millions of views.
- Pro-Islamic State media leverages AI for text-to-speech renditions and multilingual content.
- Counterterrorism efforts face new challenges in keeping pace with AI-driven propaganda.
AI Transforms Propaganda Landscape
The rise of artificial intelligence has brought changes to many industries. It also provides new opportunities for extremist movements. These groups now employ AI voice-generating bots to replicate the voices and speeches of key figures within their ideologies. This practice is significantly boosting their propaganda efforts.
Lucas Webber, a senior threat intelligence analyst at Tech Against Terrorism and a research fellow at the Soufan Center, highlights this shift. He monitors the online tools used by terrorist and extremist groups globally. Webber notes that AI-enabled translation marks a major evolution in digital propaganda strategies.
Previously, these groups relied on human translators or basic machine translation. These methods often lacked fidelity and nuanced style. With advanced generative AI tools, groups can now produce seamless, accurate translations. These translations preserve tone, emotion, and ideological intensity across multiple languages.
Understanding AI Voice Cloning
AI voice cloning involves using artificial intelligence to generate speech that mimics a specific human voice. This technology learns from existing audio samples of a person's voice. It then creates new audio in that same voice, even for text the person never spoke. The realism of these clones has improved rapidly.
Neo-Nazis Leverage AI for Historical Figures
The neo-Nazi far right has quickly adopted AI voice cloning software and uses it extensively to spread its messages. Several English-language versions of Adolf Hitler’s speeches have appeared online; these AI-generated audio clips have garnered tens of millions of streams across platforms including X, Instagram, and TikTok.
According to research from the Global Network on Extremism and Technology (GNET), extremist content creators are using voice cloning services such as ElevenLabs. They feed these services archival speeches from the Third Reich era, and the AI then processes this data to mimic Hitler's voice in English.
"The adoption of AI-enabled translation by terrorists and extremists marks a significant evolution in digital propaganda strategies."
— Lucas Webber, Senior Threat Intelligence Analyst
Spreading Hyper-Violent Messaging
Neo-Nazi accelerationists, who advocate for terrorism against Western governments to trigger societal collapse, also use these tools. They spread updated versions of their hyper-violent ideology. For instance, 'Siege,' an insurgency manual by American neo-Nazi James Mason, was transformed into an audiobook in late November.
A prominent neo-Nazi influencer, active on X and Telegram, confirmed involvement in creating the 'Siege' audiobook. The influencer said they used AI tools and a custom voice model of Mason, which allowed them to recreate every newsletter and most of the attached newspaper clippings from the original publications.
'Siege' Audiobook Impact
- The 'Siege' manual gained cult-like status among some online extreme right groups.
- It promotes lone actor violence.
- Several neo-Nazi groups endorsing terrorism consider it required reading.
The influencer praised the ability to take Mason’s writing from "pre-internet America" and give it a modern voice. They noted the "startling accuracy of predictions made through the early eighties." This, they claimed, fundamentally changed their view of their shared cause.
At its peak in 2020, the Base, a neo-Nazi organization, held a book club focused on 'Siege.' This manual heavily influenced several members. They discussed its perceived benefits in a hypothetical war against the US government. A nationwide FBI counterterrorism investigation later led to the arrest of over a dozen members on terrorism-related charges.
Joshua Fisher-Birch, a terrorism analyst at the Counter Extremism Project, explained the significance of the 'Siege' audiobook. He noted that while the creator had released similar AI content, 'Siege' carries a more notorious history due to its promotion of violence and its influence on groups endorsing terrorism.
Islamic State Uses AI for Global Reach
Jihadist terrorist groups also find AI useful, particularly for translations. Pro-Islamic State media outlets on encrypted networks are actively using AI to create text-to-speech renditions of ideological content from official publications, transforming text-based propaganda into engaging multimedia narratives and supercharging the spread of their message.
AI helps these groups translate extremist teachings from Arabic into easily digestible, multilingual content. In the past, figures like the American imam turned al-Qaeda operative Anwar al-Awlaki had to personally record English-language lectures, which served as recruitment propaganda for the English-speaking world. The CIA and FBI have consistently cited al-Awlaki’s voice as a key factor in spreading al-Qaeda’s message.
On Rocket.Chat, a platform favored by the Islamic State for communicating with followers, a user posted a video clip in October featuring slick graphics and Japanese subtitles. The user highlighted how difficult such content would be to create without AI, noting that Japanese would be extremely hard to translate while maintaining eloquence, with exceptions for audio.
Broader AI Adoption by Extremists
Beyond voice cloning, groups across the ideological spectrum are using free AI applications like OpenAI’s ChatGPT to amplify their broader activities. The Base and related groups have used ChatGPT for imagery creation, acknowledging as early as 2023 that these tools streamline planning and research.
Counterterrorism authorities have long viewed the internet and technological advancement as a constant game of catch-up, with terrorist groups quick to exploit new technologies. Extremists have already leveraged emergent tools such as cryptocurrency for anonymous fundraising and online file-sharing to distribute designs for 3D-printed firearms. AI voice cloning represents the latest challenge in this ongoing struggle.