The Emergence of Generative AI May Trigger a New Global Race for Military and Disinformation Dominance


Recent developments in artificial intelligence (AI) have caught the attention of governments worldwide. They are rushing to embrace the algorithms behind systems like ChatGPT, largely because of the significant economic benefits they anticipate. However, concerning reports indicate that nations are not only adopting the technology for legitimate purposes but are also quickly adapting it into powerful tools for disinformation campaigns, a trend that could trigger an alarming AI arms race among global superpowers.

The RAND Corporation, a non-profit think tank that provides advice to the United States government, has detected instances of a Chinese military researcher discussing the potential use of generative AI for information campaigns. While there is currently no concrete evidence of these strategies being put into action, experts are growing increasingly concerned about the scale and potency that generative AI could bring to influence operations. For example, generative AI could be used to create millions of fake social media accounts, seemingly originating from various countries, to disseminate a state’s narrative. Such a capability would represent a significant qualitative and quantitative shift from traditional methods.

Historically, online disinformation campaigns, like those conducted by Russia’s Internet Research Agency during the 2016 US election, relied heavily on manual labor, with human operators working tirelessly behind keyboards. However, the recent advancements in AI algorithms open up the possibility of automating the creation of deceptive content, including text, images, and videos. Moreover, these AI-driven entities could engage in convincing interactions on social media platforms. Shockingly, one project estimated that launching such a campaign might cost just a few hundred dollars.

The RAND report underscores that many nations, including the United States, are likely exploring the potential use of generative AI for their own information campaigns. The widespread availability of generative AI tools, including open-source language models that can be accessed and modified by anyone, has significantly reduced the barriers to entry for those seeking to launch information warfare. Consequently, the report warns that a diverse range of actors, including technically adept non-state entities, could leverage generative AI to manipulate social media platforms and disseminate disinformation.

A separate report, issued by the Special Competitive Studies Project (SCSP), a tech-focused think tank, highlights the geopolitical implications of generative AI. It recommends that the US government make substantial investments in this technology because of its potential to revolutionize multiple industries and offer newfound military capabilities, economic prosperity, and cultural influence to the nation that pioneers its use.

Both reports sound an alarm, suggesting that the vast potential of generative AI could set off an arms race as nations seek to harness the technology for military and cyber warfare purposes. If these reports are accurate, the world may be on the brink of an information-space arms race, which could prove exceptionally challenging to control.

Preventing the nightmare scenario of an internet inundated with AI-driven bots programmed for information warfare will require international cooperation. The SCSP report recommends that the United States take a leading role in global efforts to promote transparency, build trust, and foster cooperation regarding generative AI. The RAND researchers propose that diplomats from the US and China engage in discussions about generative AI and its associated risks. William Marcellino, an AI expert and senior behavioral and social scientist at RAND, emphasizes that preventing a polluted and untrustworthy internet is a shared interest that transcends borders.
