Image courtesy of geralt/Pixabay
The weaponisation of hyper-realistic synthetic video, audio, images, or text – generally known as synthetic media – may affect national security. How, and to what extent?
Artificial intelligence (AI) is often portrayed as a single omnipotent force — the computer as God. Often the AI is evil, or at least misguided. According to Hollywood, humans can outwit the computer (“2001: A Space Odyssey”), reason with it (“Wargames”), blow it up (“Star Wars: The Phantom Menace”), or be defeated by it (“Dr. Strangelove”). Sometimes the AI is an automated version of a human, perhaps a human fighter’s faithful companion (the robot R2-D2 in “Star Wars”).
Image courtesy of ev/Unsplash
How is artificial intelligence (AI) affecting conflict and its resolution? Peace practitioners and scholars cannot afford to disregard ongoing developments in AI-based technologies – from both an ethical and a pragmatic perspective. In this blog, I explore AI as an evolving field of information management technologies that is changing both the nature of armed conflict and the way we can respond to it. AI encompasses the use of computer programmes to analyse large amounts of data (such as online communication and transactions) in order to learn from patterns and predict human behaviour on a massive scale. This is potentially useful for managing corporations and shaping markets, but also for gaining political influence, conducting psychological warfare and controlling populations.
According to some, artificial intelligence (AI) is the new electricity. Like electricity, AI will transform every major industry and open up opportunities that were never before possible. Unlike electricity, however, the ethics surrounding the development and use of AI remain controversial – a significant constraint on AI's full potential.
This graphic outlines the potential benefits and disadvantages of using Artificial Intelligence (AI) in the military field regarding 1) strategic decision-making; 2) training and the organisation of armed forces; and 3) military operations. To find out more, read Niklas Masuhr's recent CSS Analyses in Security Policy on 'AI in Military Enabling Applications'.