Image courtesy of Army Medicine/Flickr. (CC BY 2.0)
This article was originally published by the Stockholm International Peace Research Institute (SIPRI) on 29 July 2019.
The states parties to the Biological and Toxin Weapons Convention (BWC) gathered in Geneva from 29 July to 8 August for a series of Meetings of Experts. Among other topics, states reviewed scientific and technological developments that affect the objectives of the treaty. Additive manufacturing (AM)—also referred to as 3D printing—is one of the technologies starting to receive attention, alongside better-known biotechnologies and genetic engineering techniques. Advances in AM have been met with concerns over their potential to facilitate the development, production, delivery and thus proliferation of biological weapons—and have highlighted the potential role of export controls in reducing these risks.
This article was originally published by War on the Rocks on 4 June 2019.
Has global strategic competition become a race for dominance in artificial intelligence (AI) between the United States and China? Versions of this claim have become something of an axiom, offered by officialdom and the analytical community alike. Yet the claim that AI will be the primary axis of future strategic competition is contestable, and the very notion of an AI race generates policy risk of its own. Making policy on these assumptions could narrow options, not only in competition between states but in human affairs more generally.
This graphic provides an overview of the nations in which major cyber theft incidents were initiated as well as the countries affected by these attacks between 2000 and 2018. To find out what this highlights about the eclipse of Western military-technological superiority, read Michael Hass’ chapter for Strategic Trends 2019 here.
Image courtesy of Moraima Johnston/DVIDS
This article was originally published by the Center for International Maritime Security (CIMSEC) on 6 May 2019.
The U.S. Navy faces a future where large portions of its fleet will be composed of non-traditional assets. Specifically, unmanned systems comprise a significant portion of the Chief of Naval Operations' (CNO) "key platforms and payloads" which the Navy seeks to acquire.1 That direction from the top is further borne out in the Navy's most recent shipbuilding plan, which includes 10 large unmanned surface vessels and 191 unmanned undersea vehicles of various sizes. These numbers contrast with the total of 55 "battle force ships" planned to be built over the same period.2 Tonnage obviously also plays a role in this type of comparison, but by sheer numbers the Navy is moving toward unmanned vice manned platforms. The Navy must think past the engineering hurdles and determine how to effectively employ these new assets. To do so, we propose that the Navy revisit history and revitalize the complex learning system it used to exploit an earlier set of new capabilities prior to World War II. Specifically, we call for the Navy to accelerate standing up a dedicated experimental squadron with the purpose of exploring advanced tactics for employing unmanned systems in a series of tactically challenging, objective-based exercises.
Image courtesy of Geralt/Pixabay
This article was originally published by the ETH Zukunftsblog on 24 May 2019.
The growing politicisation of AI harbours risks. Sophie-Charlotte Fischer and Andreas Wenger propose a hub for AI research in Switzerland committed to the responsible development of the new technologies.
The surge of progress in Artificial Intelligence (AI) over the last few years has been driven primarily by economic market forces and AI's manifold commercial applications. Large global technology companies, particularly in the US and China, lead the field. Yet this concentration of AI resources in a few private corporations is increasingly undercutting the competitiveness of public research institutions and smaller companies. Such oligopolistic market dynamics threaten to exacerbate existing economic and social inequalities.