The CSS Blog Network

The Moral Code: How To Teach Robots Right and Wrong

“PackBot”, a battlefield robot used by the US military. Image: Sgt. Michael J. MacLeod/Wikimedia


This article was originally published by Foreign Affairs on 12 August 2015.

At the most recent International Joint Conference on Artificial Intelligence, over 1,000 experts and researchers presented an open letter calling for a ban on offensive autonomous weapons. The letter, signed by Tesla’s Elon Musk, Apple co-founder Steve Wozniak, Google DeepMind CEO Demis Hassabis, and Professor Stephen Hawking, among others, warned of a “military artificial intelligence arms race.” Regardless of whether these campaigns to ban offensive autonomous weapons are successful, though, robotic technology will be increasingly widespread in many areas of military and economic life.

Over the years, robots have become smarter and more autonomous, but so far they still lack an essential feature: the capacity for moral reasoning. This limits their ability to make good decisions in complex situations. For example, a robot is not currently able to distinguish between combatants and noncombatants or to understand that enemies sometimes disguise themselves as civilians. » More

What Are the Ethical Implications of Emerging Tech?

HAL 9000, the intelligent computer in Stanley Kubrick’s 2001: A Space Odyssey. Image: OpenClips/Pixabay

This article was originally published by Agenda, a blog operated by the World Economic Forum, on 4 March 2015.

In the past four decades, technology has fundamentally altered our lives: the way we work, how we communicate, and how we fight wars. These technologies have not been without controversy, and many have sparked intense debates that are often polarized or embroiled in scientific ambiguity or dishonest demagoguery.

The debate on stem cells and embryo research, for example, has become a hot-button political issue, involving scientists, policy-makers, politicians and religious groups. Similarly, the discussions on genetically modified organisms (GMOs) have mobilized civil society, scientists and policy-makers in a wide debate on ethics and safety. Developments in genome-editing technologies are just one example of how biological research and its commercial applications depend strongly on social acceptance and cannot escape public debate over regulation and ethics. Moreover, demands for transparency are increasingly central to these debates, as shown by movements like Right to Know, which has repeatedly called for the labelling of GMOs on food products. » More

Drones: From Technology to Policy, Security to Ethics

“Drones: From Technology to Policy, Security to Ethics”. Poster for the conference organized by the ISN and ETH Global. Image: ISN

Rapid technological advances are making drones cheaper, more accessible and highly adaptable. Once the exclusive preserve of the world’s most advanced armed forces, unmanned platforms are now being used by civilian actors for a wide range of applications. Yet, while members of the technical community have tended to emphasize the opportunities that this technology offers, their counterparts in international relations and other fields have increasingly raised questions about the legal, ethical, humanitarian and security implications of unmanned aerial systems (UAS). Against this backdrop, ETH Global and the ISN recently hosted a one-day conference that brought together over 160 experts from the fields of robotics, environmental science, law and ethics, and international relations and security. Since ETH Zurich is considered one of the world’s leading ‘competence centers’ in the field of robotics systems and control, its activities offer a glimpse into emerging UAS technologies and their potential social impact. » More

Civilian Drones: Fixing an Image Problem?

Image: flickr/XRay40000

Drones were among the most popular Christmas gifts in 2014 — so popular, in fact, that British authorities warned recreational drone users to use their toys lawfully or expect hefty fines. Similarly, the US FAA released a video just before the holidays teaching aspiring drone users how to “stay off the naughty list”. More and more people are becoming familiar with drones as the number of ‘hobby droners’ (yes, this is a term) grows. Businesses are discovering drones as well: drones carry mistletoe in restaurants (with questionable results) or give real-estate buyers a better view of their property. Beyond this, hundreds if not thousands of commercial drone users are waiting in the wings for a few last technical details to be worked out (especially sense-and-avoid technology) and for the implementation of legal regulations allowing drones to share airspace with manned aircraft. » More

Lethal Robots and the Conduct of Warfare

The military robot “Atlas”, developed by Boston Dynamics for DARPA. Image: DARPA/Wikimedia

This article was originally published by the ASPI Strategist on 5 December 2014.

The use of lethal robots in conflict is inevitable. When it happens, it will significantly shift the way warfare is conducted. A discussion has already begun (see here and here) on how such capabilities might be developed and applied.

Robots in general are becoming smaller, smarter, cheaper and more ubiquitous. Lethal robots are becoming more deadly and more discriminating. The degree of autonomy will be a key driver of a robot’s role in conflict and is likely to evolve through three generations: the semi-autonomous, the restricted-autonomous and, ultimately, the fully autonomous generation. » More
