This article was originally published by E-International Relations on 22 May 2017.
These days, the pulse of the world’s political health is running fast. The general prognosis is terminal: the end of the international order as we know it. But determining what order we are on the verge of losing could do with more diagnosis, including tracking the symptoms of the disorder (and order) back to their beginnings. One of the useful roles that historians can play in this regard is to offer a longer view of what we have lost, or at least of the international order that seems to be disappearing from view. So bear with me as I offer a “Cook’s tour” of two centuries in search of the point where the end possibly began, in order to better understand the history of the aims, or “ends,” of international order itself.
European historians have long assumed that the early nineteenth century made “international” politics possible: In 1814, after decades of continental wars against French hegemony, a coalition led by Russia and including Sweden, Prussia, Austria, and Britain (as well as some smaller, now non-existent sovereignties) emerged victorious and established what became known as the “Congress system.” At its most basic, this comprised negotiation through discussion, famously identified with the Congress of Vienna, and transnational cooperation in the interests of permanent peace. In the years that followed, ambassadorial conferences in London, and occasional conferences in the smaller towns of the European continent, became a method for managing territorial and ideological flashpoints. Within a few years, the British Foreign Secretary Lord Castlereagh confidently reported to his Prime Minister the practical value of this transformation of European politics:
how much solid good grows out of these Reunions, which sound so terrible at a distance. It realy [sic] appears to me to be a new discovery in the Science of European Government at once extinguishing the Cobwebs, with which Diplomacy obscures the Horizon – bringing the Whole bearing of the system into its true light, and giving to the Counsels of the great Powers the Efficiency and almost the simplicity of a Single State.
This article was originally published by E-International Relations on 31 October 2016.
The so-called history problem has long been seen by academics and pundits as a key obstacle to the improvement of bilateral relations between China and Japan. In the academic literature, the problem is typically described as consisting of a number of sub-issues related primarily to Japan’s attitude towards its invasion of China in the 1930s and 1940s, an attitude that many regard as insufficiently repentant. In this literature, the meaning of the history problem tends to be treated as fixed rather than as something that changes over time. Even though numerous discussions of the problem exist, and many observers agree on its importance for Sino-Japanese relations, the question of how the history problem itself is understood within Japan and China has received surprisingly scant attention. This article, by contrast, argues that while the specific sub-issues viewed as part of the problem are indeed important, the most fundamental and overlooked aspect of the history problem in Sino-Japanese relations today is the lack of agreement on what exactly the problem is.
This article was originally published by the Small Wars Journal in September 2016.
Thanks to a sequence of fortunate accidents around 2005–2006, the world rediscovered the intellectual legacy of David Galula (1919–1967). Since then, two books and one monograph have reconstructed the story of his life, or large segments of it. Although some go into a fascinating level of detail, none of them, in my view, is an easy read for a non-military audience: a basic grounding in war studies is needed to grasp what they have to say. Moreover, French-speaking readers have still barely heard of Galula. These are the two reasons why I decided to tell the story of Galula’s life – in French.
Writing a book about David Galula amounts to recounting a paradox (many of them, actually). On the one hand, there is a consensus that he is the founding father of counterinsurgency, a groundbreaking theory in modern military affairs. Galula was a self-made man in many respects: born into a relatively modest environment, he rose to positions no one expected him to reach, given his faith and social background. He traveled the world and exercised the full scope of his talents in a varied career ranging from diplomat, author, and secret agent to infantry officer. Many influential people liked him, and very few voiced any opposition or hostility toward him. Yet Galula’s legacy fell silent after his premature death in 1967. For forty years, neglect and bad luck buried his unorthodox and stimulating contributions to the art of war. Neither book royalties nor a military pension were enough to keep his widow from having to find a job to make a living and raise their only child. Galula remains mostly unheard of in his home country, France.
This article was originally published by Global Policy on 4 May 2016.
Why do certain ideas and political paradigms endure while others become obsolete or are rejected?
This question has preoccupied political and philosophical scholarship for millennia. This article puts forward four conditions for the survivability of ideas. It argues that modern tools for understanding human nature, such as those offered by neuroscience, provide us with unprecedented insights into human predilections and needs. Based on these findings, we can better conceptualize why some ideas thrive while others do not, and what this implies for international relations. The human need for dignity is central to this explanation: no idea can thrive if it does not guarantee and safeguard human dignity.
In 1859, Darwin introduced the concept of natural selection in On the Origin of Species, and J.S. Mill explored the flourishing of ideas in On Liberty. In Darwinian natural selection, features that do not contribute to the function of the individual vanish over the course of generations, as bearers of such traits lack the reproductive fitness to pass those features on to their offspring. Mill applied a similar argument to ideas: good ideas would survive the rigors of critical debate, but there were no means of discovering which ideas would endure apart from testing them. In my attempt to continue this debate, I turn to neuroscience. Advances in neuroscience and brain-imaging inform us about underlying predilections in our nature, which indicate that we will be more likely to choose and validate certain ideas over others. My task here is to unpack this premise and to do so by looking at four prerequisites for the selection of ideas.
This article was originally published by YaleGlobal Online on 21 June 2016.
Led by Russia and the United States, the world has reduced the global nuclear stockpile from some 60,000 weapons to about 16,000, held by nine nations. The total still poses a grave global threat. Any nuclear attack or accident would kill many and devastate an entire region, which in turn would revive demands for abolition, explains Bennett Ramberg, author and a former policy analyst in the US Bureau of Politico-Military Affairs during the George H.W. Bush administration. No country has used the bomb since World War II, he notes, and “A presumption emerged that a nuclear-use taboo overwhelms any inclination toward nuclear use.” The potential for nuclear catastrophe runs high in an era of terrorism and of chaos emerging out of failed states, but prevention is possible, too. Global agreement is required, notes Ramberg, and he points to the 1946 Baruch Plan as a foundation. The plan called for an international authority to manage atomic energy and an end to the manufacturing of nuclear weapons.
Seventy years ago this month the United States placed on the global agenda a proposal that would have eliminated nuclear weapons for all time. Drawing on the US State Department’s Acheson-Lilienthal scientific advisory study, the Truman administration turned to the long-time confidant of presidents, Bernard Baruch, to craft a proposal for global action.
In June 1946, Baruch appeared before the newly constituted UN Atomic Energy Commission to present the nuclear abolition plan that would come to bear his name. He called for the establishment of an International Atomic Development Authority that would retain “managerial control or ownership of all atomic energy potentially dangerous to world security,” eliminate weapons manufacturing, and dispose of all existing bombs, while asserting the “power to control, inspect, license all other atomic activities,” coupled with assured enforcement. Had Cold War politics not intervened – Stalin pressed his scientists to build a competitive Soviet bomb as rapidly as possible – the nuclear sword of Damocles that has hung over the world ever since might have been avoided.