Yemen’s AQAP Dilemma

Counterterrorism Yemen-style, photo: Ammar Abd Rabbo/flickr

Since the al-Qaida in the Arabian Peninsula (AQAP) crisis erupted in Yemen, the country has been thrust into the international spotlight. While numerous think tanks and experts had been warning for years of the critical challenges Yemen faced (the southern secession movement, the Houthi rebellion, AQAP), most governments only really took note after Umar Farouk Abdulmutallab tried to bring down Northwest Airlines flight 253 by igniting explosives hidden in his underwear, in what was dubbed the 2009 “Christmas Day plot”.

The international attention given to Yemen has, not surprisingly, focused on the terrorism threat ever since. In President Ali Abdallah Saleh’s calculations, however, AQAP was long a nuisance rather than a substantial threat to his presidency or the unity of the country. The southern secession movement and the Houthi rebellion in the north were perceived as far more dangerous and potentially consequential, particularly for Saleh himself.

After the attack on the USS Cole in 2000, and again after the Abdulmutallab incident, the US made it abundantly clear that it expects President Saleh to rein in AQAP. Both development aid and military assistance have been closely tied to Yemen’s cooperation in fighting al-Qaida.

US aid flows have varied greatly over the last decade, tracking the threat perception of the moment. In 2000 Yemen received a relatively meager $400,000 in US food aid. In 2001, after the attack on the USS Cole, the US administration deliberated over an aid and loan-forgiveness package of around $400 million. In 2006, when the terrorism threat was thought to have subsided, the US cut aid back to $18.7 million. Since then US aid to Yemen has increased every year, reaching $58.4 million in 2010, a threefold increase in only four years. According to the Congressional Research Service (CRS), the administration has requested a staggering $106.6 million for 2011.

Denying Terrorists Glory

Denying terrorists the glory. photo: Adam Tinworth/flickr

In the wake of 9/11 and subsequent terrorist attacks on European and American soil, governments felt the need to adapt their legislation to what was perceived as a new threat paradigm. Policymakers considered existing criminal laws insufficient to combat terrorism. A contributing factor was that governments needed to be seen to be doing something about the threat, and enacting new laws is one of the things governments are very good at.

A large number of new laws specifically aimed at terrorism-related offenses have since been enacted. Examples include the Prevention of Terrorism Act in the UK, the US PATRIOT Act and various law amendments in Germany.

These special laws have been criticized on many grounds. Civil rights advocates disapprove of the sweeping powers some of them bestow upon the authorities, and the potential for abuse has been highlighted repeatedly. The main thrust of the argument has been that special terrorism laws were unnecessary: killing people was a crime already, regardless of the ideology behind the act.

While the process of radicalization is still poorly understood, adventure-seeking appears to play an important part in the trajectory of many radicalized people. Terrorism, after all, is “exciting” business. That, at least, is how it is depicted in jihadist videos and print publications, with guns, explosives and heroic battles against the West. This does not square with reality. A significant number of uncovered plotters in the US and in Europe were amateurs, completely inept at their trade, who seemed to have little in common with their role models.

An unintended consequence of these terrorism laws is that they confer on would-be terrorists a status they clearly do not deserve. The label “terrorist” signals danger to society, and in some circles it is seen as a badge of honor.

Connections Count

The spectre of homegrown attacks, photo: Josh Gross/flickr

America and Europe have experienced a string of terrorist attacks perpetrated by “homegrown” terrorists. But the term “homegrown” is often conflated with “independent”. There are in fact two types of homegrown terrorists: those with external support and guidance, and those without. In recent years a clear pattern has emerged. Technically sophisticated attacks, such as the 7/7 attacks in London and the airline liquid-explosives plot, have almost without exception been carried out by terrorists who were homegrown but had received substantial training and guidance from terrorist groups outside Europe, usually based in Pakistan. Terrorists who lacked connections to established terror networks had to resort to more primitive methods such as shooting or stabbing.

The importance of hands-on training has been neglected amid the hype surrounding “homegrown” terrorism. It turns out to be more difficult than once believed to teach bomb-making and other essential terrorist skills over the internet. One indication of this is that intelligence agencies still presume there are only a limited number of proficient bomb makers within al-Qaida’s ranks.

The internet does, however, play a role in radicalization processes. In May 2010 British student Roshonara Choudhry stabbed MP Stephen Timms over his support for the Iraq war. When interrogated by the police shortly after the crime, she said that video sermons by the radical preacher Anwar Al-Awlaki, who resides in Yemen, had prompted her to “punish” Timms. She had also consulted an Islamist website that called on Muslims to “raise the knife of Jihad” against MPs who had voted for the Iraq war in 2003. Choudhry was certainly not radicalized solely by watching a couple of videos featuring Al-Awlaki, but it is reasonable to assume that they contributed to her decision to attack Timms.