Orwell on Perpetual War

A fictional strategist’s logic for the continuation of hostilities in Eurasia/Eastasia:

“The war, therefore, if we judge it by the standards of previous wars, is merely an imposture. It is like the battles between certain ruminant animals whose horns are set at such an angle that they are incapable of hurting one another. But though it is unreal it is not meaningless. It eats up the surplus of consumable goods, and it helps to preserve the special mental atmosphere that a hierarchical society needs. War, it will be seen, is now a purely internal affair.”

— George Orwell, Nineteen Eighty-Four

Afghanistan: No Viable Goals and No End in Sight

With confirmation from United States officials earlier this week that an additional 4,000 troops will be sent to buttress the training and advisory mission in Afghanistan, one is forced to consider what to make of the state of affairs in that country. Frankly, it’s time the public started asking the hard questions, especially in light of Defence Industry Minister Christopher Pyne’s pledge that “[Australia] will always consider requests from the United States — our most important ally — for assistance”.

So what long-term national security interests are likely to be achieved by the US and its allies in Afghanistan? Is the task to “defeat the Taliban” an impossible mission guided by a skewed sense of what the military can realistically accomplish? Is the current training mission “a bandaid for a bullet wound”, as one US combat advisor described it? A boulder to be rolled uphill by the military for all eternity, with an ever-so-slightly different campaign plan every four years?

According to Defence Secretary Jim Mattis, one of the chief architects of Donald Trump’s “new” strategy, the plan announced earlier this week draws on lessons learnt by the combat advisory teams who deployed alongside the Iraqi Army in the fight against Islamic State. The main takeaway, apparently, is that embedding Western military advisers with forward units is better than leaving them behind at base.

With a “frontline” emphasis for Trump’s campaign plan, you can see similarities to another “new” campaign plan recently outlined by Senator John McCain, who applauded Trump’s speech as a “big step in the right direction”. In his strategy, McCain argued that a “long-term, open-ended counter-terrorism partnership” with the Afghan government and the deployment of military adviser-trainers with the Afghan National Security and Defence Forces at the kandak (battalion) level instead of the higher corps level was the key to victory. What this means, in effect, is that more troops are being requested to achieve a set of goals that a much larger force could not achieve in 2011.

To the uninitiated, a strategy that splits hairs over minutiae in mission structure instead of having a frank discussion about the mission’s fundamental problems might seem a little beside the point, especially when one considers that violence in Afghanistan derives less from undesirable teacher-student ratios in US-Afghan training camps than it does from complex feuds over tribe and religion.

“There’s always more you can do — more advisers you can send, more capabilities you can develop for the Afghans,” says Dr Mike Martin, a Pashto-speaking former British army officer and research fellow at King’s College London.

“The Afghan government will take the support gladly because they would prefer that foreigners do the fighting for them. If you are an Afghan faction, this is the game: get some foreigners to fight for you.”

Rather than being dragged into the conflict every time a new feud erupts between the Afghan government and its local enemies, Dr Martin argues, what is needed is simply a “minimum viable force” — the smallest possible training and support mission and a small counter-terrorism force — to keep the government afloat. This would prevent both mission creep and everybody’s worst case scenario — the fall of Kabul.

With such calls for minimalism seemingly sidelined in the President’s new strategy, however, the question that arises is what are an extra 4,000 troops going to do that the 100,000 deployed by President Obama in 2011 could not?

One begins to wonder if the emphasis on numbers and mission structure is a distraction from more basic problems looming in the background. Problems such as, say, the possibility that the Afghan National Security and Defence Forces might not be a viable fighting force without a permanent US military presence to buttress it.

The looming likelihood of a permanent war-footing for America in Afghanistan is worthy of consideration, not least because a core theme of Trump’s speech revolved around the idea that “conditions on the ground, not arbitrary timetables, will guide our strategy [from] now on”.

There’s a strong whiff of McMaster and Mattis in this phrasing because it’s indubitably correct that wars do not conform to neat timescales. It’s also true that this rhetoric can be interpreted as an attempt by Trump to distance himself from Mr Obama — a man strongly criticised for announcing his withdrawal timeline and giving the Taliban cause to “wait the US out”.

At the same time, even if Trump is right that conditions, rather than preferred timeframes, should dictate decisions, this does nothing to allay the public’s concern that Afghanistan has become a case study in “endless war”.

But this is what makes the way Western governments formulate Afghanistan policy so frustrating. While a vague set of goals is well known to the public — “disrupting and dismantling the neo-Taliban insurgency” or “denying sanctuary to jihadist groups”, for example — never has a single campaign plan shown signs of permanently achieving any of these goals.

Preferred though they may be, they just don’t seem particularly achievable.

If jihadist ideology cannot be wholly eradicated on the Afghan-Pakistan border, is there a point at which we can call its outreach successfully contained? If “the Taliban” cannot be militarily defeated then at what point should other options be explored?

If Trump is true to his word that “perhaps it will be possible to have a political settlement that includes elements of the Taliban”, then what are the conditions in which this settlement could occur? At what point does the US President seek conflict termination over conflict perpetuation?

Trump needs to outline as clearly as possible by what quantifiable metrics his mission would be deemed a success. At present, we have none.

All in all, too many questions remain unanswered. With no tangible goals, no maximum spends and no body count cut-offs provided in Trump’s strategy-free strategy for Afghanistan, the public cannot but keep guessing how, when or even if Western military involvement in the country will come to an end. And that is exactly the problem.

 

‘Land, kill and leave’: On CIVCAS and HVT

The photographs, the documents, the whistleblower testimony are all there — the brutal details of our diggers’ conduct brought forward into the harsh light of day.

A blow has been dealt to the prestige of Australia’s special forces, with knock-on damage likely to follow for the reputation of the Australian Army as a whole.

At first, it might seem tempting to think of these kinds of events as isolated incidents that do not speak to a more widespread problem within the Army’s special operations community. But misconduct on the battlefield also speaks to a wayward shift in a military force’s broader operating culture.

Along with the Maywand District murders and the Panjwai massacre, what these new allegations levelled against Australian soldiers in Uruzgan will come to symbolise is the ultimate failure of Western militaries to adapt to a fight in which the decisive terrain was the human terrain.

According to our military leaders, the reason for Australia’s presence in Uruzgan province between 2001 and 2014 was to “clear, hold and build” a Taliban-free Afghanistan. Per counterinsurgency doctrine, by providing an enduring sense of physical security to local Afghans, the “hearts and minds” as well as the rifles and trigger-fingers of fighting-aged males in Uruzgan would eventually be won over.

At some point it seems that this strategic guidance either failed or was wholly ignored.

While Special Operations soldiers had earlier played a kind of “guardian angel” role in support of their regular counterparts in the Mentoring and Reconstruction Task Force, as the Afghan war dragged on, that role became increasingly aggressive.

An upsurge in “direct action” operations began to distract from efforts to secure the population. By 2010, much of the task group was solely focused on so-called “high-value targeting” — the coalition’s effort to kill or capture an ever-growing list of local Taliban “commanders”.

As a former Special Operations Task Group member drily put it to me, the new penchant for fly-in fly-out missions conducted out the side of a Black Hawk saw the entire concept of operations switch from “clear, hold and build” to “land, kill and leave”.

Of course, operating in this manner was never going to defeat the Taliban. Insurgencies are complex adaptive systems capable of surviving the deaths of leaders. As David Kilcullen writes in Counterinsurgency: “decapitation has rarely succeeded [and] with good reason — efforts to kill or capture insurgent leaders inject energy into the system by generating grievances and causing disparate groups to coalesce”.

All this considered then, by channelling an apparent “shoot first, never ask questions at all” ethos, there’s a good argument to be made that much of SOTG’s work in the final years of the Afghan War was counter-productive.

In many ways, the sunset years of operations in Afghanistan marked a transitional moment in the Australian way of war — one which saw our special forces transformed into the hyper-conventional juggernaut they have become today.

In other Western forces, the over-emphasis on “conventionalised” operations — that is, heavy-hitting operations which deviate from the subtle and indirect approach of yesteryear — has had similar results on the ground.

The Australian flag sewn onto the arm of a military uniform. Courtesy: ABC News

The New Zealand SAS is currently reeling from allegations that its members carried out “revenge raids” against civilians. US Navy SEAL Teams have now been linked to extra-judicial killings and corpse desecration on the battlefield. In Britain too, the story is much the same. Reports of “rogue” SAS troopers and battlefield executions. Civilian casualties. A Ministry of Defence probe into war crimes allegations.

Incident by incident, this is how the war in Afghanistan was lost.

Despite more than a decade and a half of sustained military effort, today the Taliban and other extremist groups control or contest as much as 40 per cent of the country.

Certainly, where our own efforts are concerned, the data is clear. Australia’s war in Afghanistan was a failure. According to the Institute for the Study of War, districts like Shah Wali Kot (where Corporal Ben Roberts-Smith’s VC-winning charge took place) are now categorised as “high confidence Taliban support zones”.

Elsewhere, the observable metrics on the ground speak for themselves. In 2002, US intelligence estimated the Taliban’s strength at 7,000 fighters. By 2016, that number had increased to 25,000. As this year’s spring fighting season begins, the Taliban still control roughly a quarter of Afghanistan.

More than anything, what these new revelations demonstrate is that somewhere along the way our military, and our special forces in particular, simply lost the ability to effectively counter an insurgency.

Once upon a time, “the best of the best” were trained to operate like “phantoms” — treading lightly and prudently alongside their local partners.

Today, however, the legacy they will leave behind in the minds of Afghans will be a brutal one. The civilian cost of the Special Operations Task Group’s operations in Afghanistan is now apparent for all to see.

Practical Tips for Skimming the Qur’an: Or, How to Study Islam without Rigour

In a world where media outlets are now plucking Middle East experts from the ranks of heavyset non-Arabic-speaking private military contractors (whose interactions with people from “these cultures” are confined to firefights and abusive run-ins at heavily-trafficked intersections), a few colleagues of mine over at the Australian National University have started a great new project called “Re-Anth”. Envisioned as a clearinghouse for popular, prescient scholarship in the social sciences, the general objective of Re-Anth is simple – to reintroduce anthropological thinking into the wider social and political discourse. As such, this will hopefully be the first of many contributions I can make to their new blog.

The first topic I’ve been asked to write about is the concept of “praxis” – one of those great buzzwords you will only ever come across in postgraduate anthropology seminars or in vaguely-meaningful but mostly arcane discussions of Hegel’s contributions to philosophy.

Praxis, in the context in which anthropologists use it, refers to the process by which immaterial concepts and ideas (Aristotle’s theoria) are realised by action – the bridge between what Hannah Arendt saw as the two defining categories of human thought and behaviour – the “vita contemplativa” (the contemplative life) and the “vita activa” (the active life).

While the term itself suffers from a terminal case of jargonitis (in part because praxis is an import word from ancient Greek and in part because praxis is also the German for “practice” which has a separate meaning in English-language anthropology), the spirit of the praxis concept is as follows: there exists a process which connects the things people think about with the things people do and that mapping this contemplation-action algorithm is key to understanding how a member of a particular cultural group is likely to think and behave under a given set of conditions.

There is a huge body of theoretical muck out there to wade through in one’s search for a definition of praxis (from experience, this can actually lead to a reduced understanding of the concept). But since praxis, like anthropological fieldwork itself, is practically-oriented (or indeed, praxically-oriented), a good way to grapple with the concept is to think about a religion like Islam not only as a “practice” (that is, something someone does) but also as a “process” (the contemplative and active steps which lead to the doing). By reflecting on the process by which religious texts like the Qur’an (a body of work that contains various theoria) are interpreted and then incorporated into the daily lives of individuals, for example, one can observe the praxis concept in the field.

To a student of Islamic societies, the process by which the Qur’an is brought into the material world is the textbook example of the praxical process. Similarly, if one looks at a political project like “Marxism” – which Antonio Gramsci called “the philosophy of praxis” in his Prison Notebooks – one can observe an analogous process (that is, a revolutionary strategy) by which a utopian ideal is interpreted and then progressively introduced into society by the Marxist. Both the Marxist revolutionary process and Quranic exegesis-enactment (as a hermeneutic process), therefore, are examples of praxis “in the wild”.

With praxis thus defined, and with the title of this post suggesting that there is something lacking in how “Islam” is dissected and studied in public discourse, it is now incumbent upon us to consider how the praxis concept might improve the way we think about Islam, re-injecting some intellectual rigour into the discussion.

As I’ve discussed previously, the “true meaning” of any text (especially a religious text) is ultimately interpretive. This should be self-evident to anyone who studied “the novel” in high school – especially if one’s English teacher was intent on extruding bizarre, hidden meanings from the most innocuous of sentences. Certainly, the fact that deciphering a text is an interpretive process (a praxis) should be self-evident to anyone who is familiar with the way in which a nation’s laws are interpreted by the courts.

As Barack Obama said of the US Constitution in his final address as President: “it’s really just a piece of parchment. It has no power on its own. We, the people, give it power – with our participation, and the choices we make.”

The “participation and choices” of which Obama speaks are, in this instance, a description of constitutional praxis – the process by which the law is interpreted, reflected upon, incorporated and then lived by “We the People”.

To Islamic scholars, the praxis concept is encapsulated by a process called ijtihad – the mental and physical effort which connects the Muslim vita contemplativa with the vita activa (to revisit Arendt). Ijtihad, therefore, is the process (thus the praxis) by which interpretations of Islamic jurisprudence are developed. It follows, then, that because jurisprudential interpretation is ultimately subjective, sharia (the legal aspect of Islam) cannot be thought of as comprising a single codex. Indeed, much as constitutional opinion amongst American jurists is not, in any way, unitary, “Islamic law” cannot be understood as a monolithic bloc that regulates Muslim behaviour in any single way.

For this reason – that is, because the ijtihad process produces many different interpretations of both sharia and Islam itself – it is uniquely artless to paint a literal “broad church” of thought with such a broad brush stroke. Likewise, and for the very same reason, it is equally artless for one to imply that ISIS’ worldview has “nothing to do with Islam” (a position often labelled “apologist” – often by those who themselves detest the label “Islamophobic”).

Having said that, I suspect that there are very few serious scholars of Islam who would claim that Islamist extremism has “nothing to do with Islam”. As both Shadi Hamid and Reza Aslan (two very popular scholars of Islam) have argued, it’s not that ISIS’ ideology is “not Islamic” per se (because the very nebulous nature of religious praxis means that if one says it is Islamic then it is Islamic) but rather that using ISIS as a case study to inform a generalisation about what it means to be a Muslim is inaccurate and unfair to the majority of Muslims around the world. As such, despite the shrill cries ringing out from the far reaches of the internet that terrorist-sympathising “snowflake apologists” are amassing in their “safe spaces” to measure just how little of nothing terrorism has to do with Islam, I have yet to come across any serious peer-reviewed research that would dispute that members of ISIS self-identify as Muslim. The critique, therefore, is probably a straw-man argument.

In many ways then, the greatest intellectual failure of “the anti-Islam school” (that is, the school formerly known as “Islamophobic”) lies not in its interpretation of Islamic text per se but rather in its refusal to include a discussion of praxis in how Islam is actually lived – that is, the inability to see Islam not merely as a set of practices but also as a process by which the practitioner interprets text and engages with the sacred.

Certainly, it is possible that one could conclude that the Qur’an is intrinsically violent or misogynistic if one selectively read (as ISIS does) verses like 9:5 or 4:34 to the exclusion of contradicting verses like 109:6 and 30:21 (even though, as the anti-Islam school will tell you, later-occurring verses are supposed to abrogate earlier verses). Yes, if you read the Qur’an like that you might find “Islam [monolithic]” guilty of many crimes.

But of course, in order to find Islam guilty of these crimes, one would also have to refute the role of praxis in producing human behaviour – discounting, for example, the possibility that Islam is a living, breathing religion (defined by heterodoxy) or that Muslims are followers of a constantly-evolving faith, a community possessed of a diverse collection of doxa that oscillate from “asymptote to asymptote”. So yes, if one used such a myopic approach – that is, if one employed a textualist, literalist, atomistic, and wholly un-holistic approach to religion as an entire field and object of study, ignoring the fact that interpretation matters or dismissing the empirically-tested finding that diversity of religious opinion exists even in small-scale societies – then you might conclude that Islam is, must be, has to be “bad”, “evil”, “antithetical to Western democracy”… as Tasmanian Senator Jacqui Lambie seems to have concluded.

Naturally, if you use this approach, you’ll probably not find much intellectual backing for your work outside the various think-tanks run by Daniel Pipes or Robert Spencer (or, indeed, Richard Spencer). But then, “left-wing social justice warriors on university campuses”, right?

Ultimately, the bottom line is this: giving credence to the praxis concept is absolutely critical to the study of Islam [not monolithic]. Moreover, if one actually goes out on the streets and talks to Muslims about how they interpret the Qur’an and how that interpretation influences their behaviour (note: this requires interacting with a sample size that is larger than the cellblock of Camp X-Ray or the mullet-wearing Lebanese teenagers hanging out in hotted-up cars down the road), one would probably conclude that diversity of opinion in a religious congregation which comprises more than a fifth of the world’s population might well be infinite; that praxis is really the only thing that counts when crafting generalisations about “Muslims”; and that ultimately, the Qur’an (whether or not it is the word of God) is simply a collection of words recorded on a sheaf of palm-fronds. To borrow again from Obama, the Qur’an exists but it is up to Muslims through their “participation and choices” to interpret it and live it.

It might seem bizarre that a religion which scientifically regulates its phases of worship according to incremental changes in the lunar cycle could have so much diversity of thought. Here though, it’s worth noting that, according to hadith, the notion of ikhtilaf (Arabic: إختلاف), meaning “difference” or “diversity”, was seen as a blessing by Muhammad. Indeed, according to a comprehensive study of the subject by Musawah, ikhtilaf al-fuqaha (“diversity of opinion amongst jurists”) not only existed as far back as the Abbasid Caliphate but was also respected as a necessary part of realising a truer, greater Islam.

A non-Muslim interested in thinking more about praxis might consider his own practices, and the contemplation-action algorithm that led him there. If, for example, one subscribes to the Christian faith and goes to church every Sunday, consider the following passage in Matthew 6:5-6.

“And when you pray, do not be like the hypocrites, for they love to pray standing in the synagogues and on the street corners to be seen by others. Truly I tell you, they have received their reward in full. But when you pray, go into your room, close the door and pray to your Father, who is unseen. Then your Father, who sees what is done in secret, will reward you.”

After reading this passage, would it be fair to say that conducting the commonly-practised Sunday ritual at church is “un-Christian”? The answer, of course, is “no”. Of sole importance here, beyond your self-identification as a “Christian”, is the praxis which underpins the religious choices you have made. In the end, the process of selecting the Sunday ritual and participating in the ritual itself is the only bit that matters.

We are winning but The Horror will continue

The images coming out of Nice are shocking. Bodies crushed beneath the multi-tonne might of a truck. Revellers who just minutes before were celebrating the festivities of France’s Bastille Day mowed down in the street. Corpses everywhere. People fleeing, running for their lives. All of it live-streamed by the ubiquitous smartphone.

This terrorist attack (if that is indeed what this was) did not occur in isolation. In preceding weeks we have witnessed similar scenes of carnage in other great cities of the world – Istanbul, Medina, Dhaka and Baghdad. Terrorism is not new to us. But this attack is particularly frightening for two reasons.

At a visceral level, the mangled bodies on the promenade remind us of the human cost of terrorism in a way which even the vaporised nothingness which follows a suicide bombing can fail to convey. The mashed bodies are the bodies of actual recognisable people. The Horror, in the sense which Conrad meant it, is real.

Secondly, and perhaps most frighteningly, the use of an everyday vehicle as the primary weapon in a terrorist attack shows us that despite our best efforts to catalogue and trace purchases of fertiliser at hardware stores, strictly control the dissemination of firearms, and ban pen-knives on planes, we can never fully contain the threat posed by violent extremists. Preventing access to the means by which this violence is perpetrated is crucial but we should be under no illusions – we will never completely eradicate terrorism.

Reactionary voices will come forward saying that a ban on Muslim immigration is the solution to terrorism and Donald Trump will inevitably tweet, as he has tweeted before, that “I alone can solve” (the problem). But make no mistake – no border, no pogrom, no government-funded de-radicalisation program will ever be able to negate the possibility, however infinitesimal, that a madman will slip into the driver’s seat of a legally-purchased, road-worthy truck and run down dozens of innocent people in the street.

The perpetrators of these attacks plan and execute them with specific objectives, that is, a “desired end-state”, in mind. The political function of a terrorist attack is to incite fear in a population and, if the scenes of chaos in Nice are anything to go by, IS has achieved this end-state. “Nous sommes terrifiés,” tweeted the Mayor of Nice, begging the Niçois to remain indoors. The city is in a state of panic. At a global level, the terrorists are celebrating further because we, like the Niçois, are afraid as well – afraid that we will be the victims of the next terrorist attack.

But while the terrorists’ coup in Nice and the marked increase in terrorist attacks should give us all cause for concern, we should not confuse “an increase in terrorist attacks committed by the Islamic State” (assuming IS is responsible) with a statement like “the Islamic State is winning”.

Far from it, in fact: on the ground in Syria and Iraq, where this fight really matters, IS is not winning. In the last few months alone, thanks to the co-ordinated efforts of Western, Iraqi and Kurdish forces (and the unrelated but mutually-supportive efforts of the Syrian government and its ally, Russia), IS has lost a significant amount of its territory. Palmyra is back in the hands of the Syrian government. Fallujah is back in the hands of the Iraqi army and the Kurds have chased IS back to the gates of Mosul.

Indeed, if we use Mao Zedong’s three-phase model of guerrilla war as a template for a successful fight, we can see that the last year has been disastrous for IS – a year which has seen it regress from “Phase 3” (wherein the guerrilla army, as in 2014, begins the decisive annexation of enemy territory) back to “Phase 2” – the use of intimidation tactics like terrorist attacks to weaken the enemy’s resolve.

Furthermore, when one observes that the attacks in the holy city of Medina have drawn the ire of prominent Saudi Salafists, or when one considers the empirical observation that the use of indiscriminate violence is ultimately counter-productive to a group’s political aims, then, all things put together, these attacks appear less as a sign of strength and more as an indication that IS now has a reduced threat profile compared with the one it had just a few years ago.

One by one, its fighters in Iraq and Syria are being picked off. Just yesterday, in a demonstration of the effectiveness of US airpower, Omar Al-Shishani, the Georgian-born commander of the Caliphate’s north and heartthrob of ISIS’ muhajireen (foreign fighters), was obliterated by a laser-guided GBU-12. Indeed, according to some in the online #ISfanclub, it is yesterday’s loss of Al-Shishani which inspired this new attack in Nice. Thus we arrive at the following conclusion: outgunned, on the run and lacking the means with which to commit atrocities, ISIS has now resorted to running innocent people over with trucks.

In real terms, as I tweeted just yesterday, ISIS is in a bad place. If current trends in Iraq and Syria continue, my guess is that ISIS will be militarily defeated by this time next year. On the conventional battlefield, they are done. The terrorist attacks, however, will likely continue as ISIS reverts to “Phase 2” tactics. In kind, we should prepare ourselves for the next battle – to make sense of and systematically defeat the ideology of salafi-jihadism. This will take time. And patience. But we should be confident about our ultimate victory. Yes, we know now that a truck can be used by this enemy for indiscriminate violence. The prospect truly is terrifying. But as Omar Al-Shishani learned yesterday, 230kg of ordnance (when used selectively) and a patient, cerebral approach are far more effective.


The Arabic word for “Islam” on the right, inverted into the shape of a Kalashnikov in the thought bubble. Source: Jabertoon

 

What does it mean to be “radical”?

 

Radical (chemistry): “A molecule that contains at least one unpaired electron… because of their odd electrons, free radicals are usually highly reactive… they [can] react with intact molecules, generating new free radicals in the process” (Encyclopaedia Britannica, 2015)

Radicalism (political): “Radicalism is characterized less by its principles than by the manner of their application” (Cyclopaedia of Political Science, 1893)

 

If it can be said that free radicals in chemistry are good at creating more free radicals, or that political radicals have a tendency to replicate and create revolutions, then it is also a rule of Twitter that anything the popular neuroscientist Sam Harris says about Islam and Islamism will be liked, retweeted and defended by his legions of fans. The Law states that if it is he who created it, then the Tweet will be spread, regardless of any discrepancies or oddities in the Tweet’s molecular structure.

Such was the case with Harris’ online rebuttal of the terminology used by Hillary Clinton in her response to the Orlando shooting – 691 retweets, 1,866 “favourites”.

Here Harris sought to chide Clinton for her use of pleonasms, arguing that the excessive use of the adjective “radical” made her terminology linguistically redundant. In many ways, Harris is right to focus on language. Phraseology is important in the discussion of jihadist violence. “If Hillary is only against the radical jihadists,” an onlooker might otherwise wonder. “What about the mainstream jihadists? Are they OK then?”

At this point in Trump’s over-televised run for the presidency, everyone who isn’t living under a rock should be aware that there is no such thing as a “mainstream jihadist”, but the larger point Harris is trying to make is still valid – terminology is important and informed debate begins with the correct use of language.

Not being one to shirk the opportunity to nitpick, however, I offered that although the term “radical jihadism” is a redundant pleonasm (much as the term “redundant pleonasm” is itself a pleonasm), the term “radical Islamism” is acceptable to use since there are many different schools of Islamist thought. This includes what we might call “mainstream” and “radical” forms of Islamism.

Harris’ response was simply:

Harris’ suggestion, of course, is that theocracy – as a system of government wherein all authority is derived from a deity – has some kind of innate quality which makes it “radical” and that because of this quality it is therefore redundant to affix the adjective “radical” to the word “Islamism” (since the central aim of most Islamists is the establishment of an Islamic theocracy).

As awful as theocracies are, one runs into problems by blanket-labelling an entire system of government as “radical” – even one as flawed as theocracy. If theocratic ideas were necessarily radical, what would one then make of a country whose Pledge of Allegiance is a pledge to “one Nation, under God”? Or what would one make of the Vatican – a religious theocracy run by priests? Would one really argue that the Pope and his cabal of cardinals are nothing but “a bunch of radicals”? One could argue that, I suppose, but it would be a very radical argument to make indeed. And a great many Catholics in the American mainstream (citizens of “one Nation, under God”) would disagree with that position.

It is clear then that the word “radical” has an inherently relative quality and that it is better understood simply as “that which is not mainstream”. Correspondingly, the term “radicalism” refers simply to a collection of political beliefs which do not exist in the mainstream. It is merely the antonym of the humdrum middle-ground.

The distinction I was trying to make between what we might call “radical Islamism” and the less radical (but no less repugnant) forms of Islamism essentially coheres with the distinction made previously by the political philosopher Olivier Roy. Roy’s thesis was that Islamism is not one single movement but a spectrum of political beliefs which “oscillates between two poles” – a “revolutionary pole” and a “reformist pole”. The distinction should seem fairly self-evident to anyone with even a cursory understanding of the history of political Islam. Some radical Islamists (whom we typically refer to as jihadists today) want to pick up a sword and accelerate Islamisation with cold steel, while others in the mainstream are more relaxed (Jacob Olidort and Graeme Wood call these relaxed types “quietist”) – seeking to focus their efforts on Islamising society “from the bottom up, bringing about, ipso facto, the advent of an Islamic State”.

The distinction between Roy’s “revolutionary” Islamists (whom we can safely call the “radicals” among the Islamists) and his “reformist” Islamists should be familiar to Harris, because Maajid Nawaz made a very similar distinction in the book they co-authored:

“…When I say ‘Islamism’ I mean the desire to impose any given interpretation of Islam on society. When I say ‘jihadism’ I mean the use of force to spread Islamism.” (Nawaz/Harris 2016)

Thus, given what we have discussed about the different forms of Islamism, we arrive at the following diagram, which expresses the fact that Islamism is not unitary but oscillates between “mainstream” and “radical” poles.

Spectrum of Political Islam

Fig 1.1

The “pollination line” in Fig 1.1 is used to demarcate the point at which Islam ceases to be simply “one’s religion” and becomes a political ideology – the point at which the believer pollinates the spiritual life with “the world of the profane”. In essence, the pollination line delineates what we in the West might call “the separation of church and state”. My specific use of the term “pollination” is intentional here, having borrowed it from a controversial hadith narrated by Anas which offers a glimpse of a secular Islamic world in which worldly affairs are separated from spiritual ones (see footnote below)***.

One will note, in the mere existence of this diagram, that I have taken care in how I approach the subject of Islamism, viewing it less as a monolithic bloc (as Harris and Trump have now come to view Islam itself) than as a highly complex collection of diverse social and political movements (all of them seriously flawed).

There are other reasons to be more careful when we taxonomise Islamism. How else can we distinguish between the Shia Islamism of Hizballah and the Sunni Islamism of Al-Qaeda, for example? Or how can we tell the difference between the fractious (and almost-incidental) Islamism amongst the Pashtun and the ethno-nationalist-tinged Islamism now popular amongst many of the Tuareg in Northern Mali?

Indeed, if one were to borrow from the political theorist David Nolan and rework my diagram by including another ideological vector like “primacy of the tribe vs primacy of the Caliphate”, one could plot the ideological differences between the various Islamist groups with significantly more accuracy on a Cartesian chart.

As with the following:

Cartesian Chart of Islamism

Fig 1.2 (Noting that Boko Haram’s ideology is nearly unplottable)
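The two-axis idea in Fig 1.2 can be expressed in a few lines of Python. The coordinates below are invented for illustration only – rough, unscientific placements rather than measured data – and the quadrant labels are my own shorthand:

```python
# Hypothetical two-axis ("Nolan-style") placement of Islamist groups.
# x-axis: reformist (-1.0) to revolutionary (+1.0)
# y-axis: primacy of the tribe (-1.0) to primacy of the Caliphate (+1.0)
# All coordinates are illustrative guesses, not empirical data.
GROUPS = {
    "quietist Salafis":   (-0.8,  0.2),
    "Muslim Brotherhood": (-0.4,  0.4),
    "Taliban":            ( 0.5, -0.6),
    "Al-Qaeda":           ( 0.7,  0.8),
    "ISIS":               ( 0.9,  0.9),
}

def quadrant(x, y):
    """Label a group by the quadrant of the chart it falls in."""
    horizontal = "revolutionary" if x > 0 else "reformist"
    vertical = "caliphate-first" if y > 0 else "tribe-first"
    return f"{horizontal}/{vertical}"

for name, (x, y) in GROUPS.items():
    print(f"{name}: {quadrant(x, y)}")
```

The point of the exercise is not the precision of the numbers but the structure: two independent vectors let us separate groups (the Taliban from Al-Qaeda, say) that a single “radical vs mainstream” line would lump together.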

Indeed, once we realise that the issue of Islamism is far greater in scope than the white Muslim convert next door regurgitating the filth he reads on the Internet (that is, once we remove ourselves whole-bodily from the ethnocentrism of our own backyard) we will realise that Islamism, just like Islam itself, is very far from a single creed.

Ultimately, the most succinct way I could put the distinction between “the various Islamisms” was by pointing out that in some Muslim-majority countries (like Egypt) there are some Islamists whom almost everyone would regard as terrorists and others who have been democratically elected.

Naturally, by simply pointing out that Islamist views are fairly mainstream in many Muslim-majority countries (which they are), I was likened to an ISIS sympathiser by the Harris fan club. According to my logic, they claimed, ISIS’ ideology shouldn’t be considered radical, because within the Islamic State, ISIS’ worldview is the prevailing worldview.

Ad hominem aside, it’s actually a reasonable point to make. Hypothetically, if researchers were able to obtain unbiased psephological data from within the Islamic State, or if we reduced the sample for our “spectrum of political belief” diagram (Fig 1.1) to, say, fighting-aged males currently residing in the city of Raqqa, we would likely find that ISIS’ worldview is far from radical. One might even observe “the pollination line” shifting completely to the right, indicating that everyone is in total agreement that Islam should be indivisible from the affairs of state (although if we use my Cartesian model, one would plot the Anbar tribes on a higher co-ordinate than the ISIS muhajireen on the “primacy of the tribe” vector).

Of course (returning to Harris’ original critique), we know that when Hillary Clinton is talking about trends in contemporary politics she is not restricting her sample to fighting-aged males in Raqqa. So the point is moot – ISIS’ worldview is indeed objectively radical in this context. In saying this, I will concede (to the glee of Sam Harris’ fan club) that if the sample for this discussion were restricted to the US (which it may have been, since Clinton was talking about Orlando) then yes, Islamism should be considered a radical ideology. This would mean that Harris is right and the term “radical Islamism” uses a redundant adjective (shame! O shame on you Hillary!). But if we can leave ethnocentrism out of our thinking for a moment and think of this “war of ideas” as a global war, with the entire world (and its 50 Muslim-majority countries) as our sample, then it makes sense to make distinctions between different kinds of Islamist belief. Clinton has served as America’s top diplomat, so I would hope that she was thinking big-picture on this issue.

While I’m on the topic of making concessions to the Harris fan club, I’ll also concede that all Islamists (“quietist” or not) are in a sense “radical”, in that an Islamist seeks to be an agent of radical change to society – transforming it completely. There is very little “moderation” in Islamist ideation, which is why many Islamists end up becoming “extremists” (the antonym of “moderates”). But to repeat the hundred-and-something-year-old quote at the top of this article: “radicalism is characterized less by its principles than by the manner of their application”.

Heaving ho then, while we could continue discussing whether “radical Islamism” constitutes a pleonasm, the key point is that if we can’t make simple distinctions between the ideation systems of someone like Abu Bakr Al-Baghdadi (a militant jihadist, whom I would label a radical Islamist) and someone like Mohammed Morsi (an Islamist whose views, according to the results of the Egyptian vote in 2012, are fairly mainstream in Egypt) then there really is no hope for our ability to understand the place of Islam in our world.

Of course, Harris’ thesis (the one that is retweeted by his legions of fans… and then repackaged in less savoury terms by Trump™) is that the world’s “Muzz-lims” should be considered followers of an intrinsically radical religion – Islam being what it is – a religion founded by a puritanical Bedouin raider.

While the latter claim about Mohammed might be true, the reality is that in the world we live in today – a world which the founder of Islam was integral in shaping (for better or for worse) – the Islamic worldview, and even the Islamist worldview, is far from a “radical” one.

This is not to say that one should not speak out against Islamism (as Harris’ fan club seems to think I am suggesting). On the contrary, given all the empirical evidence which suggests that mixing religion and politics is about as good an idea as mixing sleeping pills and alcohol, I’ll be the first to speak out against Islamism if it ever becomes a mainstream belief in Canada (thankfully, an Islamist would be considered a radical in my neighbourhood). I’ll also happily speak out against Pakistan’s blasphemy laws, Saudi Arabia’s harsh dispensation of judicial punishment and the reign of theocracy in Iran.

But if we can agree that “Islamism” is the “enemy” (to use a term which others with military backgrounds can relate to) then our first duty in this global war of political ideas is to understand this enemy as best we can. One need not repeat the Sun Tzu edict here for emphasis.

Understanding this enemy involves conducting what military planners call a “stakeholder analysis” – mapping out all the individual actors within the conflict ecosystem to grasp the role they play in producing and transforming violence. This mapping exercise might involve building a profile of each of these individual actors and, occasionally, categorising them according to where their views might lie on a spectrum of political belief (as we have done in Fig 1.2).

Understanding and making the distinction between what we might call “mainstream” Islamists (the quietist types) and “radical” Islamists (the jihadists) is important here because it enables us to adjust the parameters of our targeting apparatus within the system. This enables us to focus our efforts on the targets that matter the most. Indeed, if we remember that labour is in short supply, our aim should always be to attack targets who, once removed from the system, will have a significant effect on the enemy’s centre of gravity. A “radical” Islamist is good at creating more “radical” Islamists, just as in chemistry a radical molecule is good at creating more radical molecules. Therefore, it follows, we need to have words which enable us to categorise and identify radical Islamists where they exist.
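The targeting logic above can be reduced to a toy model: with scarce labour, strike the nodes that generate the most new radicals. The recruitment rates below are invented numbers for illustration, not estimates of anything real:

```python
# Toy model of prioritised targeting: each named "radicaliser" produces some
# number of new radicals per cycle. Rates are invented for illustration only.
recruiters = {"A": 5, "B": 2, "C": 1, "D": 1}

def next_generation(rates, removed):
    """New radicals produced in the next cycle, after some nodes are removed."""
    return sum(rate for name, rate in rates.items() if name not in removed)

print(next_generation(recruiters, removed=set()))  # 9 if nobody is removed
print(next_generation(recruiters, removed={"A"}))  # 4: removing the prolific node
print(next_generation(recruiters, removed={"C"}))  # 8: removing a marginal node
```

Crude as it is, the sketch shows why categorisation matters: unless we can tell an “A” from a “C”, our limited strikes are as likely to hit a marginal node as a prolific one.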

Remembering that our ultimate aim in this war is to move that pollination line in Fig 1.1 as far over to the left as we possibly can, the problem of Islamist violence in our world is greater than the debate over terminology. Ultimately, however, our ability to solve the problem rests on our ability to understand it, and if we can’t grasp the basic terminology and the importance of making basic distinctions between the different forms of Islamism then we’ll never find a solution.

 

 

 

***

Footnote

Hadith #2363 – narrated from Anas

“The Prophet (peace be upon him) passed by some people who were busy with pollination and said: “if they would not do this, then it would still come out right”

The date crop that resulted was of a very poor quality.

Then he passed by them and asked: “what is with your date palms?”

They said: “You had told us such-and-such…”

He said: “You know best the affairs of your worldly life.”

 

 

 

A Blueprint for Asphyxiating Jihadism

The Problem

By now it should be obvious that the application of brute force, by itself, is insufficient in the effort to defeat jihadism. Similarly, while state intelligence organs have proven effective at disrupting threats to domestic security and adding new names to the shiny-white balls in the drone-strike lottery, the jihadist problem still persists. It persists. And it persists because we have failed to apprehend the nature of the problem.

In more ways than one, this non-apprehension stems from our tendency to glean information through computer screens instead of through people – a symptom of our preference for technologism (as exemplified by the “death from above” problem-solution continuum) over humanism (an in-depth understanding of old mate Akhmal and his problems). As a result, and in light of the fact that jihadist terrorism is much worse (by several orders of magnitude) than it was even five years ago, it seems that we still don’t know why cultural facts on the ground in faraway places are manifesting as effects elsewhere.

Indeed, what our misadventures attempting to defeat insurgencies in Iraq and Afghanistan have demonstrated is that our inability to understand the cultural environments in which we operate renders instantly useless any and all efforts we might make as counter-insurgents.

In this war, knowing who to kill can be less important than knowing who not to kill. A given target on the Joint Prioritized Effects List might yield indices of “1” on a threat association matrix but that same target might also be a swing-voting imam, siding with the jihadists not because of any ideological affinity he has with them but because he is engaging in a survival maximisation strategy – collaborating out of necessity. 

If we had only known this before we droned him into oblivion, we might have slipped him a few greenbacks, done his speech-writing for him and used his sermons against the bad guys.

By contrast, the current industrial killing machine approach, as exemplified by the upthrust in direct action raids conducted by JSOC et al, has yielded limited results cohering with Stan McChrystal’s characterisation of “insurgent algebra” as “ten minus two [insurgents] equals twenty, or more, rather than eight (10-2≥20)”.

In many ways then, the logic for being better-informed (and perhaps more selective) in our bomb-dropping is numerical – we have a limited amount of ordnance to fire at any given location, and we know we can’t and don’t want to kill everybody in that location, because our desired end-state is neither genocidal nor Sisyphean. Ergo, we need to be better informed. But we cannot be better informed until we go and get informed.

Likewise with the view that aid dispensation is a cover-all panacea: we cannot expect the mere building of infrastructure in Afghanistan’s mountainous “land of unrestraint” (yaghistan) to capture the hearts and minds of a tribal population who have a culturally-engrained suspicion of cities (shahr).

Neither can we expect the Sunni of Anbar to fight for us “out of gratitude” for the armed social work we once conducted in the past. “Hearts” (and well-building) can be valuable to us, yes. But hearts are not nearly as valuable as minds. Furthermore, without observing and understanding the “cultural mind” that is driving the phenomenon of militant jihadism, as it is occurring on the ground, the best strategy we will ever be able to hope for in our hopeless war of attrition is two 5.56mm in the heart and one in the mind.

It is clear then that what is required to defeat jihadism is a detailed, even ethnographic, understanding of any future terrain where this ideological conflict is likely to take place. It is not enough to simply draw causal links between jihadism and incorporeal factors like “grievances”. Nor is it enough to attribute the blame for jihadist recruitment to vaguely-defined ontological states like “poverty” or vaguely-described “charismatic recruiters” and “madrassas”.

Further questions need to be asked by people involved in field research. What are these “grievances”? Where did they come from? What is the nature of local “poverty”? If there are “charismatic recruiters” in Saudi-funded madrassas on the AfPak border, which ones in particular are churning out the bad guys? Why these ones? What is the cultural terrain in which these “bad madrassas” are ensconced? In short, what are the “roots” of the so-called “roots of terrorism”?

Female CST-2 member speaks with Afghan child

“Right, but before we blew up your school did you like to go?” (Source: Sunnyinkabul.com)

Up to now, we have largely relied on arcane computer-plotted metrics like “significant kinetic effects” to tell us what the violence looks like rather than walking around, talking to people and finding out what the violence is actually doing. By relying on the quantitative data we are missing out on the qualitative description – the somatic inputs which inform us about the totality of cultural life and the dispositions and allegiances of the people.

The Practitioners

As far as seeking to better understand the problem goes, the US Army Human Terrain System represented a step in the right direction. But it was a dismal failure. Putting uniforms on social scientists and asking them to “do anthropology” in the context of a military operation-cum-occupation is laughable. One cannot be a “participant-observer” if one is dressed like the Terminator in a town where the favoured dress code is a kaftan or a dashiki.

Once the boots are on the ground stamping out a big footprint, it may actually be too late for anthropologists to do traditional ethnography. Indeed, if Iraq is anything to go by, it may be too late to do anything at all (2017 update: I take that back. Send in the anthropologists to survey the mess in Mosul and Raqqa).

Having said that, let me be clear. In and of itself, counterinsurgency strategy (COIN) is, at the very least, theoretically sound. Clear, hold, build. It does work – or at least, it can work. But it only works if it is executed by a group of practitioners well-informed enough that they can be said to have a mastery of the cultural terrain and a thorough understanding of the socio-political forces driving the conflict. The agents of the British Empire were only able to execute successful counterinsurgencies after hundreds of years of deep immersion in the cultural environments they occupied – much of which involved sending explorers and ethnologists like Francis Younghusband and Richard Burton out to the periphery of imperium to bring back cultural information and whispers of rumblings in the hills. By comparison, 6-month military rotations whose aim is to work through a list of people to kill are pathetic.

This century has seen the US leading the ham-fisted fight against jihadism. But if the rise (or perhaps, “the scent”) of Donald Trump is symptomatic of a necrotic rot and general decline of a once great America, the responsibility for preserving Western civilisation against the very real threats which menace it will increasingly fall into the hands of smaller powers – Australia, Canada, The Netherlands, Denmark, even New Zealand.

Everyone knows the UN is broken, but other multilateral institutions, like the International Criminal Court, can be leveraged and incorporated into the defence policies of small powers. We now have an international legal instrument to prosecute our enemies (war criminals all of them) – what need is there to have the Americans lock them up in Guantanamo? We now have a refuse station in which to deposit the trash – so why not bring Ahmad Al-Mahdi and “Caliph” Al-Baghdadi and Abubakr Shekau (and Joseph Kony, for that matter), kicking and screaming, to The Hague, where we can handcuff them in their underpants to the handrails outside?

There’s some argument to be made that local judiciaries function as better truth-and-reconciliation mechanisms than bureaucrat-heavy global courts – but what better proof can we provide to Muslim victims of suffering that we are on their side than by dressing down the jihadists of the world before the international press? (2017 update: we might also dress down the Rohingya-killing Aung San Suu Kyis of the world, too.)

In all likelihood, small powers will be crucial in the next phase of this war even if an examination of recent history shows that the foreign policy decisions of countries like Australia are symptomatic of a delusion where a small power thinks itself a great power. We are the truck drivers and logisticians for America’s theme park in Iraq’s Emerald City. We supplement our big cousin’s Air Force with a few extra fighter jets (which we buy off him for exorbitant prices). Secretly however, we all know that a country like Australia (with a population of 20 million) or Canada (whose landmass is largely a frozen waste) will never be able to join the global superpower club.

And really, we don’t want this anyway. We don’t want to conquer Afghanistan and install a glorious empire which will last a thousand years. We don’t want to occupy Iraq and raze all the mosques and make barbecues and the production of maple syrup mandatory. In principle (and I stress “in principle”), our main interest is in self-defence, while peace-keeping and atrocity-prevention are also shared goals. We are really just pre-emptive isolationists. For all intents and purposes, the “pre-emptive” component of this outlook involves surgically removing the little cancers in the world which are threatening to spread, so that these cancers will never bother us.

Jihadism is one such cancer. And as with any cancer, it can be treated early or treated too late. One can cut the polyp out with a scalpel or one can wait till it becomes carcinogenic. One can pre-empt the spread or one can wait until it spreads, choosing instead to confront the problem with a bag-full of toxic chemicals (hyper-conventional military force) which is just as likely to destroy the rest of the body as it is to force the body into remission.

So how to cut out the cancer? Here’s a blueprint.

The Blueprint

A few years ago, a popular model was put forward to describe why complex adaptive systems like terrorist networks are so difficult to destroy – a model which juxtaposed decentralised systems with other systems whose command is centrally-controlled. The metaphor used was “the starfish and the spider”.

A spider, as we know, is reasonably easy to kill. Crush its invertebrate body between your fingertips and all its legs – its subsidiary parts – will cease to function. The hierarchical institutions of nation-states often look like spiders. Kill the mad king, his knights surrender. Or, in the world of today, if a drone is ready to be fired and the President is in a meeting the whole operation comes to a standstill because the chain of command is temporarily paralysed.

The starfish, however, doesn’t need centralised command and control (C2). There is no singular brain running the show but a series of nerves running along the ambulacral surface of each individual arm. If any individual arm is cut off, it regenerates. Each arm is, in effect, autonomous – decentralised.

starfish

A starfish regenerating an arm

Unlike with the spider, there is not one nerve centre to destroy but many waiting to grow back. And the biological analogy holds true to reality – organisations like Al-Qaeda and the “lone wolf” cells operating at the periphery of the Islamic State are demonstrative of the starfish model.
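The spider/starfish contrast can be made concrete with a toy graph sketch (my own illustration, not drawn from the book): remove the hub from a hub-and-spoke “spider” network and it shatters; remove any single node from a decentralised ring and the rest stay connected.

```python
# Toy resilience test: a hub-and-spoke "spider" vs a decentralised "starfish".
from collections import deque

def is_connected(adjacency, removed):
    """True if the surviving nodes still form one connected component (BFS)."""
    nodes = [n for n in adjacency if n not in removed]
    if not nodes:
        return False
    seen, queue = {nodes[0]}, deque([nodes[0]])
    while queue:
        current = queue.popleft()
        for neighbour in adjacency[current]:
            if neighbour not in removed and neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return len(seen) == len(nodes)

# "Spider": central hub 0 connected to spokes 1-5.
spider = {0: [1, 2, 3, 4, 5], **{i: [0] for i in range(1, 6)}}
# "Starfish": a ring of six nodes with no hub.
starfish = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}

print(is_connected(spider, removed={0}))    # False: kill the hub, the network dies
print(is_connected(starfish, removed={0}))  # True: any single loss is survivable
```

The same asymmetry is what makes decentralised cells so hard to decapitate: there is no single node whose removal disconnects the rest.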

To unpack this further, there are other congruent examples we could take from Greek myth – e.g. contrasting the regenerative heads of the Lernean Hydra with the single-minded Delphic Python (the classic mythological serpent: “cut off the head and the body dies”). There are plenty of images with which to thickly describe this phenomenon.

download

Guess who the terrorists are in this picture.

With this model in mind, our task is then to figure out ways to kill these “starfish”, given that our current strategy (the drone-strike lottery) is having a limited net effect on the battlespace. As stated earlier, the fundamental problem with our approach to this conflict has been our inability to understand the taxonomy, the anatomy and the reproductive capacity (that is, the nature) of the starfish – so, in many ways, it comes down to a problem of information and intelligence collection.

The nature of the information-space today is different to what it was during the Cold War. Unlike then (when information was scarcely available and jealously guarded by those who held it), today’s “globalised” world is defined by what the anthropologist Arjun Appadurai calls “trajectories of disjuncture”. Information is no longer hidden here and there; it is everywhere, available to everyone. It is no longer the purview of spies in the employ of the government. It is ripe for the picking by anyone – journalists, lobbyists, soldiers with blogs, hobbyists surfing the internet. Small Wars Journal, after all, is run part-time by a retired Marine out the back of his food truck. Much of the information (but not all) is already out there, at one’s fingertips, waiting to be apprehended.

Traditional intelligence organisations still fulfil a set of vital and specific functions. They collect high-level information which circulates through diplomatic circles; they analyse specific sets of information as they pertain directly to government policy; and, crucially, they deliver advice to policy-makers. But the world is far too big, and the desertscapes and mountain ranges where jihadism is metastasising far too expansive, for a bevy of urbane and taciturn bureaucrats to apprehend the nature of the problem as it appears on the ground. There is simply not enough paper in the Amazon to write all the risk-assessment summaries.

Michael Nagata, the Japanese-American general who was until recently the head of the US Army’s program to fund Syria’s rebels, argued that in the fight against jihadism it will “take a network to defeat a network”. Following this logic then, the government (a centrally-controlled spider) is going to need help from the outside. This is where the private sector will likely be of some use.

Unlike a modern nation-state, there is no inevitable form which an entity in the private sector needs to adopt. Businesses like eBay have made billions by wresting control from central authority figures and placing it in the hands of the masses – by becoming thriving, profitable starfish. Others, like Apple, have come to symbolise innovation in transcendental ways.

In general, and for good reason, there is a healthy suspicion of handing any kind of role in the War on Terror to the private sector. Indeed, apart from the problem of accountability, there is a similar suspicion (if not a lack of trust) of the motivations of those in the business world. Just look at the controversies surrounding the free rein that private military corporations like Blackwater have had over diplomatic security in Iraq. On this point, Machiavelli said the lion’s share of what needed to be said about the problems posed by mercenaries in his writings on the condottieri of 16th-century Italy.

In some cases, however, specific and limited outsourcing of government wartime tasks to the private sector might be indispensable rather than inimical. Contractors are profit-minded, which means that, if they are paid according to outcomes, they use time and resources more efficiently. Consider, as a heuristic comparison, the time it might take an individual military contractor to board a plane to the UAE and take up a job training Arab forces (as many retired Western soldiers have done) versus the time it might take an Australian military unit, even a special forces unit, to do the same. There’s almost no comparison.

The main problem with outsourcing, of course, will always be the issue of accountability. But insofar as the government holds the purse strings, the private sector will always be beholden to its pay-masters. Ultimately, contracts can be written however governments want. And laws still apply to individuals. Furthermore, with a degree of separation between the public and private sectors comes an additional, and useful, element of deniability for the government. A condottiere does not carry a government ID card – therefore the government cannot be burned at the stake for the condottiere’s shortcomings.

condottieri

The condottieri, the gentlemen-mercenaries of Italy

Ultimately then, given what has been discussed about our cultural knowledge-gap and given the future role which smaller, devolved, government-affiliated but private entities might play, one could conclude that our order of battle (particularly in the sphere of information-gathering and intelligence-collection) needs a complete restructure. And it starts, of course, with government itself.

The current force pitted against jihadism behaves much like “the spider” – where a single-minded body controls eight independent and often knock-kneed arms (*cough* Sovereign Borders). But as the war evolves, it is increasingly clear that what is required to asphyxiate jihadism, once and for all, is an organism that more closely resembles a jellyfish.

To biologists, jellyfish are known as medusae, named for the chthonic snake-haired monster from Greek mythology. A medusa typically takes the form of an umbrella. In this metaphor, the upper surface (the exumbrella) is the figurehead of governance (an influencer but not necessarily a decider of the mundane and everyday) which encompasses everything. The exumbrella is in turn supported by a pulsating hydroskeleton (a more efficient, flexible bureaucracy) and a tangle of toxin-delivering stingers (the military, especially the special forces).

The key distinguishing feature between the jellyfish and its older arachnid self is obnoxiousness of presence. While the spider is intrusive – a blot in one’s surroundings, a menace, something to be feared – the jellyfish is confidential, cordial almost, barely noticed as it pulsates seamlessly through the environment. In battle, however, a medusa is just as lethal as the spider. The semi-transparent Australian Irukandji, the smallest of the box jellyfish, is also the deadliest, despite being the size of a fingernail.

Again, and crucially, the jellyfish is not intrusive – it does not meddle, disrespectfully and contumeliously, in the way that the spider does. Jellyfish do not hide behind the fortress walls of the Camp Russells of the world (see SOTG-Afghanistan), hunkering down in a maze of HESCO, browbeating those caught in their web about the virtues of democracy. Jellyfish simply “bloom” – reproducing seasonally and in large numbers when the sunshine increases – in a way which, crucially, never disrupts the ecosystem.

Instead of hiding and occasionally killing – like stonefish consuming bottom-feeders on the seabed – they replicate. The focus is not on opportunistic consumption but on ally-creation.

Still though, the bloom hunts. And there is yet prey to be hunted.

So, the bloom goes forward. And swimming with, amongst and at the vanguard of this bloom will be other carnivorous hydrozoa – sworn into the service of the medusan public but privately employed, at arm’s length. Hydrozoa like the Portuguese Man o’ War. The Man o’ War distinguishes itself from the bloom jellyfish in that it is not one organism comprised of many cells but a colonial organism made of many individual organisms called “zooids”.


The fleet moves

In principle, the privately-contracted Man o’ War is independent from the bloom, and this independence can be useful to the bloom. The Man o’ War remains accountable to the bloom, which feeds it scraps, but it complements the bloom because its structure – with its many “zooids” – differs from the bloom’s. These zooids can reproduce at random through a process called “direct fission” – redeploying copies of themselves instantaneously.

As the bloom and the Man o’ War approach the juvenile starfish, teams of these zooids break away and descend upon the prey. The zooids attach themselves to the prey’s exterior – problematising the nature of the prey, fissioning further to create more “local” zooids who can map the prey’s centre of gravity, assisting with the uncreation of the prey by thickly describing it. The zooids prepare the battlespace for the rest of the bloom by showing the bloom where the prey’s weaknesses are; what the prey subsists on; contextualising the prey as the right prey within an entire seabed of prey, in a way which complements the inputs gathered by the bloom’s sensory organs – the bloom’s spooky spy-feelers.


A zooid. Microscopic. Deadly.

Having colonised the prey’s crusty back, the zooids weigh the prey down and the prey is consumed by the bloom. Then, the bloom moves on, in search of more prey, with the auxiliary zooids swimming in front, disappearing silently into the deep.

This is the blueprint for the slow asphyxiation of jihadism, and one needs to get behind it before the chronology overtakes us. It does, of course, require money. Or, more precisely, the reallocation of money away from direct-action, droney-droney, pointy-shooty measures.

In a statement directed at his government pay-masters, General James Mattis, the Warrior-Monk of the US Marine Corps, put the issue of the allocation of resources rather succinctly: “if you don’t fund the State Department fully, then I need to buy more ammunition.”

Applying Mattis’ logic, if Western governments with vested interests in the problem of terrorism don’t properly fund ground-based, human-conducted research which seeks to grapple with the problem in the places where it is metastasizing, then those same governments are going to need greater funding for missile research. In any given location, by the time the jihadist problem requires a military intervention, it is already too late.

The key to “defeating jihad” is a re-structuring of the intelligence sector and in part a devolution of certain functions to the private sector (getting behind the mercenary zooids) to assist with collecting more information about the problem. A knowledge gap persists. And we need to fill it.

Take the rise of jihadism in Mali, for example. Jihadism has been spreading there, and rapidly so, over the last two years. Right now, of course, everyone is paying attention to ISIS because ISIS is in vogue. ISIS are the badass bandidos with all the fancy videos and media attention. But while Iraq and Syria are now firmly in the clutches of jihadism, a new group – the Front de libération du Macina (FLM) – is growing in central Mali. Before, jihadism was just a problem in the far north of Mali, a fad amongst a few Arab traders and disaffected nomadic Tuareg. Now, for the first time, the FLM is targeting settled Fulani in Mali proper, wooing them to jihadism with nostalgic dreams of long-forgotten caliphates. This is where the zooids will prove indispensable. Send in the zooids. Let them find out what’s happening in the Sahara. Indeed, what is happening in Fulani Mali? What is the problem? Why is the cancer spreading?


James Mattis

Lo, The Terminus

“Oooh! For Christ’s sake let me alone!” cried the wounded man, but still he was lifted and laid on the stretcher.

Nicholas Rostov turned away and, as if searching for something, gazed into the distance, at the waters of the Danube, at the sky, and at the sun. How beautiful the sky looked; how blue, how calm, and how deep! How bright and glorious was the setting sun! With what soft glitter the waters of the distant Danube shone. And fairer still were the faraway blue mountains beyond the river, the nunnery, the mysterious gorges, and the pine forests veiled in the mist of their summits… There was peace and happiness… “I should wish for nothing else, nothing, if only I were there,” thought Rostov. “In myself alone and in that sunshine there is so much happiness; but here… groans, suffering, fear, and this uncertainty and hurry… There—they are shouting again, and again are all running back somewhere, and I shall run with them, and it, death, is here above me and around… Another instant and I shall never again see the sun, this water, that gorge!…”

At that instant the sun began to hide behind the clouds, and other stretchers came into view before Rostov. And the fear of death and of the stretchers, and love of the sun and of life, all merged into one feeling of sickening agitation. – Leo Tolstoy, War and Peace, Book II, Chapter VIII


Navigating the terminus of the hollow, melted-out Sphinx Glacier.

The old man, chained by time to his wheelchair, looks up at me with eyes wide. Medical paraphernalia protrudes from him everywhere. On his wrist, there is a coloured band with a name and a number and a barcode – the international accessory of the admitted infirm.

“Are you going to be around here for awhile?” he asks with open palms. Fingers spread, hands pointing up – like a supplicant.

I come to a halt in front of him, keys jangling at my waist, short-wave radio clasped to my belt. In the evenings I work security at the hospital, doing my two-hourly rounds through palliative care. Checking the locks on doors, alarm systems, fire panels. That kind of thing.

“Jack was a logger”, according to the life synopsis that the nurses have sticky-taped to the wall next to the door to his room.

He left his native Ontario at age 15 and worked his way across the country on the trans-Canada railroad. A stint in the boiler rooms of the coal-powered ships crossing the Pacific followed; then time in Papua New Guinea hunting “alligators” [sic]. Later, he would “serve as a mercenary” and then, returning to Canada, serve with the RCMP as a Mountie above the Arctic Circle. Then, he settled down, in the fjords of British Columbia, with his wife and three children. This is the bio of a man who has lived a very full life – an adventurous life. Jack was a “fun hog” in the sense that Chouinard and Tompkins might have used the term.

“Are you going to be around here for awhile?” he asks me again.

I nod, and point to the “Security” embellishment on my uniform. “I’m always around,” I say.

“What?”

He doesn’t hear me. Jack is mostly deaf, and the deafness does not help with the dementia. He beckons me toward him, asking me to repeat myself – and, leaning in progressively closer, I eventually give up.

I hold up two fingers. Jack can still see. He gets it. Kind of. “You’re here for two hours?”

I nod. Close enough.

“But I need someone to watch out for me,” he says. “Can’t you stay awhile and watch out for me?”

I nod. “I’m here for you, Jack,” I say. He doesn’t hear me.

“I need someone to watch out for me,” he repeats.

A nurse at the nursing station, seeing me detained part-way through my patrol, intervenes. “Come on now, Jack,” she says, and she approaches, inserts herself into Jack’s surroundings and then smiles at me as if to say, “it’s OK, I’ve got this now. You’re right to go on.”

I look back at the bio sheet on the door, reading more about Jack’s life. Here, the choice of tense in the wording stands out. Jack “was a logger”; “he enjoyed fishing”; “he took to deep-sea sailing on the West Coast”. This is a life history written in the past tense – the same tense we employ for the life histories of Norgay, Napoleon, Nietzsche – as though the man were already dead.

A nurse reports that one of the maintenance guys has left the door to the outside workshop open. It’s my job to go and lock it. Access control. I step out of a fire escape. The evening is clear and cold in Squamish. No winter rains today. Just the chill as the last of the day’s light disappears behind the Tantalus Range. I look east towards the Garibaldi range. In the distance, the Crosscut Ridge of Mt Isosceles can be seen through the valley-gap between Crumpit Woods and the lower flanks of the Chief, silhouetted in the light of a rising moon. In its current state, caked in ice and snow, the Crosscut Ridge is very much in winter condition. Last summer, we’d tried to get in there to climb it, only to be shut down by weather and distance and ability. Late-season conditions. Melting glaciers reaching the end of their lifespans.

The saw-toothed Crosscut Ridge, “the obscure object of our desire”, centre-right.

I was preparing for another shot at it in the early spring, hoping to use skis to cut the approach time by traversing the ice floes on Garibaldi Lake – this time before the summer sun had melted everything out and before the glacier became a labyrinth again.

I return inside and patrol through the “Intermediate Secured Unit” – where they put the high-risk patients – and then, with my rounds complete, I step out into the main hallway again. Someone else, Jim, an old miner, is complaining that another resident entered his room and stole all his stuff. He seems upset. Upset people can become aggressive, and Jim has a history of aggression. For the most part, I ignore him. I let the nurses know about his problem and tell them to raise me on the radio if they need me.

I walk away. I don’t much want to grow old, I think to myself, although I know that one day I will have to. I don’t want to die either, but I know that this is not an option available to me.

In pre-modern Japanese society, the base of Mount Fuji was said to be a site for a practice called ‘ubasute’ – whereby the elderly and the infirm were left before the mountain’s bosom to die. Similar things have been said of pre-colonial Inuit society, where “old Eskimos were set adrift on ice floes” – farewelled into Nature’s arms. The historicity of these past practices is the subject of intense debate. They may indeed just be myths. But the fact that rumours of these other-worldly practices have persisted (even if solely amongst foreigners gossiping about the Other) reminds us that the problem of how Man should spend his last days is a problem we have not yet solved as a species. We are uneasy about, and perhaps not yet satisfied with, the systems we have designed for dying. How can we be?

I walk on through the corridors, passing by the infirm in their beds – respirators on, holding on, clinging on. Televisions play in all the rooms. Just another half hour of television. Hold on just a little bit longer. I feel very happy for my beloved grandfather (just passed in December) that he did not spend long in permanent care before he died. He escaped that fate – the fate of a man dying while surrounded by others who are also dying. Quickly and painlessly, he went.

The next day, the rains return, but then it clears for a while around midday. I can see my objective again – the Crosscut Ridge. I imagine myself on top of the highest gendarme – picking my way along its plated back. I am looking across my domain – my mountains – and I am wondering what it will feel like to die. I am wondering what it must have been like for Ari, when he fell from Mount Aspiring. What were those seconds like? Those final seconds of falling, before impact on the Bonar Glacier? Surely, there must have been fear. Anxiety. But still, I have to believe, I must believe that he was at peace with himself – that he’d accepted it, and in accepting it, experienced a sensation of something akin to bliss.

Yes, I think to myself, gazing across at Garibaldi and Phyllis’ Engine and the Sphinx – mountains named for beings past, both real and fictive, with their own life histories attached. Death is a problem.

A host of dark questions gnaw at me. How do I stay alive in these mountains? How do I keep living without growing old? How do I face the inevitable without becoming a nihilist? How much more of this beauty can I enjoy before I am too old to keep seeking it out? And will I be able to find enjoyment, find beauty in other things, when I am too old and too weak and I’ve lost my mobility?

A few days later, I clock on again at the hospital and continue my rounds through the residential home. Jack, in his wheelchair, is in the hallway again. He looks docile now. The faintest hint of a smile crosses his lips. Like the dying Count Bezukhov, the father of Pierre, the protagonist of War and Peace:

“While the count was being turned over, one of his arms fell back helplessly and he made a fruitless effort to pull it forward. Whether he noticed the look of terror with which Pierre regarded that lifeless arm, or whether some other thought flitted across his dying brain, at any rate he glanced at the refractory arm, at Pierre’s terror-stricken face, and again at the arm, and on his face a feeble, piteous smile appeared, quite out of keeping with his features, that seemed to deride his own helplessness.”

Jack, half-smiling still, is wheeled back into his room by a carer, embracing the infinite jest of it all. And me, the mountaineer just down from my mountains, the summiteer but after the fact, the security guard on my lonely night patrol – I am left, alone, in the hallway. Alone with another pithy quote. Nietzsche. The so-called nihilist, again.

“One should part from life as Odysseus parted from Nausicaa,” Nietzsche wrote. “Blessing it, rather than in love with it.”

I poke my head around the corner and see Jack being helped out of the wheelchair and into his bed. He moves at a glacial pace – the sound of the crepitus in his bones like the crack and grind of crevasses in the fracture zone. The whole mass is moving downstream to its end. Here, at his terminus, Jack is ready to go. Ready to transition from one world into the next.

Why I Chose To Commemorate Australia Day This Year

***Editorial Note***

My views on Australia Day and the #ChangetheDate movement have changed. My current position on the issue can be seen in the Twitter thread I composed during my involvement with The Familiar Strange public anthropology project.

That said, the blogpost below does reflect a position I previously held. So, for posterity’s sake (and as an illustration of how views evolve throughout an individual’s ontogeny) I am leaving the post up.

***

As the date marking the arrival of the First Fleet at Sydney Cove passes us by, we have all paid heed to the now-annual calls for Australia Day to be struck from our national calendar. Yes, this year, like every year, we have heard how the date treasured by lovers of barbecues, beer and Triple J is not “Australia Day” but “Invasion Day” – a date which, rather than commemorating some abiding sense of Australianness, grotesquely celebrates the beginning of White Man’s colonization of the Great Southern Land.

“January 26, 1788 marked the beginning of a cultural genocide which systematically dispossessed the indigenous peoples of Terra Australis of their land, history and future,” runs this argument. Therefore, its proponents claim, it is a national disgrace to celebrate Australia Day on that date.

This year, among the “Down With Australia Day” pronouncements, a popular video produced by Buzzfeed has been doing the rounds on social media. Labelled “an aboriginal response to ‘Australia Day’”, the video documents the responses of several indigenous speakers who discuss what Australia Day means to them. Celebrating Australia Day, according to several of these speakers, is “insensitive”; a commemoration of an “invasion”; a day which is “really really sad” for the suffering sown by British colonists after their arrival.

In principle, there’s merit to some of the arguments advanced in the video. Perhaps a date like Federation Day – that is January 1 – would be a more appropriate date to celebrate Australia Day. For the most part though, the entire production only reproduces viral memetic untruths which add little to serious discussions about “real” issues in aboriginal Australia – like continuing disadvantage in remote-living communities.

It’s a shame, because by regurgitating spoon-fed fallacies about the history and culture of aboriginal Australia – most of which hold no weight anthropologically or historically – the video leaves the viewer less informed and the whole debate in a state where only the most reactionary voices are likely to be heard.

The many fallacies orbiting this debate are perhaps best outlined in the form of a listicle. I say this half-ironically, of course, because the shallow analysis which propelled Buzzfeed to the mainstream is at the core of the problem. So. Now. A dissection of some random piece of click-bait I watched on social media – for better or for worse.

Point #1: Sweeping Generalisations about Indigenous Australia

What is perhaps most remarkable about this video are the sweeping generalisations and falsehoods many of its speakers make about Aboriginal Australia. This is all the more remarkable because the speakers are indigenous Australians.

1a. “Oldest Surviving Culture”

The first and most obvious fallacy is one speaker’s assertion that Invasion Day marks the survival of the oldest culture on earth. Anyone who has browsed through a tourist brochure selling bite-sized aboriginal cultural experiences is probably familiar with the “oldest surviving culture” claim. But how accurate is this framing?

While aboriginal peoples have inhabited Australia for a long, long time (certainly far longer than anyone else), even a cursory examination of what a “culture” actually is shows how utterly ridiculous it is to use a superlative like “oldest”.

Anthropology tells us that “culture” – a term which encompasses the discourses and practices within a given human society – is not static. Rather, “culture” is a continuously evolving set of norms in a state of constant flux – a vehicle in-motion not a petrified fossil. Since culture is ever-changing, ever-transforming, resembling something one day and something else the next, to talk about “aboriginal culture” as possessing attributes like “age” or “survivability” is utterly meaningless.

Of course, quantities we might call cultural “cores” do transmit across generations. Transmissible knowledge (like storytelling and native land management practices) and some aspects of material culture have survived millennia in many parts of Aboriginal Australia. And certainly, where a presiding Aboriginal sense of “being” is concerned, the connection with the past remains important, even if the significance of that connection becomes more abstract as the yawning gap between the Dreaming and the Now grows wider. Be that as it may, when there are tens of thousands of years between cultural forms, the differences eventually outweigh the similarities.

In many ways then, talking about Aboriginal culture as being “the world’s oldest surviving culture” is akin to calling the mace carried by the Serjeant-at-Arms of the Australian Parliament “a cultural relic of the bludgeoning instruments used by early hominids in the East African Cradle of Humankind”.

Thus, the point stands that Aboriginal culture today is so utterly different to what it was in 1788 that attributing an age value to any currently-practiced customs and traditions makes little to no sense. (Remember, the destruction of pre-colonial Aboriginal Australia is the reason why Invasion Day is so controversial in the first place.)

Moreover, perhaps the greatest irony in this video is the fact that the speakers discussing their “oldest surviving cultures” are wearing European-style business attire and Chinese-made German-branded Adidas T-Shirts, speaking English and talking into Japanese-made video cameras.

With this in mind, there is an obvious cognitive dissonance when people speak about “cultural genocide” and “oldest surviving cultures” in the same sentence. Which is it? Were the first Australian cultures wiped out or did they survive? Granted, it’s not necessarily a binary question – but it’s a question worthy of closer consideration.

Personally, I think that the dispossession of aboriginal people in Australia (which began in 1788) constituted a cultural genocide. As the destruction of the aboriginal population of Tasmania shows, this is a historical reality that would seem to fly in the face of the assertion that pre-colonial aboriginal culture has “survived” intact unto the present.

1b. “They were a peaceful people”/”We are an inclusive people”

According to the young boy interviewed in the video (who, it should be noted, is clearly below the age of informed consent as an interviewee), the arrival of the First Fleet was a day when Europeans came and slaughtered “a peaceful people”.

Apart from the historical fallacy that the First Fleeters simply landed and started slaughtering people on the day they arrived (more on this later), there is a more pernicious untruth to the claim that the indigenous inhabitants of Australia were any more “peaceful” than any other people who have ever lived.

“Inclusiveness” is also used to describe pre-colonial aboriginal culture. But how accurate is this? While most of the aboriginal informants I have come across during ethnographic research in Cape York could be described as both “inclusive” and “peaceful” (for the most part, at least), to claim that either of these adjectives is an abiding, essentialising cultural trait is to make a sweeping generalisation without the backing of the empirical record – an over-simplification which borders on stereotype.

Certainly, in pre-colonial Kuuk Thaayorre society, clan rivalries saw the Thaayorre come into almost constant violent contact with members of the Kuuk Yaak language group (“snake speakers”) – a historical enmity which ultimately manifested in the eradication of the Kuuk Yaak as a cultural unit.

No one in the modern Cape York community of Pormpuraaw self-identifies as “Kuuk Yaak” anymore – one is either “Wik-Mungkan” (a language group with strong ties to the township of Aurukun to the north) or Thaayorre. The Kuuk Yaak were literally wiped out. This seems neither “inclusive” nor “peaceful” to me.


My good friend Peret Arkwookerum (nicknamed “Wookie”, meaning “flying fox” – also one of his totems), half Wik-Mungkan, half Kuuk Thaayorre, catching a black bream on his first cast at a sacred site near Pormpuraaw.

We know of course, that the interviewees are trying to argue that the pre-colonial Eora of Sydney were “peaceful” and “inclusive” – at least in comparison to the world-destroying British. But even in the case of the Eora, there is little evidence to suggest that they were any less war-like than any other human group that has ever existed.

Incidents of spearing were common occurrences among the natives of pre-colonial Sydney. Disputes were often settled by violence. Under Pemulwuy, a group of aboriginal insurgents gathered to resist (perhaps rightfully so) the settlers occupying their lands. Around the Eora campfires of Sydney Cove, discussions about immigration and the unwanted arrival of the “boat people” were vociferous and heated. Pemulwuy himself was rumoured to have been blinded in one eye in a violent incident with an enemy from another tribe.

Indeed, with all the violence and exclusivity observed throughout the history of Aboriginal Australia, it is fair to say that perhaps the most remarkable feature of Aboriginal people, historically and into the present, is how very like the rest of us they are. Aboriginal people were and are people – and like all societies, pre-colonial Aboriginal society had its racism and its bloodshed, its in-group/out-group-isms and its conflict.

Peace, not conflict, is the exception to the rule throughout much of human history. It was no different in the Australia that existed before the arrival of Europeans. To imagine pre-colonial Aboriginal society as having embodied some kind of Utopian dream-state is to cling to the long-since discredited Rousseauian myths of Early Man. It also denies almost everything we know about the evolution of hominids. Namely, that our capacity for violence (if not our propensity for it) has deep evolutionary roots – inextricably tied as it is to our higher-order forms of social organization and our complex cognition. If we persist with this imagined version of a warless early Australia, then we may as well start waxing lyrical about “noble savages”.


Bennelong, an Eora collaborator described by Watkin Tench as “a second Omai”, the textbook “noble savage”.


Pemulwuy. Aboriginal Australia’s most successful guerrilla leader.

 

Point #2: Excessive Use of the First Person Plural (“We”, “Our”)

One of the traps we often fall into when talking about the historical lives of our ancestors is what I would like to call “the excessive use of the first person plural”. When one sees historical persons as part of the extended genealogical network we call “our family”, it is easy to start using terms like “we” and “our” in discussions about events that took place centuries ago. Even if we ourselves weren’t victims of what happened to these historical family members, the pain they experienced is often felt inter-generationally. In making sense of this phenomenon, however, it’s also important to remember that we possess some agency in how we choose to internalize this pain.

Now, I’m not going to claim that there is no validity to the idea of “inherited grievance” or “intergenerational trauma”. There is significant empirical evidence to support the idea that physical manifestations of hurt can be experienced over and over again by descendants of the initially aggrieved. (New research in the domain of epigenetics is shedding interesting light on this.) Nor am I going to deny that oppression and structural violence experienced by members of the same social group can be felt, in real terms, for centuries (and still continues to be felt by aboriginal peoples today). Ancestry is a complex issue which is heavily tied to peoples’ conceptions of their own identity.

Even so, it’s still important to remember that no one alive today was also alive on January 26, 1788, so the loss suffered by the indigenous peoples of Sydney Cove was a loss that no one today actually experienced directly. That hurt, though it might continue to resonate in the present, was transmitted socially – not felt in the flesh. The implications of this factual statement are important when considering reconciliation efforts because the point I am making is simple: even though trauma echoes into the future (usually in the form of memory) there is no reason why the trauma of past others should determine the future.

In a similar vein, some years ago, while munching on a shawarma in a Jerusalem hole-in-the-wall eatery, I listened to an Israeli man prattle on about “how we [the Israelites] suffered at the hands of the Philistines (the pre-modern Palestinians)” – how “they took our land [et cetera, et cetera].” This was why, apparently, “his people” were perfectly justified in wresting the Holy Land back from the Palestinians “all the way to the banks of the River Jordan”.

Naturally, being in Israel and surrounded by heavily armed IDF soldiers doing the rounds through the Old City, my immediate reaction was to smile and nod. Inwardly, however, I couldn’t help but think: “Really? Did the suffering at the hands of the Philistines actually happen to you? You personally?”

This point – which pertains to the weaponization of traumatic memory – is a point that the historian Norman Finkelstein has previously brought up in his critiques of the Israeli government’s “Holocaust industry” (a term whose framing I personally disagree with). At what point are we abusing the power of memory by internalizing the trauma of long-dead victims in the present?


In the presence of overwhelming firepower one is inclined to agree with whatever one is told. In the Old City of Jerusalem, some years ago.

There’s no easy answer to this question. It’s not binary – it’s complicated. But from my Jerusalem anecdote, it’s easy to see two problems created when we form overly strong connections with past generations: 1) it is harmful for conflict resolution and can perpetuate cycles of violence (as we see in the “who stole whose land” debates in the Holy Land or, say, Yugoslavia); and 2) it becomes easy to fall into an ancestral phantasm whereby you confuse something that happened to a historical person (whom you never actually met) with something that happened to you, yourself.

Of course, I’m not suggesting that the subjugation of the Eora peoples in New South Wales in 1788 is something that has no relevance for a Guugu Yimidhirr person in Far North Queensland in the present.

Events like the arrival of the First Fleet are great examples of the butterfly effect – continuing as 1788 does to generate sociological hurricanes across the continent. A small flap of the wings like the landing at Sydney Cove was the chronological initiator of a centuries-long genocide. History, in this sense, is veritably macrolepidopteran.

Equally, I’m not suggesting that today’s aboriginal Australians should collectively “get over” the dispossession of their ancestors from their native lands. Nor am I suggesting that it is wrong to draw parallels between the historical suffering of Australia’s first inhabitants and the ongoing structural violence directed against aboriginal peoples.

It certainly would be insensitive to tell anyone to “get over” a cultural genocide and it would be factually incorrect to claim that the use of the first person plural in the context of one’s ancestors never holds any weight.

What I am really railing against here is the excessive use of terms like “we” and “our” when talking about historical victims – the inappropriate fusion of past persons with ourselves. I have genetic links to the starving Irish who were loaded onto ships and sent to a penal colony in the Southern Hemisphere. And yet I did not feel their hunger. I am not them – those same Irish.

I have a close ancestral link to Lt Jack Walsh, the first Queensland officer to take a bullet to the head at the landing of ANZAC. And yet, I wouldn’t have the slightest idea what taking a bullet to the head actually feels like. While I do not begrudge other veterans who feel strong personal attachments to the shores of Gallipoli, I claim no real inheritance to the “glory of ANZAC”, whatever that is. I wasn’t there, so that particular battle honour should really have no bearing on my life, my curriculum vitae, and the foundations of my identity.

Similarly, while it is perfectly valid for me to claim that my ancestor the Scottish outlaw Rob Roy McGregor was “one of us” (“us” being “Clan Cattanach”: “touch not the cat, bot the glove”), it would be excessive to claim that everything McGregor lost at the hands of the English was also physically lost by me, his descendant.

To claim Rob Roy McGregor’s suffering as my own would be akin to claiming his achievements as my own, in a way which conjures up today’s “ugly American” laying claim to “the liberation of France from the Nazis”. Very few Americans alive today can claim this as one of their own personal achievements (see comedian Doug Stanhope tear this sentiment apart).

The past does continue to be felt and heard. But only in the form of echoes. Or through the structures it leaves behind.

 

Point #3: The Date Itself


Perhaps the most eloquent speaker in the video is the bloke in the red and blue shirt. His understanding of Australia Day, as he describes it, is like “if a guy comes into your house, does horrible things to your family, and says ‘we’re gonna have a party and have a barbie and listen to Triple J on the date we turned up’.”

Leaving aside the debatable use of “we”, if read solely as a celebration of “the day White Man turned up” (a date which symbolically represents the beginning of a cultural genocide), it’s true that Australia Day might fairly be interpreted as a bit “sadistic”.

Again, I agree that there is some merit to the idea of picking a different date to celebrate Australia Day. Perhaps a more neutral date like the date of Federation in 1901 would be more appropriate – given that it doesn’t carry the same historical and emotional baggage as the arrival of the First Fleet.

But to play devil’s advocate: if we as Australians have a responsibility to “never forget” what happened to Aboriginal Australians under colonial rule, then doesn’t it make sense to commemorate the arrival of the First Fleet in much the same way that “never again” commemorations have memorialised tragedies like the genocide in Rwanda or apartheid in South Africa?

Isn’t it a good thing that the counter-cultural “Invasion Day” is dredged up every year simply because of the date on which Australia Day falls? Wouldn’t all the awareness-raising efforts about the atrocities in Australian history fade into obscurity if the PM just went and changed our national day to the whatever-th of July?

Similarly, if one is being faithful to the historical record, one should also acknowledge that January 26, 1788 most certainly was not the bloodiest chapter in the history of European colonialism in Australia. There were no slaughters or massacres carried out on the day of the Sydney Cove landing.

Arthur Phillip didn’t simply arrive and begin slaughtering (though his miscreant gamekeeper would later develop a horrible proclivity for that).

According to my reading of history, the actual “invasion” – the very first boat landing – came a few days before January 26 anyway. The 26th was merely the date when the colony of New South Wales was formally declared. Compared to some of the other dates in the colonial history of Australia, January 26 was tame.

Australia Day does not commemorate, for example, the first recorded European landfall on Australian shores – in early 1606 – when the Dutch navigator Willem Janszoon made the first contact with aboriginal Australians at Cape Keerweer – a contact which descended into bloodshed, with Janszoon’s crew recording the killing of the “savage, cruel, black barbarians” who had slain some of his sailors.

Neither does Australia Day celebrate atrocities like the Black War in Tasmania, part of which involved the formation of an extended line by the 63rd Regiment to corral Tasmanian aborigines into a penal colony on the Tasman Peninsula.

Certainly, while the landing at Sydney Cove did mark the beginning of colonisation, the date of the landing itself – January 26, 1788 – was a pretty low-key, native-friendly event. Per the accounts of Watkin Tench and others, relations between aboriginals and settlers remained amicable for at least the first year, until the Governor’s gamekeeper, John McIntyre, started slaughtering Eora for fun on his hunting parties – resulting in his own death at the hands of Pemulwuy.

Of course, in downplaying the symbolic importance of the 26th, I’m not seeking to revise the history of the First Fleet’s arrival by painting it as a harmless event in our nation’s history. It may have been the contingent event upon which modern Australia was founded but that doesn’t make it something to be overly proud of.

To be clear, what I’m not calling for is an Andrew Bolt version of Australia where Aboriginal people just move on from the wrongs done to them and “pull themselves up by their bootstraps”. Nor am I advocating for any particular position in the discussion over who gets what in modern Australia – the ins and outs of Native Title and reparations still need some work.

What I am calling for is a little more intellectual honesty in the way we discuss the past. Yes, the colonisation of Australia and the dispossession of its native peoples was a tragedy of genocidal proportions. But no, the First Fleet did not land at Sydney Cove and immediately begin “slaughtering” people in droves (as the Buzzfeed video incorrectly claims). No one was killed or poorly treated on the 26th.

Likewise, while aboriginal Australians have inhabited Australia for a period dating back at least 50,000 years, the heterogeneous Aboriginal cultures of today are “not the world’s oldest surviving culture” because the very idea of an oldest surviving culture is a load of anthropological horse-shit.

(And anyway, what about the uncontacted peoples of the Orinoco basin or the grumpy resistant-to-contact North Sentinelese? These groups continue, for the most part, to live in the isolation they created for themselves many thousands of years ago.)

Ultimately, it’s worth emphasising that the video was produced by Buzzfeed (under the watermark “Buzzfeed Aboriginal”) – the internet’s classic purveyor of clickbait-for-profit. So perhaps it is not really worthy of serious intellectual consideration. At the same time, the video still speaks to an increasingly popular discourse and it has had an impact – purpose-designed as it is to emotionally manipulate us into sharing and spreading (not unlike the war-propaganda videos produced by ISIS or the Lions of Rojava in Syria). And yes, sharing and spreading is something that many all over my Facebook newsfeed have certainly done… by my last count this video has 2,082,458 views.

Indubitably, the white demographic of the video-sharers is worthy of note. Conspicuously absent from the re-share meme-train are any of my aboriginal and Torres Strait Islander friends on Facebook – probably because they are too busy catching barramundi or counting crocodile eggs with the Indigenous Land and Sea Rangers program. Or doing other, more useful things… like protecting the country.


Wookie examines one of his totemic ancestors (“minh pinch” is the Kuuk Thaayorre word for “crocodile”)

As for me, while my aboriginal friends are out fishing and drinking beer this Australia Day, lapping up some or other gorgeous Cape York sunset, I’m writing this from the cold depths of wintry Canada. This 26th, I’ll be spending the rest of the day dreaming of barbecues, thongs, beaches and Triple J.

After that, perhaps, I’ll be waiting for ANZAC Day – sharpening my pencils for the annual debate over whether the remembrance of the landing at ANZAC constitutes a day for the mourning of dead sons or a day when Australians unite to glorify bloodshed and violence. Probably, ANZAC Day (like Australia/Invasion Day) is a little bit of both – a celebration of what we have and a remembrance of what we have lost.


Sunset over the Gulf of Carpentaria in the Western Cape York community of Pormpuraaw. Reminds me very much of the Aboriginal flag.

 

A Few Words on Existential Threats, Syrian Refugees… and the Mongols

While Donald Trump and the various Republican fear-mongers have been crying Laocoön about the so-called existential threat posed by Syrian refugees and outlining vague stratagems about how just a few extra bombs will solve the Middle East’s problems, I’ve been doing some learnzerizing about the Mongols.

Yeah, the Mongols. You know… Genghis Khan and his prodigious general Subutai. And the sons and grandsons of Genghis as well – Ögedei, Batu, Hulagu and Möngke. Those Mongols. The fearsome Mongol hordes. The infamous horse raiders from the Eastern Steppe.

I got onto this tangential Mongol reading binge after noticing some parallels between the apocalyptic horse-borne invasions of the Khans and a popular trope in jihadist eschatology which prophesies the arrival of the Mahdi (the redeemer of Islam) at the head of a great horde coming from the cold netherworlds of Khorosan (a historical region to the northeast of Ancient Persia).

The prophecy of the Mahdi has its origins in an unauthenticated hadith which says: “if you see the black banners coming from the direction of Khorosan, then go to them, even if you have to crawl over ice, because among them will be Allah’s Caliph – the Mahdi.”

Jihadists, of course, see themselves as the manifestation of these storied Riders of Khorosan, and in keeping with the spoken tradition that the Mahdi’s Army will one day defeat the Anti-Christ at the gates of Jerusalem, a patchwork of jihadist cells linked to Al-Qaeda Central (the AfPak coven led by Al-Zawahiri) were reported to be operating in Syria under the moniker “Khorosan”.

Screenshots from jihadist videos dealing with the prophecy of the Mahdi and the “riders of Khorosan”


Imaginariums of all-conquering “black swan”-type hordes from the East have played an important role in the history of Islam. Apart from their role in the fantasies of modern salafist-jihadists, the prophecy of the Mahdi and his Black Banners was also invoked by the Abbasids during the revolution of 750 which overthrew the Umayyad Caliphate.

Elsewhere, and from a position which saw the inscrutable East as the dwelling-place of a great and unimaginable evil, the Qur’an contains a story where Dhul-Qarnayn (a figure traditionally identified with Alexander the Great) is helped by God to build a wall with which to contain Gog and Magog (the tribal personification of chaos) and prevent them from wreaking havoc upon the world. Indeed, these apocalyptic ideas remained so strong for believers that centuries later, when the Mongols arrived to sack the Muslim and European worlds, the Khans were seen by many as the Quranic “Gog and Magog” worst-case scenario finally realised.

A 13th century bestial representation of the mischievous “Juj and Majuj” (Gog and Magog) from the Qur’an

Certainly, the Mongols, as far as medieval Muslims and Europeans were concerned, were the ontological (and visceral) successors to the Huns of Rome-sacking fame – a pastoral people from the East who, having formed an almighty confederacy, had mounted an unstoppable cavalry charge to conquer the known world. To medieval Europeans, the Mongols were like the Four Horsemen of Christian eschatology – destroyers of worlds. To the Muslims of the Middle East, the Mongols’ arrival served up an equal helping of doom – the end of the Golden Age of Islam.

Writing about the arrival of the “Tartars” in Muslim lands, the 13th century Arab historian Ali ibn Al-Athir began with the following: “to whom, indeed, can it be easy to write the announcement of the death-blow of Islam and the Muslims, or who is he on whom the remembrance thereof can weigh lightly? O, would that my mother had not born me or that I had died and become a forgotten thing ere this befell! … Nay, it is unlikely that mankind will see the like of this calamity, until the world comes to an end and perishes, except the final outbreak of Gog and Magog.” Even professional historians, it seems, had trouble recounting the horrors of the Mongols.

Bearing witness to an approaching Mongol horde must have been terrifying. When Genghis Khan entered Nishapur, one of the first major cities he encountered on the edge of Persia, the riders under his command were said to have murdered 1.7 million people in a matter of hours. The skulls of the slaughtered were piled up in pyramids next to the city gates.

When his grandson Hulagu reached Baghdad (the centre of science and government in the medieval Islamic world), the city was sacked so thoroughly that the Tigris, which had once been described as a river running through town “like a string of pearls between two breasts” now “ran black with scholars’ ink and red with the blood of martyrs”.

Mustasim, the Caliph of Baghdad himself, was awarded special treatment. Accounts vary between the more “conventional” tale that he was simply wrapped up in a carpet and trampled to death by Mongol horses and the more fantastic story told by Marco Polo in The Travels – that he was locked in his own treasure room without food or water and told to eat as much of his own treasure as he “wilt”.

The havoc wreaked by the Mongols set off a great deal of fear-mongering in the lands of their enemies and in many ways, when we listen to the various pundits taking over our newscasts, we can see much of the same threat rhetoric now being employed to describe the hordes of ISIS.


Mongol cavalry on the move

ISIS cavalry on the move

I should say that the threat rhetoric surrounding ISIS isn’t, in and of itself, wrong. Indeed, if we compare the “Riders of the Khan” with today’s “Riders of Khorosan”, we can observe some astonishing similarities between the Mongols and ISIS. First, there is the almost unique proclivity for cruelty and destruction. The Khan’s gold-feeding method of execution seems like it would slot perfectly into one of ISIS’ snuff films. And though ISIS’ war on archaeology is perhaps even more intense than the cultural destruction wrought by the Mongols (the Khans were apparently quite respectful of the mosques, cathedrals and pagodas in the cities they conquered), both groups proved themselves very good at making ruins of fine things.

The second similarity, however (and the one most crucial to this piece), is the one we see when we look at how the Mongols and ISIS were able to exploit the internal divisions of their enemies. Hulagu’s Mongols, it seems, were able to exploit existing grievances by driving a wedge between the Sunni Caliph and some of his non-Sunni subjects. In the final years of his rule, Mustasim had gained some notoriety after throwing a copy of a celebrated Shia poem into the Tigris – an unforgivable insult to an already-livid Shia population. The Caliph’s own Shia vizier would later defect to the Mongols and prove integral to the eventual fate of Baghdad.

Likewise, in modern Iraq, where a post-Saddam Shia-dominated government oversaw the widespread repression of out-groups, ISIS was able to obtain bay’ah (oaths of allegiance) from the anti-Maliki Sunni tribes in Anbar, creating a tribal union which accelerated ISIS’ military advances in early 2014. Indeed, just as the Mongols took Baghdad riding on the coat-tails of internal weakness, so too has ISIS shown itself capable of riding Baghdad-ward on the backs of angry Sunni tribesmen.

Ex-Pres of Iraq Nouri Al-Maliki. A modern day stand-in for Caliph Mustasim of Baghdad? (note: “malik” means “king” in Arabic).

Insofar as these problematic internal divisions can be observed in our own society, there is perhaps no clearer example of Mustasim-style division-making than some of the recent discourse concerning the moral value of Islam (and its place in our society) and the “threat” posed by the Syrian refugees waiting on our borders.

When, just days after the Paris attacks, it was revealed that one of the perps was carrying a (rightfully- or wrongfully-acquired) Syrian passport, an uproar about the so-called existential threat posed by Syrian refugees (and Muslims more generally) began in earnest. Donald Trump began talking about a religious ID card for American Muslims (with terrific responses from those same Muslims). Marco Rubio proposed refusing entry to certain refugees based on their Syrian-ness. Letting Syrian refugees into the West, these people have argued, might be akin to “letting in a Trojan Horse”.

A similar bout of refugee fear-mongering began when the Mongols arrived on the doorstep of medieval Europe. Having conquered most of Asia, when the Mongols descended upon Hungary, they did so harrying the stragglers of a wave of refugees from a place called Cumania. The Cumans were a nomadic Turkic people (i.e. they looked different and had a different God), and when a wave of about 40,000 of them was allowed to settle in the lands of King Bela IV, many of the Hungarian nobles immediately suspected the Cumans (in particular their leader Köten) of having Mongol sympathies.

A medieval mosaic depicting a Hungarian noble killing a Cuman… the two groups had some baggage

Eventually, discord between the settled Hungarian agriculturalists and the itinerant Cuman pastoralists reached boiling point. Köten was assassinated by a group of angry nobles on suspicion of being a spy. A rabble of angrier Cumans began plundering the Hungarian countryside. Things fell apart. Unfortunately for all parties concerned however, this internal discord also happened to coincide with the eve of the main Mongol invasion – the real existential threat menacing Europe. The Hungarians were routed, utterly, at Mohi. And Hungary suffered a similar fate to that of Baghdad under Hulagu.

Bearing in mind that, for the most part, we humans as a species are very bad at learning from the lessons of the past, there are some very important takeaways from the case of the Cumans in Hungary – the main one being that we should be careful about domestic fear-mongering when there are real threats overseas.

While we’re on the point of refugees, I should point out that history is replete with negative repercussions caused by sudden mass immigration. Just ask today’s Gazans what they think of the Jewish refugees who ended up on the shores of British Palestine. Multi-culturalism is not always pretty. Racism is common in ethnically-diverse communities because, as a general rule of thumb, people tend to get along better with people that look like they do and think like they do.

But though I’ve commented extensively on the need to return to a more isolationist world outlook, ultimately, there’s really only one course of action to take on the refugee issue. Let a few of them in. Because what harm can these people really do? At the very, very worst, our post-Paris fears are realised and some kind of shoot-up occurs.

Indeed, as an article in Foreign Policy recently suggested, ISIS-inspired attacks “will not disappear, but they will be too few and [too] small in scope to topple a government”. A terrorist attack – hell, even a nuclear terrorist attack – is not an existential threat to our society. A nuclear bomb exploding in Sydney or New York would be a real bummer, but it wouldn’t spell the end of our society.

Indeed, the real “existential danger”, as political anthropologist David Kilcullen has argued, is that “our response to terrorism could cause such measures that, in important ways, we would cease to be ourselves”.

The refugee crisis is more complex than some commentators like ex-soldier Harry Leslie Smith have painted it to be. The economic reality is that Western countries can’t open their borders to everyone from every warzone around the world. We have to be comfortable, if only for the sake of our own sanity, with the fact that we can’t eliminate the suffering of all of the world’s refugees.

But if we follow the medieval Hungarian example and begin demonising people based on who they are (Muslims) and where they come from (Syria) then all we are likely to achieve is the creation of a society with widening schisms between the different faiths and worsening tensions between the different ethnic groups. A recipe for a domestic societal crisis. And all this is likely to do is weaken us against the real existential threats – such as the Mongol-Khorosan hordes on our doorstep.