Jonathan Foreman

Terror & the Failure of the Liberal Imagination (Commentary July/Aug 2017)


Three attacks in Britain highlight the West’s inability to see the threat clearly

The real lesson of the recent terrorist attacks in the UK is not that all open Western societies are frighteningly vulnerable. It is not that our homegrown terrorists have finally realized that low-tech attacks using vehicles and knives are easy to carry out and hard to prevent. Nor is it that our population is impressively courageous in the face of such attacks. The real lesson is that the British state and the British political class are, and have long been, fundamentally unserious about the dangers presented by Islamist terrorism.

This lack of seriousness manifests itself in several ways. It’s perhaps most obvious in the failure to reform Britain’s chaotic immigration and dysfunctional asylum systems. But it’s also abundantly clear from the grotesque underfunding and under-resourcing of domestic intelligence. In MI5, Britain has an internal security service that is simply too small to do its job effectively, even if it were not handicapped by an institutional culture that can seem willfully blind to the ideological roots of the current terrorism problem.

In 2009, Jonathan Evans, then head of MI5, confessed at a parliamentary hearing about the London bus and subway attacks of 2005 that his organization only had sufficient resources to “hit the crocodiles close to the boat.” It was an extraordinary metaphor to use, not least because of the impression of relative impotence that it conveys. MI5 had by then doubled in size since 2001, but it still boasted a staff of only 3,500. Today it’s said to employ between 4,000 and 5,000, an astonishingly, even laughably, small number given a UK population of 65 million and the scale of the security challenges Britain now faces. (To be fair, the major British police forces all have intelligence units devoted to terrorism, and the UK government’s overall counterterrorism strategy involves a great many people, including social workers and schoolteachers.)

You can also see that unseriousness at work in the abject failure to coerce Britain’s often remarkably sedentary police officers out of their cars and stations and back onto the streets. Most of Britain’s big-city police forces have adopted a reactive model of policing (consciously rejecting both the New York Compstat model and British “bobby on the beat” traditions) that cripples intelligence-gathering and frustrates good community relations.

If that weren’t bad enough, Britain’s judiciary is led by jurists who came of age in the 1960s, and who have been inclined since 2001 to treat terrorism as an ordinary criminal problem being exploited by malign officials and politicians to make assaults on individual rights and to take part in “illegal” foreign wars. It has long been almost impossible to extradite ISIS or al-Qaeda–linked Islamists from the UK. This is partly because today’s English judges believe that few if any foreign countries—apart from perhaps Sweden and Norway—are likely to give terrorist suspects a fair trial, or able to guarantee that such suspects will be spared torture and abuse.

We have a progressive metropolitan media elite whose primary, reflexive response to every terrorist attack, even before the blood on the pavement is dry, is to express worry about an imminent violent anti-Muslim “backlash” on the part of a presumptively bigoted and ignorant indigenous working class. Never mind that no such “backlash” has yet occurred, not even when the young off-duty soldier Lee Rigby was hacked to death in broad daylight on a South London street in 2013.

Another sign of this lack of seriousness is the choice by successive British governments to deal with the problem of internal terrorism with marketing and “branding.” You can see this in the catchy consultant-created acronyms and pseudo-strategies that are deployed in place of considered thought and action. After every atrocity, the prime minister calls a meeting of the COBRA unit—an acronym that merely stands for Cabinet Office Briefing Room A but sounds like a secret organization of government superheroes. The government’s counterterrorism strategy is called CONTEST, which has four “work streams”: “Prevent,” “Pursue,” “Protect,” and “Prepare.”

Perhaps the ultimate sign of unseriousness is the fact that police, politicians, and government officials have all displayed more fear of being seen as “Islamophobic” than of any carnage that actual terror attacks might cause. Few are aware that this short-term, cowardly, and trivial tendency may ultimately foment genuine, dangerous popular Islamophobia, especially if attacks continue.

Recently, three murderous Islamist terror attacks in the UK took place in less than three months. The first and third were relatively primitive improvised attacks using vehicles and/or knives. The second was a suicide bombing that probably required relatively sophisticated planning, technological know-how, and the assistance of a terrorist infrastructure. As they were the first such attacks in the UK, the vehicle and knife killings came as a particular shock to the British press, public, and political class, despite the fact that non-explosive and non-firearm terror attacks have become common in Europe and are almost routine in Israel.

The success of all three plots indicates troubling problems in British law-enforcement practice and culture, quite apart from any other failings by the parts of the state in charge of intelligence, border control, and the prevention of radicalization. At the time of writing, the British media have been full of encomia to police courage and skill, not least because it took “only” eight minutes for an armed Metropolitan Police team to respond to and confront the bloody mayhem being wrought by the three Islamist terrorists (who had ploughed their rented van into people on London Bridge before jumping out to attack passersby with knives). But the difficult truth is that all three attacks would have been much harder to pull off in Manhattan, not just because all NYPD cops are armed, but also because there are always police officers visibly on patrol at the New York equivalents of London’s Borough Market on a Saturday night. By contrast, London’s Metropolitan Police is a largely vehicle-borne, reactive force; rather than use a physical presence to deter crime and terrorism, it chooses to monitor closed-circuit street cameras and social-media postings.

Since the attacks in London and Manchester, we have learned that several of the perpetrators were “known” to the police and security agencies that are tasked with monitoring potential terror threats. That these individuals were nevertheless able to carry out their atrocities is evidence that the monitoring regime is insufficient.

It also seems clear that there were failures on the part of those institutions that come under the leadership of the Home Office and are supposed to be in charge of the UK’s border, migration, and asylum systems. Journalists and think tanks like Policy Exchange and Migration Watch have for years pointed out that these systems are “unfit for purpose,” but successive governments have done little to take responsible control of Britain’s borders. When she was home secretary, Prime Minister Theresa May did little more than jazz up the name, logo, and uniforms of what is now called the “Border Force,” and she notably failed to put in place long-promised passport checks for people flying out of the country. This dereliction means that it is impossible for the British authorities to know who has overstayed a visa or whether individuals who have been denied asylum have actually left the country.

It seems astonishing that Youssef Zaghba, one of the three London Bridge attackers, was allowed back into the country. The Moroccan-born Italian citizen (his mother is Italian) had been arrested by Italian police in Bologna, apparently on his way to Syria via Istanbul to join ISIS. When questioned by the Italians about the ISIS decapitation videos on his mobile phone, he declared that he was “going to be a terrorist.” The Italians lacked sufficient evidence to charge him with a crime but put him under 24-hour surveillance, and when he traveled to London, they passed on information about him to MI5. Nevertheless, he was not stopped or questioned on arrival and had not become one of the 3,000 official terrorism “subjects of interest” for MI5 or the police when he carried out his attack. One reason Zaghba was not questioned on arrival may have been that he used one of the new self-service passport machines installed in UK airports in place of human staff after May’s cuts to the border force. Apparently, the machines are not yet linked to any government watch lists, thanks to the general chaos and ineptitude of the Home Office’s efforts to use information technology.

The presence in the country of Zaghba’s accomplice Rachid Redouane is also an indictment of the incompetence and disorganization of the UK’s border and migration authorities. He had been refused asylum in 2009, but as is so often the case, Britain’s Home Office never got around to removing him. Three years later, he married a British woman and was therefore able to stay in the UK.

But it is the failure of the authorities to monitor ringleader Khuram Butt that is the most baffling. He was a known and open associate of Anjem Choudary, Britain’s most notorious terrorist supporter, ideologue, and recruiter (he was finally imprisoned in 2016 after 15 years of campaigning on behalf of al-Qaeda and ISIS). Butt even appeared in a 2016 TV documentary about ISIS supporters called The Jihadis Next Door. In the same year, he assaulted a moderate imam at a public festival, after calling him a “murtad” or apostate. The imam reported the incident to the police—who took six months to track Butt down and then let him off with a caution. It is not clear if Butt was one of the 3,000 “subjects of interest” or the additional 20,000 former subjects of interest who remain subject to limited monitoring. If he was not, it raises the question of what a person has to do to get British security services to take him seriously as a terrorist threat; if he was in fact on the list of “subjects of interest,” one has to wonder if being so designated is any barrier at all to carrying out terrorist atrocities. It’s worth remembering, as few do here in the UK, that terrorists who carried out previous attacks were also known to the police and security services and nevertheless enjoyed sufficient liberty to strike again.

But the most important reason for the British state’s ineffectiveness in monitoring terror threats, which May addressed immediately after the London Bridge attack, is a deeply rooted institutional refusal to deal with or accept the key role played by Islamist ideology. For more than 15 years, the security services and police have chosen to take note only of people and bodies that explicitly espouse terrorist violence or have contacts with known terrorist groups. The fact that a person, school, imam, or mosque endorses the establishment of a caliphate, the stoning of adulterers, or the murder of apostates has not been considered a reason to monitor them.

This seems to be why Salman Abedi, the Manchester Arena suicide bomber, was not being watched by the authorities as a terror risk, even though he had punched a girl in the face for wearing a short skirt while at university, had attended the Muslim Brotherhood-controlled Didsbury Mosque, was the son of a Libyan man whose militia is banned in the UK, had himself fought against the Qaddafi regime in Libya, had adopted the Islamist clothing style (trousers worn above the ankle, beard but no moustache), was part of a druggy gang subculture that often feeds individuals into Islamist terrorism, and had been banned from a mosque after confronting an imam who had criticized ISIS.

It was telling that the day after the Manchester Arena suicide-bomb attack, you could hear a security official informing listeners to the BBC’s flagship morning-radio news show that it’s almost impossible to predict and stop such attacks because the perpetrators “don’t care who they kill.” They just want to kill as many people as possible, he said.

Surely, anyone with even a basic familiarity with Islamist terror attacks over the last 15 or so years and a nodding acquaintance with Islamist ideology could see that the terrorist hadn’t just chosen the Ariana Grande concert in Manchester Arena because a lot of random people would be crowded into a conveniently small area. Since the Bali bombings of 2002, nightclubs, discotheques, and pop concerts attended by shameless unveiled women and girls have been routinely targeted by fundamentalist terrorists, including in Britain. Among the worrying things about the opinion offered on the radio show was that it suggested that even in the wake of the horrific Bataclan attack in Paris during a November 2015 concert, British authorities may not have been keeping an appropriately protective eye on music venues and other places where our young people hang out in their decadent Western way. Such dereliction would make perfect sense given the resistance on the part of the British security establishment to examining, confronting, or extrapolating from Islamist ideology.

The same phenomenon may explain why authorities did not follow up on community complaints about Abedi. All too often when people living in Britain’s many and diverse Muslim communities want to report suspicious behavior, they have to do so through offices and organizations set up and paid for by the authorities as part of the overall “Prevent” strategy. Although criticized by the left as “Islamophobic” and inherently stigmatizing, Prevent has often brought the government into cooperative relationships with organizations even further to the Islamic right than the Muslim Brotherhood. This means that if you are a relatively secular Libyan émigré who wants to report an Abedi and you go to your local police station, you are likely to find yourself speaking to a bearded Islamist.

From its outset in 2003, the Prevent strategy was flawed. Its practitioners, in their zeal to find and fund key allies in “the Muslim community” (as if there were just one), routinely made alliances with self-appointed community leaders who represented the most extreme and intolerant tendencies in British Islam. Both the Home Office and MI5 seemed to believe that only radical Muslims were “authentic” and would therefore be able to influence young potential terrorists. Moderate, modern, liberal Muslims who are arguably more representative of British Islam as a whole (not to mention sundry Shiites, Sufis, Ahmadis, and Ismailis) have too often found it hard to get a hearing.

Sunni organizations that openly supported suicide-bomb attacks in Israel and India and that justified attacks on British troops in Iraq and Afghanistan nevertheless received government subsidies as part of Prevent. The hope was that in return, they would alert the authorities if they knew of individuals planning attacks in the UK itself.

It was a gamble reminiscent of British colonial practice in India’s northwest frontier and elsewhere. Not only were there financial inducements in return for grudging cooperation; the British state offered other, symbolically powerful concessions. These included turning a blind eye to certain crimes and antisocial practices such as female genital mutilation (there have been no successful prosecutions relating to the practice, though thousands of cases are reported every year), forced marriage, child marriage, polygamy, the mass removal of girls from school soon after they reach puberty, and the epidemic of racially and religiously motivated “grooming” rapes in cities like Rotherham. (At the same time, foreign jihadists—including men wanted for crimes in Algeria and France—were allowed to remain in the UK as long as their plots did not include British targets.)

This approach, simultaneously cynical and naive, was never as successful as its proponents hoped. Again and again, Muslim chaplains approved to work in prisons and other institutions have turned out to be Islamist extremists whose words have inspired inmates to join terrorist organizations.

Much to his credit, former Prime Minister David Cameron fought hard to change this approach, even though it meant difficult confrontations with his home secretary (Theresa May), as well as police and the intelligence agencies. However, Cameron’s efforts had little effect on the permanent personnel carrying out the Prevent strategy, and cooperation with Islamist but currently nonviolent organizations remains the default setting within the institutions on which the United Kingdom depends for security.

The failure to understand the role of ideology is one of imagination as well as education. Very few of those who make government policy or write about home-grown terrorism seem able to escape the limitations of what used to be called “bourgeois” experience. They assume that anyone willing to become an Islamist terrorist must perforce be materially deprived, or traumatized by the experience of prejudice, or provoked to murderous fury by oppression abroad. They have no sense of the emotional and psychic benefits of joining a secret terror outfit: the excitement and glamor of becoming a kind of Islamic James Bond, bravely defying the forces of an entire modern state. They don’t get how satisfying or empowering the vengeful misogyny of ISIS-style fundamentalism might seem to geeky, frustrated young men. Nor can they appreciate the appeal to the adolescent mind of apocalyptic fantasies of power and sacrifice (mainstream British society does not have much room for warrior dreams, given that its tone is set by liberal pacifists). Finally, they have no sense of why the discipline and self-discipline of fundamentalist Islam might appeal so strongly to incarcerated lumpen youth who have never experienced boundaries or real belonging. Their understanding is an understanding only of themselves, not of the people who want to kill them.


Before she was accused of thought-crime, the London-based American writer Lionel Shriver was best known for her 2003 bestseller, We Need to Talk about Kevin, an epistolary novel about a mother’s efforts to understand why her son had carried out a Columbine-like school massacre. She has published five novels since Kevin, to wide acclaim, but it was a keynote speech she gave at an Australian literary event in September that made her the most unlikely celebrity of 2016.

Shriver had been invited by the Brisbane Writers Festival to discuss “community and belonging.” Instead, Shriver gave a talk about “fiction and identity politics” that criticized the idea of “cultural appropriation” and other forms of political correctness. She espoused the right of writers to create characters and speak in the voices of people ethnically or culturally different from themselves, pointing out that “otherwise, all I could write about would be smart-alecky 59-year-old 5-foot-2-inch white women from North Carolina.”

She excoriated contemporary forms of politically correct censorship with typically astringent fearlessness and rubbished the whole notion of identity politics: “Membership of a larger group is not an identity. Being Asian is not an identity. Being gay is not an identity. Being deaf, blind, or wheelchair-bound is not an identity, nor is being economically deprived.” It was a tough, fine, coruscating essay that should be read by every university head, arts administrator, and literature teacher in the West. But it might have gone unnoticed beyond Queensland had not a local activist stormed out of the talk and then written about its offensiveness for the Guardian.

The article was by Yassmin Abdel-Magied, a 25-year-old Sudanese-Australian author (of a memoir, of course), engineer, and activist, who had ostentatiously walked out of Shriver’s speech (while live-tweeting her walkout). Many people who came across the article, myself included, thought initially that it was a witty spoof of the ultra–politically correct counterculture that has taken such a hold in many academic and literary institutions. Its censorious mixture of ignorance, arrogance, inverted racism, melodramatic self-pity, and self-righteousness (at one point Abdel-Magied declares that Shriver’s dismissal of “cultural appropriation” is “the kind of attitude that lays the foundation for prejudice, for hate, for genocide”) seemed almost too perfect, too titanically solipsistic to be real.

The piece describes in detail the impact that Shriver’s lecture about “fiction and identity politics” had on the Young Person. She’d known something was amiss at the beginning of the talk when Shriver mocked the fuss made at Bowdoin College in Maine about a Mexican-themed party at which non-Mexican students wore sombreros—even donning a sombrero herself—and “the audience chuckled, compliant.”

Naturally, Abdel-Magied “started looking forward to the point of the speech where [Shriver] was to subvert the argument. It never came.” Twenty minutes into the talk, Abdel-Magied was overwhelmed and turned to her mother, who was there with her: “‘Mama, I can’t sit here,’ I said, the corners of my mouth dragging downwards. ‘I cannot legitimize this.’”

Abdel-Magied’s tale of oppression by disrespect gets more dramatic: “The faces around me blurred. As my heels thudded against the grey plastic of the flooring, harmonizing with the beat of the adrenaline pumping through my veins, my mind was blank save for one question. How is this happening?”

The shocking, unbelievable “this” that prompted Abdel-Magied to turn down her mouth, text her friends, and storm out of the hall was the un-ironic expression of an opinion so close to secular blasphemy that a believer like Abdel-Magied could not bear to hear it. It turned out Shriver’s talk was “nothing less than a celebration of the unfettered exploitation of the experiences of others, under the guise of fiction” [italics hers]. Worse still, in words she presumed were so outrageous her readers would immediately recoil from them, “Shriver’s real targets were cultural appropriation, identity politics, and political correctness.”


Perhaps the most comically un-self-aware passage in Abdel-Magied’s cri de coeur comes close to the end of the article when she wonders at the fact that Shriver was ever “given such a prominent platform from which to spew such vitriol.” Why was Shriver even invited, Abdel-Magied asks, when “the opening of a city’s writers festival could have been graced by any of the brilliant writers and thinkers who challenge us to be more. To be uncomfortable. [emphasis mine] To progress.”

This seems baffling, given that her own account of the event makes it clear that Abdel-Magied could not herself tolerate being even mildly uncomfortable, and given that almost every review of a Shriver novel describes her work as challenging and discomforting.

Shriver’s relative obscurity before and even after the success of Kevin (which was made into a 2011 movie starring Tilda Swinton) has been both undeserved and understandable: Undeserved because Shriver is a superb, unforgiving satirist in the Horatian tradition. Her caustic new novel, The Mandibles: A Family, 2029–2047, depicts life in an impoverished, dystopic United States following the collapse of the dollar. Understandable because the sensibility that informs Shriver’s work is largely anathema to the grandees of what passes for literary culture in our day.

It seems that when Abdel-Magied says “thinkers who challenge us,” what she really means is “thinkers who challenge you unenlightened people who need your consciousness raised by people like me.” Certainly, it’s not uncommon for devotees of the discourse of cultural appropriation, political correctness, and trigger warnings to use words in ways that invert their customary meanings.

Much of the controversy that followed was not so much about the bizarre, self-contradictory notion of “cultural appropriation” but about Shriver’s supposedly insensitive treatment of it and its allegedly “vulnerable” and “oppressed” devotees (most of whom, it goes almost without saying, are upper-middle-class graduates of or students at elite Western universities).

The past few years have seen an explosion of concern about “cultural appropriation,” especially on campuses. There was the complaint by students at Oberlin that their dining hall’s choice to serve sushi was “appropriative” and “disrespectful.” At San Francisco State University, white students wearing their hair in dreadlocks were accused of wrongly appropriating a hairstyle that is supposedly the sole preserve of “black culture.” Then came the widely reported cancellation of a yoga class at the University of Ottawa because yoga in North America has supposedly been appropriated from a culture that “experienced oppression, cultural genocide, and diasporas due to colonialism and western supremacy.” (Never mind that many of the physical practices in yoga are recent developments that actually have their origins in 19th-century European gymnastics.)

As is so often the case, this sinister American academic fad has spread swiftly to other Anglophone countries, and it may have become even more toxic in transit. At the beginning of October, at Britain’s Bristol University, a production of the musical Aida (an adaptation of Verdi by Elton John and Tim Rice) was cancelled because student protesters claimed that having white actors play Ethiopian and Egyptian characters would be “cultural appropriation.”

This is the context in which Abdel-Magied’s screed must be read. Shriver was indeed taking this totalitarian impulse on directly, and her offended listener was therefore right to see Shriver’s speech as a dagger aimed at her intellectual heart. It would be something of an understatement to say that it is rare for an American literary figure to come out in this way against a movement that has such strong support among young people and such moral prestige in important cultural institutions. There are plenty of politically engaged American writers, of course, but they tend to take stands against things that everyone they know and work with is also against, like the Iraq war or Guantanamo. The same outrageousness, and lack of interest in swimming with the tide, is apparent in her work.

As Shriver was suggesting, the cultural-appropriation police seem not to understand the dead end represented by their racialist essentialism—or how easily it might be turned around against them. By their logic, black actors should not be allowed to play Lear, Macbeth, Julius Caesar, or other “white” roles in Shakespeare, and nonwhite performers should be completely excluded from taking part in any opera or classical ballet given that both are “white” European art forms, in the same way that jazz or blues music could be said to belong exclusively to black people.

Nor do those who prate about “cultural appropriation” in fiction seem to understand—or care—that, as Shriver pointed out, literature would be impossible if writers were forbidden from imagining and creating characters of different gender, race, ethnicity, age, or sexual orientation to their own. In this sense, the “cultural-appropriation” movement is a particularly poisonous form of philistinism, one that is all the more distressing because those obsessed with it are theoretically well-educated people.

But the most disturbing thing about the Brisbane furor was not that another millennial spoke of the trauma she experienced in having her beliefs challenged, or that the Guardian saw fit to publish her unintentionally funny op-ed about the horror of having her ideology subjected to Shriver’s mockery, or even that the instant reaction of the Brisbane Writers Festival to controversy was to take Shriver’s speech off its website and arrange a bogus “right of reply” session at which Shriver was guaranteed to be absent. These are all depressingly commonplace phenomena in the Anglophone world, even in far-off, relatively conservative, provincial cities like Brisbane.


What was more disturbing was that the literary and media mother ships in New York and London did not automatically take the side of an author arguing for the freedom to create characters of any age, color, and ethnicity. Instead, Shriver was widely criticized for her tone and her insensitivity to the exquisitely fragile feelings of those young people whose expensive educations have not accustomed them to robust debate. Many of those who admonished Shriver seemed to believe that someone of her skin color should have known better, or at least shown more awareness that she was axiomatically more privileged than her interlocutors.

The New Republic published an article by Suki Kim, a Korean-American novelist who had been in the audience in Brisbane. She assured readers that “the most upsetting aspects of the speech won’t be found in the transcript. It was the sight of a white woman who has had great literary success playing the victim.” Kim, it should perhaps be noted, is the recipient of a Guggenheim Fellowship, an Open Society Fellowship, and a Fulbright Grant.

According to Nesrine Malik in the Guardian, Shriver was “disrespectful,” and her speech was “unexamined, entitled and just plain ignorant.” As a “successful white author with a platform,” her “vantage point [had] blinded her to the validity of others.”

A writer for the New Yorker named Jia Tolentino also admonished Shriver, gravely pointing out that “there are all sorts of ways to borrow another person’s position: respectfully and transformatively, in ignorance, or with disdain . . .  One of the worst ways to wear a sombrero, I think, is to be a white keynote speaker at a literary festival, saying, ‘I am hopeful that the concept of “cultural appropriation” is a passing fad.’”

Given that it took Shriver three decades of relatively impoverished struggle to achieve any success as a novelist, you can only wonder how she feels being lectured about privilege and victimhood by young writers of color who have been lavished with fellowships and book contracts by the infamous racists in the American and British publishing industry.

Astonishingly, another of those who took Shriver to task was the highly regarded American novelist Francine Prose. On the website of the New York Review of Books, Prose admonished that “I can think of only a few situations in which humor is entirely out of line, but a white woman (even one who describes herself as a ‘renowned iconoclast’) speaking to an ethnically diverse audience might have considered the ramifications of playing the touchy subjects of race and identity for easy laughs.”

For disingenuousness, Prose’s attack on Shriver is hard to beat, and not just because priggishly censorious people invariably buttress their condemnations of “inappropriate” humor with the false claim that in general they are against censoring humor. Anyone reading Shriver’s speech would be in no doubt that its author was making a serious point rather than “playing…the subjects of race and identity for easy laughs.” But even if she hadn’t been, it’s hard to believe that Prose can’t think of many, many situations in which humor is “out of line.” She is, for example, on record as believing that the slaughtered staff of France’s Charlie Hebdo magazine had also been “out of line,” i.e., asking for it, when they made fun of the militant Islam of those who eventually killed them.1

What all this goes to show is that extreme attitudes to race-based censorship that were until recently confined to the academy have leached into the higher precincts of the general culture, and that the instinct to suppress certain categories of “offensive” speech and thought is at least as strong now as it was in the Victorian era.

So it is perhaps rather surprising that it has taken so long for Lionel Shriver to come to the notice of the millennials at the forefront of the new censorship movement and their establishment allies. (Perhaps if she weren’t resident in London and didn’t have a man’s first name, it would have happened sooner.) Her books mercilessly satirize the very tendencies that have produced the prima donna hysteria about trigger warnings and safe spaces. In novel after drily unsentimental novel, she implicitly or explicitly condemns the refusal of Americans to take individual responsibility for their problems and our society’s tendency to apply “semantic solutions to real problems,” and suggests that there is little hope for health, prosperity, and happiness without hard work, willpower, and ruthless self-examination.

This is true even in Big Brother, the novel inspired by her own sibling’s morbid obesity, an affliction—though not, in her view, an “illness”—that eventually killed him. (Shriver herself is a famous fitness fanatic who eats just one meal a day and follows a daily exercise routine involving 3,000 jumping jacks, 500 sit-ups, and 130 push-ups, though she mocks baby-boomer fitness obsessives like herself in both Big Brother and The Mandibles.)

If that weren’t bad enough, Shriver is an outspoken supporter of Brexit and has frequently expressed skepticism about mass migration to both the UK and the United States. Before settling in London, Shriver lived in Belfast (where she wrote journalism notable for its clear-sighted loathing of terrorism), Nairobi, Tel Aviv, and Bangkok. But the native North Carolinian’s cosmopolitan experiences seem only to have made her less tolerant of the self-loathing often expressed by Americans. As Shriver explained to a British interviewer:

If you have had much to do with liberal intelligentsia in the U.S., they like to think they are above their own country, and they often have contempt for their compatriots, and they think they’re better. They think that being super-critical of the United States exempts them. When they talk about Americans, they don’t think they’re talking about themselves. They’re the same people who are always vowing if Bush wins the election, they’re moving to Italy. They never move to Italy.

The Mandibles has great sport with such people and their pieties. Set in a bleak and frightening near future, it imagines how four generations of a privileged New York family cope with life in a United States that is first crippled by cyber attacks in 2025 and then goes catastrophically bankrupt in 2029 after Russia, China, and other countries attack the dollar. But even before these disasters strike, Shriver’s America has grown so decadent and awash with various forms of political correctness that its millions of obese citizens are referred to as “people of scale.” And it’s pretty clear where Shriver stands on such matters. Early in the novel when America’s first Mexican-American president insists on giving all his addresses first in Spanish and then in English, one of Shriver’s WASP characters looks forward to the day when white Americans finally become a minority, too:

They’d get their own university White Studies departments, which could unashamedly tout Herman Melville. Her children would get cut extra slack in college admissions regardless of their test scores. They could all suddenly assert that being called “white” was insulting, so that now you had to say “Western-European American,” the whole mouthful. While to each other they’d cry “What’s up, cracker?” with a pally, insider collusion, any nonwhites who employed such a bigoted term would get raked over the coals on CNN. Becoming a minority would open the door to getting roundly, festively offended at every opportunity.

Later in the book the experience of national bankruptcy, social disorder and real poverty turns out to have an upside in the improvement of Americans’ mental and physical health:

Hardly anyone was fat. Allergies were rare. Eating disorders like anorexia and bulimia had disappeared. Should a friend say he was depressed, something sad had happened. After a cascade of terrors on a life and death scale, nobody had the energy to be afraid of spiders or confined spaces or leaving the house… Sex-reassignment surgery being roundly unaffordable, diagnoses of gender dysphoria were pointless… No one had the money, time or patience for pathology of any sort . . .

Suffice it to say, Shriver’s is not the conventional, politically safe brand of satire exemplified by the likes of Jonathan Franzen or the humor section of the New Yorker. It is, pace Yassmin Abdel-Magied, the kind that makes readers uncomfortable.

“I can live with offending people,” Shriver has said. “In fact, it’s become politically important to offend people, because we have to fight back against this notion that being offensive should be against the law or something, and that everyone supposedly deserves ‘respect’ for their often dopey views.”

Consider the fearlessness with which she speaks these words and then consider the example of her tut-tutting critic Francine Prose. Prose is the author of Blue Angel, a somewhat satiric novel about a professor whose life is ruined by a highly problematic charge of sexual harassment by an ambitious female student. It was published in 2000. It seems impossible that Prose would have the nerve to write Blue Angel today, or even the inclination. Indeed, the 2016 Prose might even feel compelled to take to the website of the New York Review of Books to challenge the decency of her own best work. Perhaps, Francine Prose might say, Francine Prose was “out of line.”

www.commentary.org/articles/jonathan-foreman/lionel-shriver-line/

1  Prose was one of the renegade members of PEN America—an organization founded to protect freedom of speech and persecuted writers of all shades of opinion—who led a protest against the decision to give a freedom-of-expression award to the surviving editors of the French satirical weekly.  Indeed, she was one of the six prominent New York authors who withdrew from the gala and were derided by Salman Rushdie as “fellow travellers” of the murderous Islamist project, and more gently as “six authors in search of a bit of character.” Prose and her fellow luminaries apparently felt that satire doesn’t deserve protection if it might be said to target communities or ethnic categories that they and their friends consider to be “oppressed.”

The School Runners (Commentary April 2016)


Review of “The Last Thousand” by Jeffrey E. Stern

A 2015 exposé on the Buzzfeed website created a stir by savaging the notion that the massive expansion of education in Afghanistan has been one of the triumphs of the international military effort. It was titled “Ghost Students, Ghost Teachers, Ghost Schools.”

“As the American mission faltered, U.S. officials repeatedly trumpeted impressive statistics— the number of schools built, girls enrolled, textbooks distributed, teachers trained, and dollars spent —to help justify the 13 years and more than 2,000 Americans killed since the United States invaded,” wrote a Pakistani-American journalist named Azmat Khan. The U.S. government’s claims are, Khan said, “massively exaggerated, riddled with ghost schools, teachers and students that exist only on paper.”

One-tenth of the schools that Buzzfeed’s employees claimed to have visited were not operating or had not been built. Some U.S.-funded schools lacked running water, toilets, or electricity. Others were not built to international construction standards. Teacher salaries, often U.S.-subsidized, were being paid to teachers at nonexistent schools. In some places local warlords had managed to divert U.S. aid into their own pockets.

The tone and presentation of the article leave little doubt of its author’s conviction that 13 years of effort in Afghanistan, including the expenditure of 2,000 American lives and billions of dollars ($1 billion on education alone), were pointless and the entire intervention a horrendous mistake.

Unfortunately, it is all but certain that some of the gladdening numbers long cited by USAID and others are indeed inaccurate or misleading, especially given that they are based in large part on statistics supplied by various Afghan government ministries. The government of Afghanistan is neither good at, nor especially interested in, collecting accurate data. Here, as in all countries that receive massive amounts of overseas aid, local officials and NGOs have a tendency to tell foreign donors (and foreign reporters) what they think the latter want to hear. They are equally likely to exaggerate the effectiveness of a program or the desperate need for bigger, better intervention.

Moreover it would be remarkable if there weren’t legions of ghost teachers. No-show or nonexistent salaried employees are a problem in every Afghan government department. This is true even in the military: The NATO-led coalition battled for years to stop the practice whereby Afghan generals requested money to pay the salaries of units that existed only on paper. As for abandoned or incomplete school-construction projects, such things are par for the course not only in Afghanistan but everywhere in South Asia. India, Nepal, and Pakistan are littered with them. You don’t read about them much because no development effort has ever been put under the kind of (mostly hostile) scrutiny that has attended America’s attempt to drag Afghanistan into the modern era. Given the general record of all development aid over the past half century and the difficulty of getting anything done in a conflict-wrecked society like Afghanistan, it may well be the case that reconstruction efforts by the U.S. military and U.S. government in Afghanistan were relatively effective and efficient.

Despite all the money that may have been wasted or stolen, there really has been an astonishing education revolution in Afghanistan that is transforming the society. It is an undeniable fact that the U.S.-led intervention in Afghanistan has enabled the education of millions of children who would never have seen the inside of a school of any kind had it not been for the overthrow of the Taliban. The World Bank and UNICEF both estimate that at least 8 million Afghans are attending school. This means that even if a quarter of the children who are nominally enrolled in school aren’t getting any education at all, there are still 6 million other kids who are; in 2001 there were fewer than 1 million children in formal education, none of them female.

To get a sense of what education can achieve in Afghanistan, even in less than ideal circumstances, you can hardly do better than to read The Last Thousand, by the journalist and teacher Jeffrey E. Stern. It tells the extraordinary story of Marefat, a school on the outskirts of Kabul. Marefat (the Dari word means “knowledge” or “awareness”) was originally founded in a hut in a refugee camp in Pakistan. After the fall of the Taliban regime in November 2001, its founder, Aziz Royesh, brought the school to Afghanistan and set it up on a windblown patch of desert west of Kabul. By 2012, Teacher Aziz, as he is known to all, had enrolled a total of 4,000 pupils and was sending students to elite universities around the world, including Tufts, Brown, and Harvard.

The school primarily caters to the Hazara ethnic minority, of which Aziz (a former mujahideen fighter) is a member. As anyone who read The Kite Runner or saw the movie made from the bestselling novel by Khaled Hosseini knows, the Hazara have long been the victims of oppression by the majority Pashtuns and Tajiks. The Hazara (who account for about 10 percent of the Afghan population) bear a double ethnic burden. They are Shiites—heretics in the eyes of the Sunni majority of the country. And they look Asiatic. Indeed, they are widely but probably wrongly believed to be the descendants of Genghis Khan’s invaders.

Hazara were traded as slaves until the early 20th century. As late as the 1970s, they were barred from government schools and jobs and banned from owning property in downtown Kabul. As if certain parallels to another oppressed minority weren’t strong enough, the Hazara are well known for their appetite for education and resented for their business success since the establishment of a democratic constitution, and they have enthusiastically worked with the international military coalition—all of which has made them particular targets of the Taliban.

 From the start, Aziz was determined to give his students an education that would inoculate them against the sectarian and ethnic extremism that had destroyed his country. He taught them to question everything and happily educated both boys and girls, separating them only when pressure from conservative politicians put the school’s survival at risk. (When fathers balked at allowing their daughters to go to school, Aziz assured them that a literate girl would be more valuable in the marriage market.) Eventually the school also found itself educating some of the illiterate parents of its students and similarly changing the lives of other adult members of the school community.

The school’s stunning success in the face of enormous obstacles won it and its brave, resourceful founder affection as well as benefactors among the “Internationals”—the foreign civilian and military community in Afghanistan. When John R. Allen, the tough U.S. Marine general in command of all international forces in Afghanistan, finished his tour in February 2013, he personally donated enough money to the school to fund 25 scholarships. Thanks to reports about the school by a British journalist, a “Marefat Dinner” at London’s Connaught Hotel co-sponsored by Moët & Chandon raised $150,000 for the school in 2011. But by early 2013, Teacher Aziz was in despair for Marefat’s future, thanks to terrorist threats against the school and President Obama’s declaration that he would pull out half of America’s forces within a year regardless of the military and political situation in the country.

It’s a fascinating story. Which makes it a shame that much of it is told in a rather self-indulgent and mannered way. Stern’s prose tends to exude a world-weary smugness that can feel unearned, especially given some shallow or ill-informed observations on subjects such as Genghis Khan, Blitzkrieg, and the effect of Vietnam on current U.S. commanders, and his apparent ignorance of the role of sexual honor in Hazara culture.

Most exasperating, Stern patronizingly assumes an unlikely ignorance on the part of the reader. There are few newspaper subscribers who, after 15 years of front-page stories from Afghanistan, have not heard of the grand assemblies known as loya jirgas, or who don’t know that Talib literally means student. Yet Stern refers to the former as “Grand Meetings” and the latter as “Knowledge Seekers.” He also has his characters refer to Internationals as “the Outsiders,” even though any Afghan you are likely to meet knows perfectly well that the foreign presence comes in different and identifiable national and organizational flavors: Americans, NATO, the UN, the Red Cross, Englistanis (British), and so on. The same shtick apparently frees Stern from the obligation to specify an actual date on which an event occurred, or the actual name of a town or province.

Even so, The Last Thousand is a powerful and important book, especially in the way Stern conveys the sense of betrayal and the terror that many Afghans feel at the prospect of international abandonment. The Hazara children and staff at the Marefat school fear a prospective entente with the Taliban enthusiastically promoted by foreign-policy “realists” in the U.S. and UK. They correctly believe it would lead to cultural concessions that could radically diminish their safety and freedom—if not a complete surrender to murderous Pashtun racism and Sunni bigotry.

The book’s main characters are concerned by what seemed to be the imminent, complete departure of all foreign forces as part of the “zero option.” This option was seriously considered by the United States in 2013 and 2014 when then–President Karzai, in the middle of a bizarre descent into (hashish-fueled) paranoia and poisonous anti-Westernism, refused to sign a bilateral security agreement with the Western powers.

Aziz confessed to Stern (who was teaching English at the school) that he himself was in despair but was trying to hide his gloom from his pupils. He began to urge his students, graduates, and protégés—especially the female ones—to be less vocal in their complaints about discrimination against Hazara, and he himself began controversially to cultivate unexpected allies such as the Pashtun presidential candidate Ashraf Ghani. But Marefat’s staff, students, and their parents had few illusions about the future. As one young girl said to Aziz: “If the Americans leave, we know there is no chance for us to continue our education.”

Although the future of Marefat and its Hazara pupils is uncertain, it is comforting that so much has already been achieved by the education revolution in Afghanistan. Assuming that the Taliban and its Pakistani government sponsors are not allowed to take over or prompt a collapse into civil war, this revolution may well have a tremendous and benign effect on the country’s future. After all, more than 70 percent of the Afghan population is under 25 and the median age is 17. Unlike their parents, these youths have grown up with television and radio (there are more than 90 TV stations and 174 FM radio stations), cellphones (there are at least 20 million mobile users), and even the Internet. Their horizons are wider than anything the leaders of the Taliban regime could even imagine.

As Stern relates in a hurried epilogue, the bilateral security agreement was finally signed in September 2014 after Karzai’s replacement by a new national unity government. There are still U.S. and other foreign troops in Afghanistan, even if not enough.

In Stern’s sympathetic portrayal of the Hazara and their predicament, it’s hard not to hear echoes of other persecuted minorities who put their trust in Western (and especially Anglo-Saxon) liberator-occupiers. The most recent example is the Montagnard hill tribes of Vietnam who fought alongside U.S. Special Forces and were brutally victimized by the victorious Stalinist regime after America pulled out of Indochina. Something similar happened to the Shan and Karen nations of Burma, who fought valiantly alongside the British during World War II but ever since have had to battle for survival against the majority Burmans who sided with the Japanese. In today’s Afghanistan, Gulbuddin Hekmatyar, the Pakistan-backed Islamist warlord, has overtly threatened the Hazara with something like the fate of the Harkis, the Algerians who fought with the French during the war of independence between 1954 and 1962: At least 150,000 of the Harkis were slaughtered with the arrival of “peace.”

 The Last Thousand should remind those who are “war-weary” in the U.S. (which really means being weary of reading about the war) that bringing the troops home is far from an unalloyed good. Having met the extraordinary Teacher Aziz and his brave staff and students through the eyes of Jeffrey Stern, and knowing the fate they could face at the hands of their enemies, one finds it hard to think of President Obama’s enthusiasm for withdrawal—an enthusiasm echoed distressingly by several candidates in the presidential race—as anything but thoughtless, heartless, trivial, and unworthy of America.


Jeremy Corbyn and the End of the West (Commentary Magazine December 2015)


The Portents of Labour’s Extreme New Leader

In October 2015, the American novelist Jonathan Franzen gave a talk in London in which he expressed pleasure that Jeremy Corbyn had just been elected leader of Britain’s opposition Labour Party. To his evident surprise, Franzen’s endorsement was met with only scattered applause and then an embarrassed silence. 

Most of Franzen’s audience were the same sort of people likely to attend a Franzen talk in New York: Upper-middle-class bien-pensant Guardian readers who revile the name Thatcher the way a New York Times home-delivery subscriber reviles the name Reagan. For them, as for most Labour members of Parliament, the elevation of Jeremy Corbyn offers little to celebrate. Indeed, it looks a lot like a disaster—a bizarre and potentially devastating epilogue to the shocking rout of the Labour Party at the May 2015 general election.

Franzen probably imagined Corbyn as a kind of British Bernie Sanders, a supposedly lovable old coot-crank leftie willing to speak truth to power—and so assumed that any British metropolitan liberal audience would be packed with his fans. In fact, for all the obvious parallels between the two men, Corbyn is a very different kind of politician working in a very different system and for very different goals. Sanders may call himself a socialist, but he is relatively mainstream next to Corbyn, an oddball and an extremist even in the eyes of many British socialists.

It may seem extraordinary that a party most observers and pollsters were sure would be brought back to power in 2015—and that has long enjoyed the unofficial support of much of the UK’s media, marketing, and arts establishments—now looks to be on the verge of disintegration. But even if no one a year ago could have predicted the takeover of the party by an uncharismatic extreme-left backbencher with a fondness for terrorists and anti-Semites, the Labour Party might well be collapsing due to economic and social changes that have exposed its own glaring internal contradictions.

The first stage of Labour’s meltdown was its unexpected defeat at the general election in May 2015. The experts and the polls had all predicted a hung Parliament and the formation of a coalition government led by Labour’s then-leader, Ed Miliband. But Labour lost 26 seats, was wiped out by nationalists in its former heartland of Scotland, and won less than 30 percent of the popular vote. The Liberal Democrats, the third party with whom Miliband had hoped to form a coalition, did far worse. Meanwhile the populist, anti-EU, anti-mass-immigration UK Independence Party (UKIP) won only one seat in the House of Commons but scored votes from some 3 million people—and took many more voters from Labour than from the Tories.

Miliband’s complacency about and ignorance of the concerns of ordinary working-class people played a major role in the defeat. So did his failure to contest the charge that Labour’s spendthrift ways under Tony Blair had made the 2008 financial crisis and recession much worse. Perhaps even more devastating was the widespread fear in England that Miliband would make a deal with Scottish nationalists that would require concessions such as getting rid of Britain’s nuclear deterrent. He had promised that he would never do this, but much of the public seemed to doubt the word of a man so ambitious to be prime minister that he had stabbed his own brother in the back. (David Miliband was set to take over the leadership of the party in 2010 when his younger brother, Ed, decided to challenge him from the left with the help of the party’s trade unionists.)

In the old industrial heartlands of the North and Midlands, Labour seemed at last to be paying a price for policies on immigration and social issues that are anathema to many in the old British working class. As a workers’ party as well as a socialist party, and one that draws on a Methodist as well as a Marxist tradition, Labour has always had to accommodate some relatively conservative, traditional, and even reactionary social and political attitudes prevalent among the working classes (among them affection for the monarchy). Today the cultural divisions within the party between middle-class activists, chattering-class liberals, ethnic-minority leaders, and the old working class can no longer be papered over.

With the ascension of Tony Blair to the leadership of the party in 1994, Labour began to pursue certain policies practically designed to alienate and drive out traditional working-class Labour voters and replace them not only with ordinary Britons who had grown tired of the nearly two-decade rule of the Tories but also with upper-middle-class opinion leaders attracted to multiculturalism and other fashionable enthusiasms.

One can even make a kind of quasi-Marxian argument that the more bourgeois the Labour Party has become over the decades, the more it has engaged in what amounts to conscious or unconscious class warfare against the working class it is supposed to represent. One of the first blows it struck was the abolition of the “grammar schools” (selective high schools similar to those of New York City) on the grounds that they were a manifestation of “elitism,” even though these schools gave millions of bright working-class children a chance to go to top universities. Then there was “slum clearance,” which resulted in the breakup and dispersal of strong working-class communities as residents were rehoused in high-rise tower blocks that might have been designed to encourage social breakdown and predation by teenage criminals. But the ultimate act of Labour anti-proletarianism came as the party was recovering from the defection of working-class voters to Thatcherism and its gospel of opportunity and aspiration. This was the opening of the UK’s borders to mass immigration on an unprecedented scale by Tony Blair’s New Labour. Arguably this represented an attempt to break the indigenous working class both economically and culturally; inevitably, it was accompanied by a demonization of the unhappy indigenous working class as xenophobic and racist.

In the 2015 general election, many classic working-class Labour voters apparently couldn’t bring themselves to betray their tribe and vote Tory—but were comfortable voting for UKIP. This proved disastrous for Labour, which had once been able to count on the support of some two-thirds of working-class voters. But these cultural changes made it impossible for Labour to hold on to its old base in the same numbers. And its new base—the “ethnic” (read: Muslim) vote, a unionized public sector that is no longer expanding, and the middle-class liberals and leftists who populate the creative industries and the universities—is simply not large enough.

Labour should have won the election in 2015; it lost because of its own internal contradictions. Out of the recriminations and chaos that followed the defeat, there emerged Jeremy Corbyn.

 

To understand who Corbyn is and what he stands for, it helps to be familiar with the fictional character Dave Spart, a signature creation of the satirical magazine Private Eye. Spart is a parody of a left-wing activist with a beard and staring eyes and a predilection for hyperbole, clueless self-pity, and Marxist jargon, which spews forth from his column, “The Alternative Eye.” (He’s like a far-left version of Ed Anger, the fictional right-wing lunatic whose column graced the pages of the Weekly World News supermarket tabloid for decades.) A typical Spart column starts with a line like “The right-wing press have utterly, totally, and predictably unleashed a barrage of sickening hypocrisy and deliberate smears against the activities of a totally peaceful group of anarchists, i.e., myself and my colleagues.”

 
The column has given birth to the term spartist—which is used in the UK to refer to a type of humorless person or argument from the extreme left. There are thousands of real-life spartists to be found in the lesser reaches of academia, in Britain’s much-reduced trade-union movement, and in the public sector. For such activists, demonstrations and protests are a kind of super hobby, almost a way of life.

The 66-year-old Corbyn is the Ur-Spartist. He has always preferred marches and protests and speeches to more practical forms of politics. He has been a member of Parliament for 32 years without ever holding any sort of post that would have moved him from the backbenches of the House of Commons to the front. During those three-plus decades, he has voted against his own party more than 500 times. Corbyn escaped being “deselected” under Tony Blair—the process by which a sitting MP can be barred by his own party from standing again for his seat—only because he was deemed harmless.

Many of Corbyn’s obsessions concern foreign policy. He is a bitter enemy of U.S. “imperialism,” a longtime champion of Third World revolutionary movements, and a sympathizer with any regime or organization, no matter how brutal or tyrannical, that claims to be battling American and Western hegemony. Corbyn was first elected to Parliament in 1983, and many of his critics in the Labour Party say he has never modified the views he picked up from his friends in the Trotskyite left as a young activist.

This is not entirely true, because Corbyn, like so much of the British left, has adapted to the post–Cold War world by embracing new enemies of the West and its values—in particular, those whom Christopher Hitchens labeled “Islamofascists.”

One of the qualities that sets spartists like Corbyn apart from their American counterparts is an almost erotic attraction to Islamism. They are fascinated rather than repelled by its call to violent jihad against the West. This is more than anti-Americanism or a desire to win support in Britain’s ghettoized Muslim communities. It is the newest expression of the cultural and national self-loathing that is such a strong characteristic of much progressive opinion in Anglo-Saxon countries—and which underlies much of the multiculturalist ideology that governs this body of opinion.

As a result, many on the British left today seem to have an astonishing ability to overlook, excuse, or even celebrate reactionary and atavistic beliefs and practices ranging from the murder of blaspheming authors to female genital mutilation. Corbyn has long been at the forefront of this tendency, not least in his capacity as longtime chair of Britain’s Stop the War Coalition. STWC is a pressure group that was founded to oppose not the war in Iraq but the war in Afghanistan. It was set up on September 21, 2001, by the Socialist Workers’ Party, with the Communist Party of Great Britain and the Muslim Association of Britain as junior partners. STWC supported the “legitimate struggle” of the Iraqi resistance to the U.S.-led coalition; declines to condemn Russian intervention in Syria and Ukraine; actively opposed the efforts of democrats, liberals, and civil-society activists against the Hussein, Assad, Gaddafi, and Iranian regimes; and has a soft spot for the Taliban.

Corbyn’s career-long anti-militarism goes well beyond the enthusiasm for unilateral nuclear disarmament that was widespread in and so damaging to the Labour Party in the 1980s, and which he still advocates today. He has called for the United Kingdom to leave NATO, argued against the admission to the alliance of Poland and the former Czechoslovakia, and more recently blamed the Ukrainian crisis on NATO provocation. In 2012, he apparently endorsed the scrapping of Britain’s armed forces in the manner of Costa Rica (which has a police force but no military).

As so often with the anti-Western left, however, Corbyn’s dislike of violence and military solutions mostly applies only to America and its allies. His pacifism—and his progressive beliefs in general—tend to evaporate when he considers a particular corner of the Middle East.

Indeed, Corbyn is an enthusiastic backer of some of the most violent, oppressive, and bigoted regimes and movements in the world. Only three weeks after an IRA bombing at the Conservative Party conference in Brighton in 1984 came close to killing Prime Minister Thatcher and wiping out her entire cabinet, Corbyn invited IRA leader Gerry Adams and two convicted terrorist bombers to the House of Commons. Neil Kinnock, then the leader of Labour and himself very much a man of the left, was appalled.

Corbyn is also an ardent supporter of the Chavistas who have wrecked Venezuela and thrown dissidents in prison. It goes almost without saying that he sees no evil in the Castro-family dictatorship in Cuba, and for a progressive he seems oddly untroubled by the reactionary attitudes of Vladimir Putin’s repressive, militarist kleptocracy in Russia.

Then we come to his relationship with Palestinian extremists and terrorists. A longtime patron of Britain’s Palestine Solidarity Campaign, Corbyn described it as his “honor and pleasure” to host “our friends” from Hamas and Hezbollah in the House of Commons. If that weren’t enough, he also invited Raed Salah to tea at the House of Commons, even though the Palestinian activist whom Corbyn called “an honored citizen…who represents his people very well” has promoted the blood libel that Jews drink the blood of non-Jewish children. These events prompted a condemnation by Sadiq Khan MP, the Labour candidate for London’s mayoralty and a Muslim of Pakistani origin, who said that Corbyn’s support for Arab extremists could fuel anti-Semitic attacks in the UK.

Nor was this an isolated lapse. As Britain’s Jewish Chronicle also pointed out this year, Corbyn attended meetings of a pro-Palestinian organization called Deir Yassin Remembered, a group run by the notorious Holocaust denier Paul Eisen. Corbyn is also a public supporter of the Reverend Stephen Sizer, a Church of England vicar notorious for promoting material on social media suggesting 9/11 was a Jewish plot.

Corbyn’s defense has been to say that he meets a lot of people who are concerned about the Middle East, but that doesn’t mean he agrees with their views. The obvious flaw in this dishonest argument is that Corbyn doesn’t make a habit of meeting either pro-Zionists or the Arab dissidents or Muslim liberals who are fighting against tyranny, terrorism, misogyny, and cruelty. And it was all too telling when, in an effort to clear the air, Corbyn addressed the Labour Friends of Israel without ever using the word Israel. It may not be the case that Corbyn himself is an anti-Semite—of course he denies being one—but he is certainly comfortable spending lots of quality time with those who are.

How could such a person become the leader of one of the world’s most august political parties? It took a set of peculiar circumstances. In the first place, he received the requisite number of nominations from his fellow MPs to stand for leader after the resignation of Ed Miliband only because some foolish centrists thought his inclusion in the contest would “broaden the debate” and make it more interesting. They had not thought through the implications of a new election system that Miliband had put in place. An experiment in direct democracy, the new system shifted power from the MPs to the members in the country.

The party’s membership had shrunk over the years (as has that of the Tory Party), and so to boost its numbers, Miliband and his people decided to shift to a system in which new members could obtain a temporary membership in the party and take part in the vote for only £3 ($5). More than 100,000 did so. They included thousands of hard-left radicals who regard the Labour Party as a pro-capitalist sell-out. (They also included some Tories, encouraged by columnists like the Telegraph’s Toby Young, who urged his readers to vote for Corbyn in order to make Labour unelectable.) The result was a landslide for Corbyn.

Labour’s leadership was outplayed. The failure was in part generational. There is hardly anyone left in Labour who took part in or even remembers the bitter internal struggle in the late ’40s to find and exclude Communist and pro-Soviet infiltrators—one of the last great Labour anti-Communists, Denis Healey, died this October. (This was so successful that the British Trotskyite movement largely abandoned any attempt to gain power in Westminster, choosing instead to focus on infiltrating the education system in order to change the entire culture.) By the time Corbyn took over, most of Labour’s “modernizers”—those who had participated in the takeover of the party leadership by Tony Blair and his rival and successor Gordon Brown—had never encountered real Stalinists or Trotskyists and lacked the fortitude and ruthless skill to do battle with them.

Unfortunately for the centrists and modernizers, many of Corbyn’s people received their political education in extreme-left political circles, so brutal internal politics and fondness for purges and excommunications are (as Eliza Doolittle said) “mother’s milk” to them. For example: Corbyn’s right-hand men, John McDonnell and Ken Livingstone, were closely linked to a Trotskyite group called the Workers Revolutionary Party. The WRP was a deeply sinister political cult that included among its promoters not only the radical actors Vanessa and Corin Redgrave but also the directors of Britain’s National Theatre. Its creepy leader Gerry Healy was notorious for beating and raping female members of his party and took money from Muammar Gaddafi and Saddam Hussein.

Most people in British politics, and especially most British liberals, had fallen prey to the comforting delusion that the far left had disappeared—or that what remained of it was simply a grumpy element of Labour’s base rather than a devoted and deadly enemy of the center-left looking for an opportunity to go to war. As Nick Cohen, the author of What’s Left: How the Left Lost Its Way, has pointed out, this complacent assumption enabled the centrists to act as if they had no enemies to the left. Now they know otherwise.

Another reason for the seemingly irresistible rise of Corbyn and his comrades is what you might call Blair Derangement Syndrome. It is hard for Americans and other foreigners to understand what a toxic figure the former prime minister has become in his own country. Not only is he execrated in the UK more than George W. Bush is in the U.S.; he is especially hated by his own party and on the left generally. It is a hatred that is unreasoning and fervid in almost exact proportion to the adoration he once enjoyed, and it feels like the kind of loathing that grows out of betrayed love. Those in the Labour Party who can’t stand Blair have accordingly rejected many if not all of the changes he wrought and the positions he took. And so, having eschewed Blairism, they were surprised when they lost two elections in a row to David Cameron—who, though a Tory, is basically Blair’s heir.

Blair is detested not because he has used his time after leaving office to pursue wealth and glamour and has become a kind of fixer for corrupt Central Asian tyrants and other unsavory characters. Rather, it is because he managed to win three general elections in a row by moving his party to the center. Those victories and 12 years in office forced the left to embrace the compromises of governance without having much to show for it. This, more than Blair’s enthusiasm for liberal interventionism or his role in the Iraq war or even his unwavering support of Israel during the 2008 Gaza war, drove the party first to select the more leftist of the two Miliband brothers and then to hand the reins to Corbyn.

As I write, Corbyn has been Leader of Her Majesty’s loyal opposition (a position with no equivalent in the United States) for a mere 10 weeks—and those 10 weeks have been disastrous in terms of both the polls and party unity. Corbyn’s own front bench has been on the verge of rebellion. Before the vote on the UK’s joining the air campaign in Syria, some senior members apparently threatened to resign from their shadow cabinet positions unless Corbyn moderated his staunch opposition to any British military action against ISIS in Syria. (It worked: Rather than face open revolt, Corbyn allowed a free vote instead of a “whipped” one, and 66 Labour MPs proceeded to vote for air strikes.)

Any notion that Corbyn’s elevation would prompt him to moderate his views quickly dissipated once he began recruiting his team. His shadow chancellor, John McDonnell, is one of the few people in Parliament as extreme as he is. While serving as a London councillor in the 1980s, McDonnell lambasted Neil Kinnock, the relatively hard-left Labour leader defeated by Margaret Thatcher, as a “scab.” A fervent supporter of the IRA during the Northern Ireland troubles, McDonnell endorsed “the ballot, the bullet, and the bomb” and once half-joked that any MP who refused to meet with the “Provisionals” running the terror war against Great Britain should be “kneecapped” (the traditional Provo punishment involving the shattering of someone’s knee with a shotgun blast). Recently he made the headlines by waving a copy of Mao’s Little Red Book at George Osborne, the Chancellor of the Exchequer. As Nick Cohen has written of Corbyn and his circle: “These are not decent, well-meaning men who want to take Labour back to its roots…they are genuine extremists from a foul tradition, which has never before played a significant role in Labour Party history.”
 
During Corbyn’s first week as leader, he refused to sing the national anthem at a service commemorating the Battle of Britain, presumably because, as a diehard anti-monarchist, he disagrees with the lyric “God save our Queen.” Soon after, he declared that, as a staunch opponent of Britain’s nuclear arsenal, he would not push the button even if the country were attacked.

He expressed unease at the assassination by drone strike of the infamous British ISIS terrorist “Jihadi John.” Corbyn said it would have been “far better” had the beheader been arrested and tried in court. (He did not say how he envisaged Jihadi John ever being subject to arrest, let alone concede that such a thing could happen only due to military action against ISIS, which he opposes).

Corbyn’s reaction to the Paris attacks prompted fury from the right and despair in his own party. He seemed oddly unmoved and certainly not provoked to any sort of anger by the horror. Indeed, he lost his chance to score some easy points against Prime Minister Cameron’s posturing. Cameron, trying to play tough in the wake of military and policing cuts, announced that British security forces would now “shoot to kill” in the event of a terrorist attack in the UK—as if the normal procedure would be to shoot to wound. Any normal Labour leader of the last seven decades would have taken the prime minister to task for empty rhetoric while reminding the public of Labour’s traditional hard stance against terrorism in Northern Ireland and elsewhere. Instead, Corbyn bleated that he was “not happy” with a shoot-to-kill policy. It was “quite dangerous,” he declared. “And I think can often be counterproductive.”

While there is no question that Labour has suffered a titanic meltdown, and that Corbyn’s triumph may mean the end of Labour as we know it, it’s not yet clear whether Corbyn is truly as electorally toxic as the mainstream media and political class believe him to be. What some observers within Labour fear is that Corbyn could indeed become prime minister after having transformed the party into a very different organization and having shifted the balance of British politics far to the left.

They concede that there is little chance of Corbyn’s ever winning over the 2–3 million swing voters of “middle England” who have decided recent elections. But they worry that in a rerun of the leadership election, Corbyn might be able to recruit a million or more new, young voters who have no memory of the Cold War, let alone Labour’s failures in the 1970s, and who think that he is offering something fresh and new.

It might not only be naive young people who would vote for Corbyn despite his apparent lack of parliamentary or leadership skills. In Britain, there is a growing disdain for, and distrust of, slick professional politicians—and for good reason. It’s not hard to seem sincere or refreshingly possessed of genuine political convictions if you’re going up against someone like David Cameron, who even more than Tony Blair can exude cynicism, smugness, and a branding executive’s patronizing contempt for the public. The fact that Corbyn is relatively old and unglamorous might also play in his favor; the British public is tired of glib, photogenic, boyish men. Corbyn and McDonnell are “an authentic alternative to the focus-group-obsessed poll-driven policies of the Blair days,” Cohen writes—but it is an authenticity based in “authentic far-left prejudices and hypocrisies.” Those prejudices and hypocrisies could sound a death knell for Britain’s historic role in advancing the Western idea—an idea that is, in large measure, this country’s greatest achievement.

https://www.commentarymagazine.com/articles/jeremy-corbyn-end-west/

In Britain and Across the World, An Age-Old Schism Becomes Ever More Bitter - Sunday Times 3 Jan 2016

Comment/OpEd Comments Off on In Britain and Across the World, An Age-Old Schism Becomes Ever More Bitter – Sunday Times 3 Jan 2016

http://www.thesundaytimes.co.uk/sto/news/focus/article1652282.ece

Original Version:

For most Westerners, the Shia-Sunni conflict has been a confusing but distant phenomenon that rumbles along in the background of Middle Eastern and South Asian politics: an obscure theological dispute within Islam that only makes headlines when a Shia mosque is destroyed in Pakistan or Hezbollah blows up a Sunni leader in Lebanon. But the rumble has been getting louder, and yesterday’s execution of the prominent Shia cleric Sheikh Nimr al-Nimr by the Saudi authorities could well turn it into a roar that will echo throughout the Middle East and beyond.

Since the souring of the Arab Spring, and especially since the beginning of the civil war in Syria, outsiders have become more aware that this ancient sectarian division is reflected in the struggle between two bitterly opposed power blocs in the region: a conservative Sunni one led by the Saudis and a Shia one led and inspired by Iran. Their various proxy militias are fighting each other not just in Iraq and Syria but also in Lebanon and Yemen. As bad as that may seem, this war may well be about to expand to include divided countries like Bahrain and even the Saudi kingdom itself. And it is far from unlikely that the sectarian struggle could spread much further, even into our own cities.

Having misunderstood and even fostered Sunni-Shia tensions in the past, Western countries have tended to underestimate the importance of the Sunni-Shia divide in recent decades. You could see this in the poor planning for the Iraq war and in the fact that many Western media organizations covering that war initially had no idea if their translators and fixers were members of the Sunni minority and therefore likely to be supporters of the Saddam regime.

Sunnis of course make up the vast majority of Muslims around the world, and most of them are willing to live alongside Shia even if they don’t like or respect their beliefs. But hardline Sunnis and Salafists refer to Shia Muslims as Rafidah, a strongly pejorative term which roughly translates as “the rejectors” (a reference to the Shiite rejection of the first Caliphs in favour of Ali, the prophet Muhammad’s cousin and son-in-law), and see them as polytheists. In this view, the Shia are apostates, worse and more deserving of death even than Jews and Christians.

You can sense this even in the UK. Most British Muslims are Sunni. The concern expressed by British Muslims about the killing of the faithful in wars abroad never extended to the tens of thousands of Shia civilians slaughtered in mosques and market places during the Iraq war. Nor have there ever been any demonstrations against the large-scale killing of Shia Hazaras by the Taliban, or against the murderous attacks on Shiite places of worship in Pakistan.

One line of Salafist thought sees the Shia as a fifth column set up to destroy Islam (by the Jews, of course), and blames Shiite traitors for every Muslim defeat from the Crusades onwards. Hardline Shia are equally hostile to the Sunnis but have rarely been in a position to persecute them.

The mutual suspicion runs deeper than a theological dispute. It is political in that Sunni rulers fear that Shia minorities (or majorities in the case of Bahrain) may be more loyal to Tehran than to the countries they live in. But it can also take bizarre forms: In Lebanon, both Sunni and Shia believe that the other community is prone to disgusting sexual immorality, and there are said to be some Sunnis who think that Shiites have little tails.

In any case it is often hard for non-Muslim Westerners to get a sense of the depth and intensity of Sunni-Shia hostility or to understand when that hostility is likely to overwhelm or be overwhelmed by other political or ethnic concerns.

Iraq’s Sunni Kurds were long happy to ally with the country’s Shiite Arabs against the ruling Sunni Arab minority. In Gaza Hamas is willing to accept support from Shia Iran while Islamic Jihad is not. On the other hand, senior Salafist clerics in Saudi Arabia celebrated Israel’s recent killing of a Hezbollah leader.

Recent developments have made Sunni-Shia hostility more lethal and more dangerous. One is the massive global missionary effort funded by Salafist and Wahhabi princes in Saudi Arabia and other parts of the Gulf. Another is the growing power and aggression of Iran, its remarkably successful drive to establish a “Shia Crescent” from Iran through Iraq into Syria and Lebanon. A third is the weakening of forces that used to dilute Sunni-Shia hostility, such as secular Arab nationalism and Pan-Arabism.

A fourth is the sheer ferocity of the fighting in Syria. A leading Saudi cleric, Mohammad al-Barrak, recently tweeted that Shiites are more harmful to the faithful than the Jews “because [the Shiite’s] crimes in four years have exceeded all the Jews’ crimes in 60 years”.

How bad could things get? Both the Saudi and Bahraini monarchies could face genuine uprisings. Lebanon is already on the brink of another civil war. But even more frightening perhaps is the prospect of attacks on Shia targets in the many countries and cities around the world – including in Europe – where there are Shia minorities surrounded by Sunni majorities. These could and probably would lead to reprisal terror attacks, most likely by Hezbollah and Iran’s Revolutionary Guards, both of whom have carried out operations as far away from the Middle East as Argentina. Much depends on whether the Saudi monarchy can appease or control its own furious Shiite minority, and whether calmer heads will prevail in both communities around the world.

Does US Foreign Aid Really Do Good (Washington Examiner Magazine 09/27/15)

Aid/NGOs/Philanthropy, Latest Articles Comments Off on Does US Foreign Aid Really Do Good (Washington Examiner Magazine 09/27/15)

About a decade ago, a five-car convoy of Toyota Land Cruisers pulled up in a cloud of dust at a remote village on the edge of a South Asian mountain range. The passengers, all of them Westerners apart from an interpreter, walked over to where a canopy had been set up by an advance team the previous day. About 25 villagers were already there, enticed by free cookies and snacks.

One of the new arrivals gave a quick talk that was translated into a local language and then the others handed leaflets to the villagers. Then the foreigners climbed into their Land Cruisers and raced back to the relative safety and comfort of the capital. The leaflets concerned a micro-finance scheme, and the men and women handing them out were part of a project sponsored by the World Bank.

Not a single person in the village ever read the leaflets for the simple reason that no one in the village could read, a problem that had apparently not occurred to the people running the project. Nevertheless, the forms sent to Washington would, no doubt, confirm that outreach had taken place, that awareness had been raised, and that a key step had been taken in the process of helping members of an impoverished community help themselves.

This expedition took place in Afghanistan, but it could have been in any of a dozen heavily aided countries. While it would be an exaggeration to say that no local person benefited from this particular project (after all, its foreign and local employees probably contributed a good deal to the capital’s economy), its wastefulness was arguably a betrayal both of the taxpayers who funded it and of its purported beneficiaries.

If that weren’t bad enough, even if this particular project had been better conceived and executed, and awareness really had been raised, it probably wouldn’t have done much good. That is because micro-finance, celebrated as a development panacea, simply doesn’t work in certain cultures. It can be successful in quasi-matriarchal societies such as Bangladesh, where it was invented, but it has abjectly failed in violently macho cultures like those of Rajasthan or Pashtun Afghanistan.

The point of this story isn’t to imply that all aid is a similarly arrogant waste of effort and money, but to serve as a counter-anecdote: a reminder that real aid requires more than just good intentions, and a snapshot of the realities that all too often lie behind the heartwarming imagery and simplistic appeals to compassion used by aid advocates when selling the work of their vast global industry to the public.
 

To be fair, the aid industry has in the last couple of decades come to acknowledge that good intentions are not enough. Hence the conferences and academic papers on “aid effectiveness,” the shift to “evidence-based aid” and the increasingly rigorous efforts to understand what programs work with real people in specific cultures. Critics, skeptics and disillusioned practitioners such as William Easterly, an economics professor at New York University, are now given a grudging hearing rather than ignored or dismissed as apostles of heartlessness.

That is not quite the same as conceding that seven decades and trillions of dollars in development aid have had remarkably disappointing results, in stark contrast to the Marshall Plan that was its original inspiration. And you will rarely encounter any acknowledgement that those countries that have emerged from long-endured poverty and underdevelopment, for instance South Korea after the 1960s, or some of today’s booming African economies, have done so for reasons unconnected with aid.

Another awkward fact is that many of the attempts to bring accountability, transparency and value for money to the enterprise of development aid have actually made it less rather than more effective. U.S. aid efforts are especially compromised by oversight requirements that would be comical if they didn’t do such a disservice to both the taxpayer and the theoretical beneficiaries of aid.

USAID in particular is notorious for an obsession with “metrics” strongly reminiscent of the McNamara approach to “winning” the Vietnam war; an approach that inevitably prompts managers to favor projects that produce crunchable data, no matter how useless those projects might otherwise be.

Moreover, as the anecdote above should suggest, a great deal of aid data isn’t worth the time it took to input into a spreadsheet. The more impoverished, chaotic and badly governed an aid-receiving country is, the less you can or should rely on official data or even aid agency estimates of its birth rates, population, mortality, literacy, family size, or income. Most statistics from basket-case countries, those in which it is too difficult or dangerous for researchers and officials to visit villages far from the capital, are a combination of guesswork and garbage.

No one knows, for example, how many people live in countries like Afghanistan that have not had a census in decades, let alone how much they live on or how long they live. Often, statistics from even the largest and best-funded aid organizations are based on marketing needs rather than rigorous research. For instance, last year a U.N. agency claimed that malnutrition has gotten worse in Afghanistan since the overthrow of the Taliban, even though it’s almost impossible to know with any degree of accuracy how good or bad malnutrition was anywhere in the country in 2001 or how bad it is in large swaths of the country today.

In general, those who market development or humanitarian assistance to the public are still unwilling to admit that delivering effective aid is difficult in the best of circumstances, and even harder in the ill-governed, chaotic, impoverished societies where it seems most needed. They are even less likely to confront the reality that foreign aid all too often does actual harm.

This awkward fact is true of both development aid and humanitarian or emergency aid. The former accounts for more than 85 percent of American foreign aid even though it’s less visible to and much less understood by the public. And, if you take seriously the criticism coming from a growing number of African dissidents, activists and intellectuals, it has contributed massively to the corruption, misgovernment and tyranny that has kept their nations mired in misery.

Even before people such as Zambia’s Dambisa Moyo, Uganda’s Andrew Mwenda and Ghana’s George Ayittey became a public relations nightmare for the aid industry, some economists had noted a correlation between being aided on a huge scale and subsequently experiencing economic, political and social catastrophe.

It was after the great increase in aid to sub-Saharan Africa that began in 1970 that per capita income dropped and many African countries endured negative growth. Other circumstantial evidence for aid as a corrosive force includes the fact that the countries that have received the most non-military foreign aid in the last six decades have a disproportionate tendency to collapse into anarchy: Besides Afghanistan and Iraq, the countries that have received the most aid per capita include Somalia, pre-earthquake Haiti, Liberia, Nepal, Zaire and the Palestinian territories.

It’s almost as hard to measure the alleged harm inflicted by aid as it is to find reliable and truly relevant metrics for aid success. On the other hand, the evident failure of many heavily aided societies speaks volumes.

How aid feeds corruption

As Mwenda said, having such a huge source of unearned revenue allows the government to avoid accountability to the citizenry. This is true of his own Uganda, where foreign aid accounts for 50 percent of the government’s budget. There, President Yoweri Museveni, once hailed as a model of modern, democratic African leadership, has responded to the generosity of the rich world not by pursuing the U.N. Millennium Development Goals, but instead by purchasing top-of-the-line Russian Su-30 warplanes for his air force and a Gulfstream private jet for himself.

Nor is it just in Uganda that foreign aid actually seems to discourage what donors would regard as good behavior. A recent study in the Lancet, the British medical journal, showed that aid funding earmarked to supplement healthcare budgets in Africa invariably prompted recipient governments to decrease their own contributions.

It also enables such governments to avoid or postpone necessary reforms, such as the establishment of a working tax system. In Pakistan, for example, a country with a significant middle class as well as a wealthy ruling elite, less than 1 percent of the population pays income taxes. Because states with little or no income from taxation cannot afford to pay decent salaries, this makes large-scale official corruption and extortion all but inevitable.

Aid feeds corruption in other ways as well. This is partly because large-scale, state-to-state aid has the same economic and political effects as the discovery of a natural resource like oil. But it is also because so many aid agencies will do almost anything to ensure that their good works can continue.

This is especially true in humanitarian intervention. Disasters such as earthquakes and tsunamis can be tremendous windfalls for ruthless officials in places such as Sri Lanka, India and Pakistan. The agencies’ people know that if they want to help the poor and vulnerable, they will have to pay bribes to government officials. And the government officials know that the agencies they are extorting would rather pay up than close their offices and pull out.

Large inflows of development aid also seem to encourage political instability. This makes sense to the extent that once the state becomes the sole source of wealth and leverage, getting control of it for one’s own party or tribe becomes all the more important, certainly worth cheating, fighting and killing to secure.

Aid can also encourage a dependency that is not just morally problematic, but also dangerous. Food aid is particularly destructive. When foreign aid agencies hand out grain, it bankrupts local farmers or at least discourages them from sowing next year’s crops, all but guaranteeing future shortages.

Afghan people carry their ration of American wheat from the United Nations High Commission for Refugees (UNHCR) food distribution center in Kabul, Afghanistan, Tuesday, July 9, 2002. (AP Photo/Sergei Grits)

At the same time, governments that ought to be preparing for the next famine don’t bother because they assume that the foreigners will deal with the problem. The United States is by far the worst offender in this regard. Its food aid programs are now, and have always been, little more than a corporate welfare program for American agribusiness, boosting the bottom line of companies such as Cargill while wreaking deadly havoc abroad.

On the other hand, the United States has pioneered aid to encourage the civil society organizations that are essential checks on “poor governance,” that is, irresponsible and corrupt government officials. Unfortunately, such efforts are often undermined by other forms of aid such as budget support. After all, it’s deeply discouraging for third world anti-corruption campaigners, civil society organizations and political dissidents when they see foreign aid agencies talk about the importance of good governance, democracy and human rights while handing over yet more money to tyrants and kleptocrats.

One of the less dramatic but no less damaging side effects of humanitarian aid is the distortion of the local economy when aid agencies arrive to set up refugee camps or hand out emergency rations. Not only do prices go up for everything from water to fuel, but professionals abandon their jobs to work as interpreters and drivers. The standard aid agency/media salary of $100 per day can be more than a doctor makes in a month.

Then there are the “taxes” that the agencies routinely pay local warlords or garrison commanders to secure permission to operate or in return for “security” in dangerous regions. These payments sometimes take the form of food, radios or even vehicles. As a result, the armed payee is not only wealthier, and better able to continue the fight against his rivals, he also gains vital prestige; local people see that foreigners pay court to him.

Sometimes agencies go further and allow militias or equally vicious army units or oppressive political parties to control who gets food and water. This notoriously happened in the Hutu refugee camps in Goma and happens today in parts of Ethiopia.

Some moral compromise is inevitable in the grueling, dangerous business of emergency aid. But again and again, as critics Linda Polman, David Rieff and Michael Maren have shown, aid agencies have followed the path of “Apocalypse Now’s” Col. Kurtz in pursuit of their ideals. They have become the enablers and accomplices of murderous militias and brutal regimes, have prolonged wars, and have even collaborated in forced relocations. The refugee camps they operate have become sanctuaries for terrorists and rear bases for guerrilla armies.

The most infamous example of this was the aid complex that grew up in Goma, in what is now eastern Congo but was then Zaire, in the wake of the Rwandan genocide. There, as chronicled by Linda Polman in her devastating book War Games, the world’s aid agencies and NGOs competed fiercely to help the Hutu Power genocidaires who had fled Rwanda with their families.

As so often happens, they ran the refugee camps, taxing the population, taking vehicles and equipment when they needed it, and controlling the supply of food to civilians so as to favor their members. Even worse, they used the Goma refugee camps as bases for murderous raids into Rwanda. The massacres they carried out stopped only after the army of the new Rwandan government crossed the border and overran the camps.

As Rieff pointed out, an analogous situation would have arisen if, at the end of World War II, an SS brigade had fled the death camps it was administering, taken refuge along with its families in Switzerland, and then, fed by aid workers, raided into Germany in an effort to kill yet more Jews.

There are many other examples of conflict being fomented and prolonged by those housing and aiding refugees, accidentally or deliberately. Refugee warriors, as some have called them, operating from the sanctuary of camps established by the United Nations High Commissioner for Refugees and others, have created mayhem everywhere from the Thai-Cambodia border to Central America and the Middle East.

Sometimes aid agencies have allowed this to happen as a result of ignorance. Sometimes it’s a matter of Red-Cross-style humanitarian ideology taken to the edge: a conviction that even the guilty need to be fed or a belief that providing security in refugee camps would be an abandonment of neutrality. And sometimes it’s because those providing aid are supporting one side in a conflict, as the U.S. and other Western countries did from Pakistan during the Soviet-Afghan war.

For decades, Syria, Lebanon and Jordan allowed or encouraged Palestinian refugee camps to become bases for guerrilla and terrorist activity. This should make it clear that the aid world’s traditional ways of dealing with refugee flows are inadequate. Even purely civilian camps such as Zaatari, the sea of tented misery in Jordan that houses a million Syrians, quickly became hotbeds of radicalism and sinkholes of crime and violence, not least because they are unpoliced and because they are filled with working age men with nothing to do.

The American way of aid

A wounded Laotian soldier (wounded in action at Long Cheng mid February) walks to his cot at the USAID managed and financed hospital at Ban Xon in February 1971, during the Nixon administration. (AP Photo/Horst Faas)

American foreign assistance is carried out by a number of different agencies. USAID, founded in 1961, continues to be the largest and most important. Its priorities have shifted over the years.

During the Kennedy and Johnson administrations, USAID emphasized the development of infrastructure and embarked on large-scale projects modeled on the Tennessee Valley Authority. President Nixon took American aid in a different direction, working with Hubert Humphrey to pass the so-called “New Directions” legislation that prompted a new emphasis on health, education and rural development. It wasn’t until the Reagan administration that USAID began to emphasize democracy and governance.

The next revolution in American foreign aid took place during the administration of George W. Bush, who almost tripled USAID’s budget. The Bush administration also began two huge aid initiatives outside USAID: PEPFAR, America’s HIV/AIDS program, which has been a great success, and the Millennium Challenge Corp. But the most radical Bush administration shift in aid policy may have been its increase in aid to Africa. Among other initiatives, Bush more than quadrupled funding for education on the continent. It was subsequently cut by the Obama administration.

Although aid is traditionally divided into two main types, development aid and humanitarian aid, one can also categorize American aid in terms of the places it’s sent, bearing in mind that some amount of foreign assistance goes to 100 countries.

There is aid given to genuinely poor countries in an honest effort to help needy people there. Then there is aid to relatively wealthy states whose elites are too irresponsible to take care of their own people. A good example is the aid the United States sends to India, a country that can afford to send rockets to Mars and which has its own growing aid program, but whose ruling elite is content to tolerate rates of malnutrition, illiteracy and curable disease that are worse than those of sub-Saharan Africa.

A third category is aid given as a foreign policy bribe. This is not the same thing as aid used as a tool of public diplomacy, because its target is a foreign country’s ruling elite. The most obvious examples are Egypt and Pakistan. America gives Egypt money and in return the enriched Egyptian military, with its prestigious American weaponry, promises not to attack Israel. U.S. aid to Egypt has preserved peace, but it has not been successful in its secondary purpose of promoting economic development and political stability.

Aid to Pakistan is arguably less successful. Its purpose was to persuade the Pakistani military and intelligence establishment to reduce its sponsorship of Islamist terrorism in the region and in particular its murderous efforts to destabilize the U.S.-supported government in Afghanistan in favor of its Taliban clients.

A fourth category is aid given as part of what was called the war on terrorism, which has been dominated by reconstruction efforts in Iraq and Afghanistan.

A fifth, linked category is aid used for the purpose of public diplomacy. This has become increasingly controversial in the aid community.

Controversies about the utility and effectiveness of aid do not necessarily break down along conventional Left vs. Right ideological lines. Interestingly, people who identify with the Left rather than the Right have recently argued that foreign aid does not win friends for America and should not be seen as a useful tool of public diplomacy.

They often refer to Pakistan and a study that showed that American humanitarian aid after the 2005 Kashmir earthquake did not have a lasting positive effect on Pakistani attitudes toward the United States. This is a problematic argument, not least because Pakistan is a special case. It is a heavily aided country in which key state actors foster anti-Americanism and have done so for a long time. An American rescue effort in one corner of the country was never likely to win over the population, especially as the state played down that effort in order to make its own efforts seem less feeble.

Moreover, those who insist that aid does not win friends abroad or influence foreign populations may have philosophical and ideological reasons for taking such a view. Many in the aid industry prefer to see aid as something that should be given without regard to any benefit to the donor country, other than that feeling of having done the right thing that comes from an altruistic act. Others are politically hostile to efforts by Western governments to win hearts and minds as part of the war on terrorism.

My experiences in aided countries in Africa and South Asia tend to contradict this argument. In Somalia, for example, the vital but decayed highway between the capital and the coast is still referred to affectionately as “the Chinese Road” some three decades after it was built.

It’s also no secret that in many parts of Africa, you encounter positive attitudes to contemporary China thanks to more recent infrastructure projects, despite the abuses and corruption that so often accompany Chinese economic activity.

A general view of a refugee camp is seen in the city of Kabul, Afghanistan, Thursday, June 12, 2008. (AP Photo/Musadeq Sadeq)

In Afghanistan’s Panjsher valley, any local will tell you how grateful the people are for the bridges built by the U.S. Provincial Reconstruction Team before it closed. This is a localized response to a local benefit. It would be naive, though not unusual on the part of U.S. officials, to expect people in other Afghan localities to be grateful for help given to their fellow countrymen.

At the same time, there seems to be evidence that bad aid makes things worse. That could take the shape of shoddy or failed projects, projects that employ disliked outsiders, and programs that everyone knows have been commandeered or ripped off by corrupt local officials.

It is probably fair to say that effective foreign aid can win friends for America, but mainly on a local basis and only if it reflects genuine local needs and preferences, and if the beneficiary population is not already steeped in anti-American prejudice.

Aid and Afghanistan

Anyone who follows media reports about the American-led reconstruction effort in Afghanistan — the greatest aid effort since the Marshall Plan — could be forgiven for thinking it has been a total disaster. But anyone who has spent time there and seen how much has changed since 2001 knows that this is nonsense. The millions of girls in school, the physical and economic transformation of Kabul and other cities, the smooth highways that make commerce possible are only the most obvious manifestations of success. At the same time, the waste, theft, corruption and incompetence are at least as spectacular as these achievements.

It is not clear that Afghanistan is a radically more corrupt society than other countries that have been the target of major aid efforts, that its ruling elite is uniquely irresponsible, cynical and self-interested, or that foreign government agencies and NGOs working there have been especially naive and incompetent.

But it’s important to remember that aid to Afghanistan is not just on a uniquely large scale, offering vast opportunities for theft, misdirection and waste. It’s also much more closely scrutinized than any other aid effort in history.

Afghan police officers drag a sack full of blankets. Soldiers were meeting with village representatives to assess their needs, provide humanitarian aid assistance and to gain intelligence about the region. (AP Photo/Rafiq Maqbool)

Nothing like the same level of skeptical attention has ever been paid by media organizations to development or humanitarian aid programs in sub-Saharan Africa or South Asia. Nor has there been an equivalent of the Special Inspector General for Afghanistan Reconstruction turning a jaundiced eye on big, notoriously inefficient U.N. agencies such as UNHCR, or the efforts of giant nonprofits such as Oxfam and Save the Children.

On the other hand, Afghanistan may well be uniquely infertile ground for development aid, thanks to decades of brutalizing war, a historically feeble state whose primary function has been preying on those who lack access to effective armed force and a traditional political culture in which no one expects government officials to be better than licensed bandits.

Much of the controversy that has accompanied the aid effort in Afghanistan has involved criticism of work by the Defense Department and the military.

No one who has seen how weapons systems are procured for the U.S. military would be surprised if some Defense Department-funded aid projects in Afghanistan turned out to be wasteful, ill-considered and poorly administered. But whether they are that much worse than efforts funded by other government departments such as the State Department or USAID is another question. That they have tended to attract particular opprobrium from news media and the special IG could simply reflect institutional dislike of the military or opposition to the Afghan war.

There is evidence that in many places the military did a better job of providing aid than USAID and the rest of the aid establishment. This was partly because the military wasn’t hamstrung by security concerns; unlike USAID, its employees were willing and able to go anywhere in the country. Local commanders with the ability to hand out funds may have lacked development experience, but they were there where help was needed and, unlike many aid professionals, saw no shame in asking locals what assistance they wanted.

USAID’s bureaucratic, box-ticking approach was arguably unsuitable for a country as damaged, impoverished, misgoverned, traumatized and dysfunctional as Afghanistan. Where the military decided to build schools, it did so quickly and efficiently, assuming that American or Afghan aid agencies would then find teachers, buy schoolbooks and make the projects sustainable.

USAID, by contrast, was required to get the relevant permissions from the ministry of education in Kabul and then provincial ministries, both of which were incompetent and corrupt, and was so slow in the execution of its mandates that its tardiness threatened to undermine the war for hearts and minds.

The Obama administration and aid

Despite what one might have expected from the candidate’s internationalist rhetoric, foreign aid was far from a priority for the first Obama administration. Key positions such as the head of the Office of U.S. Foreign Disaster Assistance went unfilled for an unconscionably long time, and overall aid spending fell. To the extent that the administration paid attention to foreign aid, its primary concern seems to have been to reverse or undo the priorities of the Bush administration.

Accordingly, efforts to promote democracy and civil society in third world countries were defunded. Countries that had been given more aid as an apparent reward for joining the international coalition in Iraq were now penalized for the same reason.

The second Obama administration has seen a relative normalization of aid policy and an increase in overall aid spending. Although democracy promotion is not the priority it was during the Bush years, it is still a sufficiently important part of U.S. foreign aid to cause USAID to be expelled from Bolivia and Ecuador. In both cases, USAID was targeted by authoritarian left-wing governments for supporting the kind of civil society organizations that can make a genuine difference to bad governance in poor countries.

But normalization is not necessarily a good thing. It means that the United States is still committed to the U.N.’s absurd Millennium Development Goals, a vast utopian list of targets whose realization would, as Rieff has put it, amount to “quite literally, the salvation of humanity,” and was always hopelessly unrealistic.

It also means that those guiding America’s aid efforts continue to be naively enthusiastic about cooperation with big business and to put excessive faith in the potential of high technology to solve third world problems. It has become increasingly clear that the Gates Foundation and other new philanthropic giants are influencing the overall direction of U.S. development aid in undesirable ways.

In particular, it has meant a heightened, even feverish emphasis on technological solutions for development problems, as if cheap laptops or genetically modified crops might really be the magic bullet that “ends poverty.” As David Rieff has pointed out, such “techno-messianism” has often failed in the past. If you have a high-tech cyberhammer, all problems start to look like nails.

But reducing poverty, promoting economic growth and rescuing failing states are — this is the real lesson of six decades of development aid — extremely difficult and complicated things to achieve. One gain can lead to new problems in the way that lowering infant mortality may have contributed to overpopulation and therefore malnutrition and even starvation in some African societies.

The challenge presented by the influence of Gates and other tech billionaires on American government aid policy is not just a matter of a techno-fetishism even more intense than that of the rest of American society. It’s also a matter of priorities, of which problems get the most attention. It may be that the only thing worse than aid directed by ignorant box-ticking bureaucrats or by self-serving aid industry ideologues, is aid directed by the spouses of Silicon Valley billionaires.

The way forward

Foreign aid has so far not been a key topic for presidential candidates. Those who have said anything on the subject have tended to be relatively uncontroversial.

Hillary Clinton is especially keen on aid that benefits women and girls. Jeb Bush believes that aid is a vital instrument of U.S. foreign policy and approves of the administration’s aid boost to Central America. Marco Rubio is, perhaps surprisingly, a stalwart advocate of foreign aid, though not, of course, to Cuba while it remains in the Castro family’s grasp. His fellow Republicans Chris Christie and Mike Huckabee also see aid as key to America’s moral authority, though the latter is particularly enthusiastic about faith-based aid efforts.

On the other hand, both Rand Paul and Rick Perry have expressed a libertarian or isolationist suspicion of foreign aid, although the latter has also indicated that he thinks aid should be used more explicitly as a foreign policy lever, calling for aid to Mexico and Central American states to be withheld until those countries do more to stop the flow of immigrants to the United States. His view matters less now that he has dropped out of the race.

Ted Cruz supported Paul’s 2013 proposal to withhold aid from Egypt after the military coup that ousted President Mohammed Morsi, but does not seem to be against the idea of giving aid to key allies. On the other hand, Cruz did say that same year that “we need to stop sending foreign aid to nations that hate us.”

It’s a reasonable sentiment that could resonate with the public. Carrying out such a policy switch would entail stopping aid to Pakistan (one of the countries where American aid is not only unappreciated but seems actually to feed anti-American resentment), the Palestinian territories (whose citizens are per capita the most aided people on Earth), Turkey, China and Russia.

Whatever Republican and Democratic candidates say now, it seems unlikely that questions of cutting or boosting or reforming foreign aid will play a major role in the 2016 election. That is, unless the migrant and refugee crises in the Middle East and Europe get so much worse that there are loud and popular calls for Washington to intervene in some way.

If that does happen, then it will probably be the military that once again leads an American humanitarian effort, assisted, with the usual reservations and resentments, by USAID and other agencies. It is worth remembering that during the Asian tsunami and Philippine typhoon disasters, no aid agency rescued as many people, did as much good or could have done as much good as the United States Navy, and that this was a source of pride for most Americans.

http://www.washingtonexaminer.com/george-w.-bush-top-target-in-dem-and-gop-debates/article/2574296

The Battle of Britain (Weekly Standard March 12, 2001)

Essays/Book Reviews, Old Articles/Archive Comments Off on The Battle of Britain (Weekly Standard March 12, 2001)

Is the Sun Setting on the United Kingdom?

The Abolition of Britain – From Winston Churchill to Princess Diana, by Peter Hitchens

A few years ago, I was hiking up to an observatory in Georgetown on the Malaysian island of Penang. On the steep, winding road to the top, I fell into conversation with a well-dressed middle-aged man, a Malaysian Chinese, who told me about the problems his daughter faced getting into university because of the regime’s nastily racist program that favored ethnic Malays and penalized the ethnic Chinese minority. It was unfair, unjust. “You’re British,” he said. “You should do something about this.”

It was touching and not a little sad that he thought British influence still counted for so much, and that he automatically associated the concept of fair play with the former colonial power. From a historical point of view, he wasn’t entirely mistaken: Over the centuries, many people — African slaves in agony in the Middle Passage, Hindu widows being burned alive, Indian travelers strangled by religious lunatics, Belgian civilians brutalized by Wilhelmine soldiery, and Jews being kicked to death by Nazi brownshirts — have all wanted the British to do something about it, and eventually they did.

But then Britain and its prestige are perceived differently abroad than at home these days — especially by the political class. When Peter Hitchens, the former Trotskyite who is now Britain’s most forthright conservative pundit, laments the “abolition of Britain,” he isn’t talking just about the Blair government’s formal destruction of the United Kingdom as a unitary state or even the modernizing Kulturkampf against such vestiges of the imperialist, racist, class-ridden past as the breeches worn by the Lord Chancellor and the popular Royal Tournament show of military pageantry.

He’s also talking about the long-term shift in national self-perception that allowed all this to happen — a shift, strangely enough, that accelerated as Britain left the strikebound malaise of the late 1970s for the prosperity of the 1980s and 1990s. Essentially, the British seem to have reacted, rather belatedly, to the loss of empire with an orgy of self-contempt. Pushed along by a middle-class minority who passionately desire the submersion of Britain in a European superstate, this peculiar self-loathing has made the British particularly vulnerable to a virulent form of PC multiculturalism and to the idea that Britain’s institutions and traditions are, at best, outmoded and absurd.

“We allowed our patriotism to be turned into a joke, wise sexual restraint to be mocked as prudery, our families to be defamed as nests of violence, loathing, and abuse, our literature to be tossed aside as so much garbage, and our church turned into a department of the Social Security system,” Hitchens writes in his concluding chapter.

We let our schools become nurseries of resentment and ignorance, and humiliated our universities by forcing them to take unqualified students in large numbers. . . . We abandoned a coinage which . . . spoke of tradition and authority. . . . We tore up every familiar thing in our landscape, adopted a means of transport wholly unfitted to our small crowded island, demolished the hearts of hundreds of handsome towns and cities, and in the meantime we castrated our criminal law, because we no longer knew what was right or wrong.

Some of these changes were organic and others artificial (though Hitchens, to the detriment of his argument, rarely distinguishes the two). Some were initiated by Labour governments, but a surprising number were the work of Conservative administrations.

So, for instance, the Foreign Office under Margaret Thatcher pursued a relentless policy of post-imperial betrayal, beginning with hints to the Argentines that Britain no longer cared about the Falkland Islands and culminating in the selling of the people of Hong Kong to Communist China — after first removing their right to reside in the United Kingdom, so they’d have no leverage and nowhere to run.

And so, for another instance, the Tories under John Major took the country deeper into the European Union — while reciting the mantra that further integration into the emerging superstate was the only way Britain could hope to exert any influence, now that it was merely a “fourth-rate power.” (This phrase is always delivered in tones of such gloomy satisfaction that no one notices that such a “rating” ignores factors like economic strength, nuclear deterrents, seats on the U.N. Security Council, and cultural influence.)

But Tory surrenders of sovereignty pale beside the changes instituted by the “New Labour” government of Tony Blair. For the most part, the British population has been an unenthusiastic but oddly resigned witness to even more revolutionary changes. (Though the drive to abolish British currency and replace it with the Euro provoked a surprisingly vocal opposition.) The most important of these changes are the constitutional “reforms” carried out merely because the need for such changes was self-evident to the London media elite that calls the tune in British society.

The fact that the United Kingdom seemed to work — despite the oddness and antiquity and irrationalism of its constitutional arrangements — was declared irrelevant. Sure, it provided reasonable prosperity, liberty, and security at least as effectively as systems in use on the Continent (or across the Atlantic). Sure, it proved less vulnerable to economic and political storms than, say, the modern German state since 1870 or the various republics, empires, and monarchies that have ruled France since 1789. But that’s all ancient history. The key thing is that nothing about the old United Kingdom conforms to what the new British elite conceives of as “modernity.”

The idea that there might be risks in sudden, radical constitutional change, that for a constitution to be effective it needs legitimacy and the emotional allegiance of the people, is not one that Britain’s hyper-rationalist but parochial reformers have given much thought to, despite the warnings flashed from Yugoslavia. For the new public-sector middle class and the metropolitan media elite, a single idea is paramount: Britain is a musty, provincial place “held back” by dated, irrational institutions and a culture that wrongly venerates a history that is essentially a record of shame and oppression.

In its mildest form, this idea is manifested in the culturalist theory of British decline that influenced Thatcher as much as Blair: the idea that postwar economic failure is inextricably linked to the persistence in Britain of a culture of deference. Better policy might well have been found by asking instead how a pair of small islands off the coast of Europe managed to become the world’s most powerful nation for a century and a half, producing a fair number of the world’s best scientists, poets, admirals, and statesmen. But those old successes were dismissed. As the newly elected Tony Blair put it in 1997 — so memorably and tellingly, in marketing-man’s jargon — Britain desperately needs to be “rebranded” as a “young country.”

That the Blair government has been able to tear so much down in so short a time with so little effective opposition is one of the most fascinating mysteries of modern politics. After all, it’s rare for a perfectly viable system of government to be dismantled in a time of peace and prosperity. Peter Hitchens understands that Britain came to this pass because of a series of social and cultural changes, some of them inevitable results of postwar exhaustion and impoverishment, but many more of them the products of cultural and class warfare.

Unfortunately his Abolition of Britain is arranged in such a scattershot way that it conveys no real sense of either the chronology or the interplay of the various factors that broke British morale and allowed a resentful section of the population, without previous experience of power and responsibility, to make a revolution. Still, The Abolition of Britain is an entertaining and moving read that helps explain why certain key strata of the British middle classes are such enthusiasts for eliminating the things that make Britain unique. It offers a key to such mysteries as how the British state could actually prosecute merchants for using non-metric measures, jail a farmer for defending himself against brutal robbers, and arrest a man for the “racist” act of flying a flag above a pub.

There are so many effective anecdotes in Hitchens’s book that it is difficult to pick one as particularly telling. So, for symbolic concision, how about the abolition of the flag? It was in 1997, the year of Blair’s election, that British Airways removed the Union Jack flag from the tails of its aircraft and replaced it with “ethnic” designs that it hoped foreign customers would find more sympathetic.

The airline’s then-CEO, Robert Ayling, apparently feared that foreigners associated the British national flag with skinheads, soccer hooligans, and imperialism. This was not based, of course, on any polling of Africans or Asians or Europeans. But Ayling did know that the Union Jack is associated with skinheads and soccer hooligans and imperialism by the media folk and the professional middle classes who now control Britain. These are people far too well-educated and sophisticated to have any truck with anything as atavistic as national pride and who simply cannot conceive that anyone would see a Union Jack as a symbol of something positive. (Britain is not in fact a flag-waving country; its inhabitants have long been embarrassed by the kind of loud patriotism associated with their continental neighbors or the United States. But there’s a difference between this kind of reticence and actual hostility to the flag.)

Kipling once asked, “What do they know of England who only England know?” The Blairite elite, for all their vacations in French or Tuscan villages, have much less experience of the outside world than the imperial elite they replaced. It’s why they don’t know that the French, whom they worship, are utterly unembarrassed by the traditional pageantry being scourged in Britain and would not dream of deconcessioning the tricolore. Have the Blairites never seen the Communist deputies saluting, as mounted republican guardsmen in breastplates and horsehair plumes lead the Bastille Day parade, just in front of the tanks? Apparently not, which is another reason no one in the new ruling elite even questions the assumption that Britain is an embarrassingly Ruritanian society, long overdue for a thorough house-cleaning.

Still less do they doubt that a country properly cleansed of cringe-inducing vestiges of a quaint, elitist past like the changing of the guard, Oxbridge, red telephone boxes, hereditary peers, and the monarchy will be both more efficient and more popular with foreign tourists. For them it is an article of faith that new is better.

Alas for Peter Hitchens, impassioned, perceptive, and courageous though he is, the opposite is also an article of faith: For him, all change is bad. Hitchens actually laments the advent of central heating and double glazing, because families are no longer brought together by having to huddle around a single hearth. When he contrasts the Britain of Princess Diana’s funeral with the Britain of Churchill’s funeral, his case that everything has gotten worse includes the “crazed over-use of private cars” and “the disappearance of hats and the decline of coats.”

Indeed, if you were going to be harsh you might almost subtitle this book “A compendious diatribe of everything I hate about Britain today, with minor, aesthetic irritations given the same weight as the destruction of the constitution.” There’s a silly chapter in which Hitchens bemoans the famous trial of D. H. Lawrence’s Lady Chatterley’s Lover, which made it all but impossible for the British government to ban books on the grounds of obscenity. Then there’s his notion that the “American Occupation” of Britain from 1941 to 1945 introduced adultery to British womanhood — a claim that would have amused Lord Nelson and Lady Hamilton.

But the most bizarrely wrong chapter is the one that blames the satirical television and wireless programs of the late 1950s and early 1960s for destroying national unity. The idea that a culture that survived Alexander Pope and Jonathan Swift could be brought down by Dudley Moore and Alan Bennett is preposterous. And if comedy “made an entire class too ridiculous to rule,” then P. G. Wodehouse and perhaps even Charles Dickens are also to blame.

Of course many things are worse in Britain than they were during the 1950s, the decade that Hitchens takes as his paradigm for the real, lost Britain. Even people of the Left look with disgust upon Tony Blair’s “Cool Britannia” with its ubiquitous youth culture awash with drugs, its government by glib marketing men, its increasing corruption, the ever-spreading coarseness, and the startling ubiquity of violent crime (you’re now much, much more likely to be mugged or burgled in London than in New York).

But is it so terrible that the food is better, that there are sidewalk cafes, that middle- and even working-class people can afford to travel, that the state plays a smaller role in the nation’s economic life (though a far greater one in other realms)? Some of Hitchens’s nostalgia fixes on things that were not especially British, or not laudably so — like censorship, or the prosecution and blackmailing of homosexuals. Other things Hitchens sees as quintessentially British were, in fact, freakish phenomena of the postwar decades. In particular, the placidity and gentleness in those years was an artificial state, the result of exhaustion and wartime discipline.

Hitchens should know that for centuries European and other visitors were struck by the amazing pugnaciousness of the English and by their quick sentimentality. (Two enjoyable recent books, Jeremy Paxman’s The English and Paul Langford’s Englishness Identified, take up this topic.) From the eighteenth century on, Britons were seen even by their many European admirers as terrifyingly violent. That’s why small numbers of them were able to defeat large numbers of foreigners either on the Continent or on the battlefields of empire. The British soccer hooligan is a mere return to form. So, too, the Victorians were famous for their weeping: They kept emotional reserve for important moments, like when they were about to be tortured by Fuzzy Wuzzies.

It’s a shame The Abolition of Britain includes so much cranky fogeyism (including nostalgia for the flogging of teenage criminals). It’s a shame, because at its best this book combines superb reporting (especially about the hijacking of education by frustrated leftists) with a heartbreaking analysis of one of the strangest revolutions in history. And in many ways it is the most important of the torrent of books that have dealt with the crisis of British identity.

What Hitchens understands is that bourgeois New Labour is far more revolutionary than any government before — although, ironically, it learned just how easy it is to defy tradition and make radical constitutional changes from Margaret Thatcher, who abolished the Greater London Council merely because it was dominated by her political enemies. Hitchens rightly sees the New Labour “project” as a kind of politically correct Thatcherism with a punitive cultural agenda aimed at certain class enemies. The House of Commons’s vote to abolish fox hunting is a perfect example: an interference in British liberty enacted by our urban middle-class rulers in order to kick toffs in the teeth — one that will put thousands of rural working-class people out of work. When Labour was dominated by cloth-capped, working-class socialists, ownership of the means of production may have been at issue, but the party never threatened the structure of the kingdom. Tony Blair heads the least socialist, least redistributive Labour government ever. Yet at the same time he has used the legally unchecked powers of a House of Commons majority to enact the most revolutionary changes in the British constitution since the Civil War of the 1640s.

It still isn’t clear whether the Blair government sees its steady stream of attacks on the old order’s structure and accouterments as a clever and harmless way of distracting its genuinely socialist members and supporters from their fiscal conservatism, or whether they actually know that traditions and rituals are rather more important than marginal tax rates when it comes to destroying the old United Kingdom they despise.

Because the reforms, enacted swiftly and without serious debate, were intended mostly to proclaim the new government’s difference from the Tories, they followed no consistent theory. Scotland and Wales got separate parliaments but continue to send MPs to Westminster where they make laws for the English (some 80 percent of the population) who do not have their own separate parliament.

Of course, it never occurred to the Blairites — who see themselves as technocrats above primitive feelings of attachment to nation or any community other than their own cosmopolitan class — that by tossing bones to the Welsh and Scots nationalist minorities they might awaken the long slumbering beast of English nationalism. These people have lived so long under the protection of an inclusive British nationalism that they couldn’t imagine that English nationalism, fed by growing submission to Europe and the unfair favoring of Scotland, would of necessity be racial and resentful. When a few old souls mentioned the danger of awakening nationalisms after centuries of peace and comity, they were laughed at by the Blairites. Now you see all over England the red cross of St. George, a symbol from the medieval past that spontaneously appeared in the hands of soccer fans and on the dashboards of London taxicabs. It’s enough to make Hitchens warn of “interesting times” ahead — in the scary sense of “interesting.” As he says, “When a people cease to believe their national myths and cease to know or respect their history, it does not follow that they become blandly smiling internationalists. Far from it.”

Of course, you can detect in the Blair generation’s discomfort with Britain’s past an element of envy and insecurity. It cannot be easy for middle-aged Britons to look back on the achievements of their fathers and grandfathers (who defeated Hitler and the Kaiser), or, worse still, those of their great grandfathers (who brought peace and prosperity to millions around the globe), without wishing to denigrate those achievements.

But if you want to understand why a significant chunk of the British population loathes Britain and wants to undo it, you have to look beyond generational resentment to class. An acquaintance of mine was on his way to a party for the fiftieth anniversary of VE day in 1995 when he bumped into Jon Snow, a well-known British broadcaster and fairly typical figure of the new British establishment. He asked Snow if he too were going to a VE celebration. Snow sneered back that he was going to “an anti-VE day party.” Not for him any of that jingoistic nostalgia for World War II.

As Orwell pointed out, the English intelligentsia has always been severed from the common culture of the country. But in the 1930s, the intellectuals were joined in their alienation by a significant number of mandarins, upper- and upper-middle-class civil servants, who responded to democratization and the simultaneous decline of British influence by deciding that their country would be better off ruled by Nazi Germany or the Soviet Union.

The modern equivalent is to transfer one’s allegiance to the “European ideal,” which means, in practice, rule by the smooth bureaucrats of Brussels. For the remnants of the mandarin class, there’s something comforting in the idea that Britain and Europe can be run by a sophisticated international elite — made up of chaps not unlike themselves.

“Europe” also solves a status problem for the new public-sector middle class. Unlike the treacherous mandarins, these people have not lost position; they never had it. They therefore define themselves as being more “civilized” than the country-house toffs above them and the bigoted proles below. And they take to an extreme the retardataire notion that everything is done better on the Continent. The basic idea is that if you are the kind of person sophisticated enough to appreciate wine and cappuccino — rather than beer and tea — then, of course, you must favor the transfer of sovereignty from Britain to Brussels.

There are good reasons for Americans to study Peter Hitchens’s The Abolition of Britain. It won’t be a good thing for America if British PC multiculturalists manage to discredit the parent culture of the United States. More important, however, is the lesson about the fragility of culture that Americans should take from this book. In his famous essay “England, Your England,” George Orwell wrote, “It needs some very great disaster, such as prolonged subjugation by a foreign enemy, to destroy a national culture.” But reading Hitchens you soon realize that Orwell was wrong: A culture can be destroyed from the inside, as well.

 

American soldiers really aren't spoilt, trigger-happy yokels (Daily Telegraph 25 July 2003)

Old Articles/Archive Comments Off on American soldiers really aren’t spoilt, trigger-happy yokels (Daily Telegraph 25 July 2003)

Whether the deaths of Uday and Qusay Hussein were self-inflicted or not, the military operation to capture them was immaculate. There were no American deaths, 10 minutes of warnings were given over loudspeakers, and it was the Iraqis who opened fire. So sensitive was the American approach that they even rang the bell of the house before entering.

The neat operation fits squarely with the tenor of the whole American campaign, contrary to the popular negative depiction of its armed forces: that they are spoilt, well-equipped, steroid-pumped, crudely patriotic yokels who are trigger-happy yet cowardly in their application of overwhelming force.

And, unlike our chaps, none of them is supposed to have the slightest clue about Northern Ireland-style “peacekeeping”: never leaving their vehicles to go on foot patrols, never attempting to win hearts and minds by engaging with local communities and, of course, never removing their helmets, sunglasses and body armour to appear more human.

As a British journalist working for an American newspaper, who was embedded with American troops before, during and after the conquest of Saddam Hussein’s Iraq, I know this is all way off the mark: a collection of myths coloured by prejudice, fed by Hollywood’s tendentious depictions of Vietnam (fought by a very different US Army to today’s) and by memories of the Second World War.

The American soldiers I met were disciplined professionals. Many of them had extensive experience of peacekeeping in Kosovo and Bosnia and had worked alongside (or even been trained by) British troops. Thoughtful, mature for their years, and astonishingly racially integrated, they bore little resemblance to the disgruntled draftees in Platoon or Apocalypse Now.

Yes, American troops wear their helmets and armour even though removing them might ease local relations. But it’s easy to forget that British troops in Northern Ireland have very often worn helmets when patrolling unfriendly areas. And the disaster that took the lives of six Royal Military Police officers in Majar may indicate that American caution – whether it means wearing body armour, or ensuring that soldiers have sufficient back-up or are always in radio contact with headquarters – isn’t so foolish.

And it’s simply not true that the Americans don’t patrol at all, patrol only in tanks or never get out of their vehicles. I accompanied foot patrols in Baghdad as early as April 13, only days after Saddam’s presidential palace was taken. The unit carrying out these patrols was also assigned to escort SAS troopers around the city. The SAS men told me how impressed they were, not just with the Americans’ willingness to learn from them, but with their training and self-control.

The idea that American troops are lavishly equipped is also a myth, a fantasy bred out of resentment of American wealth in general. The battalion in which I was first embedded came to war in creaky, Vietnam-vintage M113 armoured personnel carriers, which frequently broke down in the desert.

The battalion fought in green heavyweight fatigues because the desert camouflage ones never arrived. And, though a shipment of desert boots turned up just before the invasion, many were the wrong size, so that these GIs had to make do with black leather clompers designed for northern Europe in December. Perhaps most resented by the troops, they were not issued with bullet-resistant vests, only flak jackets, making them vulnerable to small-arms fire.

Another myth is that the Americans are softies who live and fight in amazing, air-conditioned comfort. The truth is that the GIs encamped in and outside palaces and Ba’ath party mansions lack not only air-conditioning but also running water, unlike most of the population they guard.

And, unlike their British counterparts, they have no communication with their families at home. Many British troops are able to use the “e-bluey” system to email their loved ones on a frequent basis. The only time most GIs in Iraq ever get to let their spouses know they are well is when a passing journalist lets them have a couple of minutes on the satphone.

And I remember what a thrill it was when I got my hands on a British ration box after nearly three months on American MREs (meals ready to eat). GIs bored of endless variations upon chilli and macaroni were amazed to find that British rations included things such as chicken and herb pâté. And they were willing to trade everything from boots to whole cases of their own rations to get some.

Though the US Army lacks our regimental system, different American divisions vary greatly in culture and experience. The Third Infantry Division – the unit that reached Baghdad first and took the city in a feat of great boldness – has been kept in Iraq because its soldiers are clearly better than newcomers at the difficult task of winning hearts and minds in a newly conquered country.

You could see this in the way the tank commander, Captain Philip Wolford, broke the rules and walked around the area his company controlled, alone and bare-headed, chatting with the locals and organising food, medical care and even employment. I wish that more British reporters had gone into the streets with 3ID men such as Sgt Darren Swain, a no-nonsense soldier from Alabama who is loved in the Baghdad area his men call “Swainsville” because, off his own bat, he takes humvees out every morning to provide security at local schools.

More recently, American soldiers have been charged with the sensitive task of searching those who enter the Palace district of Baghdad. One Shi’ite mullah felt it a great dishonour to be searched. The soldier responsible, Captain Wolford, agreed to take him round the back of the building and search him in private. Once there, the mullah agreed to be searched. Captain Wolford then refused to search him – the agreement to comply was enough. The gentlemanly approach much pleased the mullah.

It is because of this kind of sensitivity that the Americans have slowly and quietly achieved the intelligence triumph that led to the discovery and killing of the sons of Saddam Hussein.

 Jonathan Foreman writes for the New York Post 

http://www.telegraph.co.uk/comment/personal-view/3594199/American-soldiers-really-arent-spoilt-trigger-happy-yokels.html

Scorsese's "Gangs of New York" Distorts History (Daily Telegraph 15 Jan 2003)

Film/History, Old Articles/Archive Comments Off on Scorsese’s “Gangs of New York” Distorts History (Daily Telegraph 15 Jan 2003)

Scorsese’s film portrays racist mass murderers as victims

Martin Scorsese is rightly the most lauded living American film-maker – a beacon of integrity as well as a brilliant talent. But his bloody, visually gorgeous new epic, Gangs of New York, set in Civil War-era Manhattan, distorts history at least as egregiously as The Patriot, Braveheart or the recent remake of The Four Feathers. In its confused way, it puts even the revisionism of Oliver Stone to shame.

The film works so hard to make mid-19th-century Irish-American street gang members into politically correct modern heroes (and to fit them into Scorsese’s view of American history as one long ethnic rumble) that it radically distorts a great and terrible historical episode.

It treats the founding Anglo-Saxon Protestant culture of America with an ignorant contempt – where it doesn’t cleanse it from history altogether. Generally speaking, Hollywood sees that culture not as the root of treasured democratic freedoms, but as a fount of snobbery and dreary conformism. The paradoxical result of this Hollywood faux-Leftism is that the movie ends up casually glossing over the suffering of black Americans.

Gangs begins with a brutal battle in 1846 between two armies – “natives” (presumably Protestant) and immigrant Irish Catholics – for the control of the lawless Five Points area of Lower Manhattan. The leader of the Irish (Liam Neeson) is slain before the eyes of his five-year-old son by the natives’ leader, “Bill the Butcher” (a superb Daniel Day-Lewis). The son grows up to be Leonardo DiCaprio, a tough youth who comes back to the neighbourhood 16 years later determined to avenge his father’s death.

By 1862, Bill the Butcher has incorporated many of the Irish thugs – including DiCaprio – into his own criminal organisation. Eventually he comes to see the boy almost as the son he never had. When the time comes for the two of them to square off, with DiCaprio in charge of the reborn “Dead Rabbits” gang, the Civil War is casting its shadow over the city in the form of the 1863 Draft Riots.

These began with assaults on police by Irish immigrants enraged by Lincoln’s conscription order on July 11, 1863. Very quickly, they turned into a monstrous pogrom, with a 50,000-strong mob murdering and mutilating every black they could find.

The Coloured Orphans’ Asylum was set on fire, followed by several black churches and the Anglican mission in Five Points. The city’s small German Jewish population was also attacked. Panicked blacks fled to the safety of British and French vessels at anchor in the East and Hudson rivers. Many drowned. Those who were caught were often tortured and castrated before they were killed.

In the film, you don’t see any of this. Instead, a voice-over quoting from telegraph reports briefly mentions some of the mob’s racist violence. What you do see is the suppression of the riot: blue-clad troops massacring crudely armed civilians of all ages and both sexes. The rioters stand almost impassive, and are cut down by gunfire and mortar shells lobbed from warships in the harbour (a bombardment wholly invented by the film-makers).

The film’s narrator claims – and it’s a flat-out lie – that the mob was a multi-ethnic uprising of the city’s poor, that Germans and Poles joined with the Irish immigrants against New York’s epicene patricians and an unjust conscription policy that allowed the wealthy to buy their way out of military service for $300. In fact, the city’s 120,000 German immigrants, many of them Catholics, took no part in the riots; there were almost no Poles living in the city; and the rioters were almost entirely Irish.

They were furious with the city’s blacks because its free negroes were often skilled artisans, economically and socially a rung or two above the newly arrived Irish, many of whom didn’t speak English.

Yet the film consistently portrays the “nativist” Yankees, led by Daniel Day-Lewis’s Bill the Butcher, as racists and the Irish underclass criminals, led by Leonardo DiCaprio, as multiculturalists avant la lettre.

The film’s misrepresentation of the “natives” begins early on. While the film’s Irish Catholics have a vibrant, energetic culture, the “native” Americans merely have prejudice. And you would never know that New York’s population included substantial numbers of Orange Ulstermen – a hundred people were killed in the city’s Orange-Green rioting as late as July 12, 1871.

Nor would you know from Scorsese’s depiction that Yankees – Northern Americans of English, Scottish, Welsh and Dutch extraction – increasingly thought that they were fighting the Civil War to abolish slavery. In the words of their favourite battle hymn, Jesus died to make men holy, and they would die to make men free.

The ending of slavery isn’t on Scorsese’s map, because its inclusion would be too difficult: it would require honesty and courage to reveal that his heroes – the Celtic predecessors of today’s beloved mafia – were on the wrong side of the most significant moral and political struggle in America’s history.

Nor would you know that many Irish volunteers fought with spectacular bravery on behalf of the Union. Instead, everyone villainous in Gangs of New York is either a white Anglo-Saxon Protestant or an Irish Catholic who has sold out to WASPs.

There’s something bizarre about glorifying a subculture that fought to undermine Lincoln’s war to preserve the Union and end slavery. Scorsese is treating racist mass murderers as heroes and victims. Yes, the Irish were cruelly abused in their adopted country. But it’s a strange modern fetish that assumes that victims cannot also be victimisers.

If, as the ad copy goes, “America was born in the streets”, it was not in the squalid, savage turf struggles of the Five Points, but in the streets of Boston and Lexington in 1776 – where the people traduced here as having no identity or qualities outside their xenophobia fought for the liberties that all modern Americans take for granted.

  • Jonathan Foreman is film critic of the New York Post

http://www.telegraph.co.uk/comment/personal-view/3586345/Scorseses-film-portrays-racist-mass-murderers-as-victims.html

"Gladiator" Review (NYPost May 5, 2000)

Old Articles/Archive Comments Off on “Gladiator” Review (NYPost May 5, 2000)

Gladiator Kicks Butt

MORE than just a welcome revival of the toga movie – a genre dead for more than 30 years, if you don’t count Bob Guccione’s gamy “Caligula” – “Gladiator” is an exhilarating, sweeping epic that begs to be seen on the largest possible screen.

At times it’s surprisingly languorous for a modern actioner. But it also boasts some of the most exciting pre-gunpowder combat sequences ever: Not only are the battles in “Gladiator” superior to – and more realistic than – anything in “Braveheart,” they’re equal in excitement to the classic arena contests in “Ben-Hur” and “Spartacus.”

They’re so gripping, in fact, that they’re disturbing: Long before the final duel, you find yourself cheering as wildly as the bloodthirsty Colosseum crowd.

Directed by Ridley Scott (“Alien,” “Blade Runner”), “Gladiator” also features breathtaking photography, sets and computer-generated images.

But the real glory of the movie is Russell Crowe, who is simply magnificent as a mythical Roman general turned gladiator. Like James Mason, he is one of those actors who can make the lamest line (and like its sword-and-sandal predecessors, “Gladiator” has some clunkers) sound like Shakespeare.

“Gladiator” opens on the empire’s wintry, forested northern frontier, with Maximus (Crowe) leading his legions against the ferocious German hordes. In a stunning battle sequence, clearly influenced by “Saving Private Ryan,” Maximus routs the last threat to Rome’s domination of Europe, as the ailing Emperor Marcus Aurelius (Richard Harris) looks on.

The emperor offers him supreme power; Maximus says he would rather retire to his farm in Spain. But before he can make up his mind, Commodus (Joaquin Phoenix), the emperor’s son, who is visiting the front with his sister Lucilla (Connie Nielsen), murders Aurelius and assumes the purple.

Commodus immediately arranges to have Maximus killed. The general escapes this fate but finds disaster at home before being captured by slave traders. Taken to North Africa, Maximus is sold to the gladiatorial impresario Proximo (the late Oliver Reed, as rascally and charming as ever in his final role).

Initially reluctant to fight, Maximus proves to be an extraordinarily deadly gladiator. Accordingly, Proximo brings him to Rome to compete in games sponsored by the sports-mad Commodus.

“Gladiator” draws heavily on its ’60s ancestors, but unlike them it contains no Christian message, and, more surprisingly, no sex.

Scott fills the movie with visual allusions to his own work as well as to “Spartacus” and even “Apocalypse Now.” There are also some arty indulgences, including Maximus’ bleached-out visions of his own death, shots of speeded-up clouds scudding over the desert, and black-and-white parade scenes that are clearly intended to evoke both Nazi-era Berlin and “Triumph of the Will.”

However, there are no silly anachronisms – apart from an attempt to give the drama a modern political dimension. Periodically the characters spout historical absurdities about “a dream that was Rome” and “giving power back to the people” as if screenwriters David Franzoni, John Logan and William Nicholson were recycling Princess Leia’s lines from “Star Wars.”

Ancient-history buffs might also quarrel with military details. The Romans didn’t use artillery except in sieges, for example, and employed their swords for stabbing, not slashing. Nor could they engage in cavalry charges, because the stirrup hadn’t yet made it to Europe.

——

GLADIATOR  3½

Ridley Scott’s revival of the sword-and-sandal epic is a spectacular triumph, with sensational battle scenes and a terrific performance by Russell Crowe. First-class entertainment, it’s marred only by slow sections, occasionally leaden dialogue and some indulgently arty dream sequences. Running Time: 150 minutes. Rated R. At the Lincoln Square, the Ziegfeld, the Kips Bay, others.