The School Runners (Commentary, April 2016)
http://jonathanforeman.info/the-school-runners-commentary-april-2016/

Review of “The Last Thousand” by Jeffrey E. Stern

A 2015 exposé on the Buzzfeed website created a stir by savaging the notion that the massive expansion of education in Afghanistan has been one of the triumphs of the international military effort. It was titled “Ghost Students, Ghost Teachers, Ghost Schools.”

“As the American mission faltered, U.S. officials repeatedly trumpeted impressive statistics— the number of schools built, girls enrolled, textbooks distributed, teachers trained, and dollars spent —to help justify the 13 years and more than 2,000 Americans killed since the United States invaded,” wrote a Pakistani-American journalist named Azmat Khan. The U.S. government’s claims are, Khan said, “massively exaggerated, riddled with ghost schools, teachers and students that exist only on paper.”

One-tenth of the schools that Buzzfeed’s employees claimed to have visited were not operating or had not been built. Some U.S.-funded schools lacked running water, toilets, or electricity. Others were not built to international construction standards. Teacher salaries, often U.S.-subsidized, were being paid to teachers at nonexistent schools. In some places local warlords had managed to divert U.S. aid into their own pockets.

The tone and presentation of the article leave little doubt of its author's conviction that 13 years of effort in Afghanistan, including the expenditure of 2,000 American lives and billions of dollars ($1 billion on education alone), were pointless and that the entire intervention was a horrendous mistake.

Unfortunately, it is all but certain that some of the gladdening numbers long cited by USAID and others are indeed inaccurate or misleading, especially given that they are based in large part on statistics supplied by various Afghan government ministries. The government of Afghanistan is neither good at, nor especially interested in, collecting accurate data. Here, as in all countries that receive massive amounts of overseas aid, local officials and NGOs have a tendency to tell foreign donors (and foreign reporters) what they think the latter want to hear. They are equally likely to exaggerate the effectiveness of a program or the desperate need for bigger, better intervention.

Moreover it would be remarkable if there weren’t legions of ghost teachers. No-show or nonexistent salaried employees are a problem in every Afghan government department. This is true even in the military: The NATO-led coalition battled for years to stop the practice whereby Afghan generals requested money to pay the salaries of units that existed only on paper. As for abandoned or incomplete school-construction projects, such things are par for the course not only in Afghanistan but everywhere in South Asia. India, Nepal, and Pakistan are littered with them. You don’t read about them much because no development effort has ever been put under the kind of (mostly hostile) scrutiny that has attended America’s attempt to drag Afghanistan into the modern era. Given the general record of all development aid over the past half century and the difficulty of getting anything done in a conflict-wrecked society like Afghanistan, it may well be the case that reconstruction efforts by the U.S. military and U.S. government in Afghanistan were relatively effective and efficient.

Despite all the money that may have been wasted or stolen, there really has been an astonishing education revolution in Afghanistan that is transforming the society. It is an undeniable fact that the U.S.-led intervention in Afghanistan has enabled the education of millions of children who would never have seen the inside of a school of any kind had it not been for the overthrow of the Taliban. The World Bank and UNICEF both estimate that at least 8 million Afghans are attending school. This means that even if a quarter of the children who are nominally enrolled in school aren’t getting any education at all, there are still 6 million other kids who are; in 2001 there were fewer than 1 million children in formal education, none of them female.

To get a sense of what education can achieve in Afghanistan, even in less than ideal circumstances, you can hardly do better than to read The Last Thousand, by the journalist and teacher Jeffrey E. Stern. It tells the extraordinary story of Marefat, a school on the outskirts of Kabul. Marefat (the Dari word means "knowledge" or "awareness") was originally founded in a hut in a refugee camp in Pakistan. After the fall of the Taliban regime in November 2001, its founder, Aziz Royesh, brought the school to Afghanistan and set it up on a windblown patch of desert west of Kabul. By 2012, Teacher Aziz, as he is known to all, had enrolled a total of 4,000 pupils and was sending students to elite universities around the world, including Tufts, Brown, and Harvard.

The school primarily caters to the Hazara ethnic minority, of which Aziz (a former mujahideen fighter) is a member. As anyone who read The Kite Runner or saw the movie made from the bestselling novel by Khaled Hosseini knows, the Hazara have long been the victims of oppression by the majority Pashtuns and Tajiks. The Hazara (who account for about 10 percent of the Afghan population) bear a double ethnic burden. They are Shiites—heretics in the eyes of the Sunni majority of the country. And they look Asiatic. Indeed, they are widely but probably wrongly believed to be the descendants of Genghis Khan’s invaders.

Hazara were traded as slaves until the early 20th century. As late as the 1970s, they were barred from government schools and jobs and banned from owning property in downtown Kabul. As if certain parallels to another oppressed minority weren’t strong enough, the Hazara are well known for their appetite for education and resented for their business success since the establishment of a democratic constitution, and they have enthusiastically worked with the international military coalition—all of which has made them particular targets of the Taliban.

 From the start, Aziz was determined to give his students an education that would inoculate them against the sectarian and ethnic extremism that had destroyed his country. He taught them to question everything and happily educated both boys and girls, separating them only when pressure from conservative politicians put the school’s survival at risk. (When fathers balked at allowing their daughters to go to school, Aziz assured them that a literate girl would be more valuable in the marriage market.) Eventually the school also found itself educating some of the illiterate parents of its students and similarly changing the lives of other adult members of the school community.

The school’s stunning success in the face of enormous obstacles won it and its brave, resourceful founder affection as well as benefactors among the “Internationals”—the foreign civilian and military community in Afghanistan. When John R. Allen, the tough U.S. Marine general in command of all international forces in Afghanistan, finished his tour in February 2013, he personally donated enough money to the school to fund 25 scholarships. Thanks to reports about the school by a British journalist, a “Marefat Dinner” at London’s Connaught Hotel co-sponsored by Moet & Chandon raised $150,000 for the school in 2011. But by early 2013, Teacher Aziz was in despair for Marefat’s future, thanks to terrorist threats against the school and President Obama’s declaration that he would pull out half of America’s forces within a year regardless of the military and political situation in the country.

It’s a fascinating story. Which makes it a shame that much of it is told in a rather self-indulgent and mannered way. Stern’s prose tends to exude a world-weary smugness that can feel unearned, especially given some shallow or ill-informed observations on subjects such as Genghis Khan, Blitzkrieg, and the effect of Vietnam on current U.S. commanders, and his apparent ignorance of the role of sexual honor in Hazara culture.

Most exasperating, Stern patronizingly assumes an unlikely ignorance on the part of the reader. There are few newspaper subscribers who, after 15 years of front-page stories from Afghanistan, have not heard of the grand assemblies known as loya jirgas, or who don’t know that Talib literally means student. Yet Stern refers to the former as “Grand Meetings” and the latter as “Knowledge Seekers.” He also has his characters refer to Internationals as “the Outsiders,” even though any Afghan you are likely to meet knows perfectly well that the foreign presence comes in different and identifiable national and organizational flavors: Americans, NATO, the UN, the Red Cross, Englistanis (British), and so on. The same shtick apparently frees Stern from the obligation to specify an actual date on which an event occurred, or the actual name of a town or province.

Even so, The Last Thousand is a powerful and important book, especially in the way Stern conveys the sense of betrayal and the terror that many Afghans feel at the prospect of international abandonment. The Hazara children and staff at the Marefat school fear a prospective entente with the Taliban enthusiastically promoted by foreign-policy “realists” in the U.S. and UK. They correctly believe it would lead to cultural concessions that could radically diminish their safety and freedom—if not a complete surrender to murderous Pashtun racism and Sunni bigotry.

The book's main characters were concerned by what seemed to be the imminent, complete departure of all foreign forces as part of the "zero option." This option was seriously considered by the United States in 2013 and 2014 when then–President Karzai, in the middle of a bizarre descent into (hashish-fueled) paranoia and poisonous anti-Westernism, refused to sign a bilateral security agreement with the Western powers.

Aziz confessed to Stern (who was teaching English at the school) that he himself was in despair but was trying to hide his gloom from his pupils. He began to urge his students, graduates, and protégés—especially the female ones—to be less vocal in their complaints about discrimination against Hazara, and he himself began controversially to cultivate unexpected allies such as the Pashtun presidential candidate Ashraf Ghani. But Marefat's staff, students, and their parents had few illusions about the future. As one young girl said to Aziz: "If the Americans leave, we know there is no chance for us to continue our education."

Although the future of Marefat and its Hazara pupils is uncertain, it is comforting that so much has already been achieved by the education revolution in Afghanistan. Assuming that the Taliban and its Pakistani government sponsors are not allowed to take over or prompt a collapse into civil war, this revolution may well have a tremendous and benign effect on the country’s future. After all, more than 70 percent of the Afghan population is under 25 and the median age is 17. Unlike their parents, these youths have grown up with television and radio (there are more than 90 TV stations and 174 FM radio stations), cellphones (there are at least 20 million mobile users), and even the Internet. Their horizons are wider than anything the leaders of the Taliban regime could even imagine.

As Stern relates in a hurried epilogue, the bilateral security agreement was finally signed in September 2014 after Karzai’s replacement by a new national unity government. There are still U.S. and other foreign troops in Afghanistan, even if not enough.

In Stern's sympathetic portrayal of the Hazara and their predicament, it's hard not to hear echoes of other persecuted minorities who put their trust in Western (and especially Anglo-Saxon) liberator-occupiers. The most recent example are the Montagnard hill tribes of Vietnam, who fought alongside U.S. Special Forces and were brutally victimized by the victorious Stalinist regime after America pulled out of Indochina. Something similar happened to the Shan and Karen nations of Burma, who fought valiantly alongside the British during World War II but ever since have had to battle for survival against the majority Burmans who sided with the Japanese. In today's Afghanistan, Gulbuddin Hekmatyar, the Pakistan-backed Islamist warlord, has overtly threatened the Hazara with something like the fate of the Harkis, the Algerians who fought with the French during the war of independence between 1954 and 1962: At least 150,000 of the Harkis were slaughtered with the arrival of "peace."

 The Last Thousand should remind those who are “war-weary” in the U.S. (which really means being weary of reading about the war) that bringing the troops home is far from an unalloyed good. Having met the extraordinary Teacher Aziz and his brave staff and students through the eyes of Jeffrey Stern, and knowing the fate they could face at the hands of their enemies, one finds it hard to think of President Obama’s enthusiasm for withdrawal—an enthusiasm echoed distressingly by several candidates in the presidential race—as anything but thoughtless, heartless, trivial, and unworthy of America.

Jeremy Corbyn and the End of the West (Commentary Magazine, December 2015)
http://jonathanforeman.info/jeremy-corbyn-and-the-end-of-the-west-commentary-magazine-december-2015/

The Portents of Labour's Extreme New Leader

In October 2015, the American novelist Jonathan Franzen gave a talk in London in which he expressed pleasure that Jeremy Corbyn had just been elected leader of Britain’s opposition Labour Party. To his evident surprise, Franzen’s endorsement was met with only scattered applause and then an embarrassed silence. 

Most of Franzen’s audience were the same sort of people likely to attend a Franzen talk in New York: Upper-middle-class bien-pensant Guardian readers who revile the name Thatcher the way a New York Times home-delivery subscriber reviles the name Reagan. For them, as for most Labour members of Parliament, the elevation of Jeremy Corbyn offers little to celebrate. Indeed, it looks a lot like a disaster—a bizarre and potentially devastating epilogue to the shocking rout of the Labour Party at the May 2015 general election.

Franzen probably imagined Corbyn as a kind of British Bernie Sanders, a supposedly lovable old coot-crank leftie willing to speak truth to power—and so assumed that any British metropolitan liberal audience would be packed with his fans. In fact, for all the obvious parallels between the two men, Corbyn is a very different kind of politician working in a very different system and for very different goals. Sanders may call himself a socialist, but he is relatively mainstream next to Corbyn, an oddball and an extremist even in the eyes of many British socialists.

It may seem extraordinary that a party most observers and pollsters were sure would be brought back to power in 2015—and that has long enjoyed the unofficial support of much of the UK's media, marketing, and arts establishments—now looks to be on the verge of disintegration. But even if no one a year ago could have predicted the takeover of the party by an uncharismatic extreme-left backbencher with a fondness for terrorists and anti-Semites, the Labour Party might well be collapsing due to economic and social changes that have exposed its own glaring internal contradictions.

The first stage of Labour's meltdown was its unexpected defeat at the general election in May 2015. The experts and the polls had all predicted a hung Parliament and the formation of a coalition government led by Labour's then-leader, Ed Miliband. But Labour lost 26 seats, was wiped out by nationalists in its former heartland of Scotland, and won less than 30 percent of the popular vote. The Liberal Democrats, the third party with whom Miliband had hoped to form a coalition, did far worse. Meanwhile the populist, anti-EU, anti-mass-immigration UK Independence Party (UKIP) won only one seat in the House of Commons but scored votes from some 3 million people—and took many more voters from Labour than from the Tories.

Miliband's complacency about and ignorance of the concerns of ordinary working-class people played a major role in the defeat. So did his failure to contest the charge that Labour's spendthrift ways under Tony Blair had made the 2008 financial crisis and recession much worse. Perhaps even more devastating was the widespread fear in England that Miliband would make a deal with Scottish nationalists that would require concessions such as getting rid of Britain's nuclear deterrent. He had promised that he would never do this, but much of the public seemed to doubt the word of a man so ambitious to be prime minister that he had stabbed his own brother in the back. (David Miliband was set to take over the leadership of the party in 2010 when his younger brother, Ed, decided to challenge him from the left with the help of the party's trade unionists.)

In the old industrial heartlands of the North and Midlands, Labour seemed at last to be paying a price for policies on immigration and social issues that are anathema to many in the old British working class. As a workers' party as well as a socialist party, and one that draws on a Methodist as well as a Marxist tradition, Labour has always had to accommodate some relatively conservative, traditional, and even reactionary social and political attitudes prevalent among the working classes (among them affection for the monarchy). Today the cultural divisions within the party between middle-class activists, chattering-class liberals, ethnic-minority leaders, and the old working class can no longer be papered over.

With the ascension of Tony Blair to the leadership of the party in 1994, Labour began to pursue certain policies practically designed to alienate and drive out traditional working-class Labour voters and replace them not only with ordinary Britons who had grown tired of the nearly two-decade rule of the Tories but also with upper-middle-class opinion leaders attracted to multiculturalism and other fashionable enthusiasms.

One can even make a kind of quasi-Marxian argument that the more bourgeois the Labour Party has become over the decades, the more it has engaged in what amounts to conscious or unconscious class warfare against the working class it is supposed to represent. One of the first blows it struck was the abolition of the "grammar schools" (selective high schools similar to those of New York City) on the grounds that they were a manifestation of "elitism," even though these schools gave millions of bright working-class children a chance to go to top universities. Then there was "slum clearance," which resulted in the breakup and dispersal of strong working-class communities as residents were rehoused in high-rise tower blocks that might have been designed to encourage social breakdown and predation by teenage criminals. But the ultimate act of Labour anti-proletarianism came as the party was recovering from the defection of working-class voters to Thatcherism and its gospel of opportunity and aspiration. This was the opening of the UK's borders to mass immigration on an unprecedented scale by Tony Blair's New Labour. Arguably this represented an attempt to break the indigenous working class both economically and culturally; inevitably, it was accompanied by a demonization of the unhappy indigenous working class as xenophobic and racist.

In the 2015 general election, many classic working-class Labour voters apparently couldn’t bring themselves to betray their tribe and vote Tory—but were comfortable voting for UKIP. This proved disastrous for Labour, which had once been able to count on the support of some two-thirds of working-class voters. But these cultural changes made it impossible for Labour to hold on to its old base in the same numbers. And its new base—the “ethnic” (read: Muslim) vote, a unionized public sector that is no longer expanding, and the middle-class liberals and leftists who populate the creative industries and the universities—is simply not large enough.

Labour should have won the election in 2015; it lost because of its own internal contradictions. Out of the recriminations and chaos that followed the defeat, there emerged Jeremy Corbyn.

 

To understand who Corbyn is and what he stands for, it helps to be familiar with the fictional character Dave Spart, a signature creation of the satirical magazine Private Eye. Spart is a parody of a left-wing activist with a beard and staring eyes and a predilection for hyperbole, clueless self-pity, and Marxist jargon, which spews forth from his column, “The Alternative Eye.” (He’s like a far-left version of Ed Anger, the fictional right-wing lunatic whose column graced the pages of the Weekly World News supermarket tabloid for decades.) A typical Spart column starts with a line like “The right-wing press have utterly, totally, and predictably unleashed a barrage of sickening hypocrisy and deliberate smears against the activities of a totally peaceful group of anarchists, i.e., myself and my colleagues.”

 
The column has given birth to the term spartist—which is used in the UK to refer to a type of humorless person or argument from the extreme left. There are thousands of real-life spartists to be found in the lesser reaches of academia, in Britain’s much-reduced trade-union movement, and in the public sector. For such activists, demonstrations and protests are a kind of super hobby, almost a way of life.

The 66-year-old Corbyn is the Ur-Spartist. He has always preferred marches and protests and speeches to more practical forms of politics. He has been a member of Parliament for 32 years without ever holding any sort of post that would have moved him from the backbenches of the House of Commons to the front. During those three-plus decades, he has voted against his own party more than 500 times. Corbyn escaped being "deselected" under Tony Blair—the process by which an MP can be stripped by his own party of the right to stand again for his seat—only because he was deemed harmless.

Many of Corbyn’s obsessions concern foreign policy. He is a bitter enemy of U.S. “imperialism,” a longtime champion of Third World revolutionary movements, and a sympathizer with any regime or organization, no matter how brutal or tyrannical, that claims to be battling American and Western hegemony. Corbyn was first elected to Parliament in 1983, and many of his critics in the Labour Party say he has never modified the views he picked up from his friends in the Trotskyite left as a young activist.

This is not entirely true, because Corbyn, like so much of the British left, has adapted to the post–Cold War world by embracing new enemies of the West and its values—in particular, those whom Christopher Hitchens labeled “Islamofascists.”

One of the qualities that sets spartists like Corbyn apart from their American counterparts is an almost erotic attraction to Islamism. They are fascinated rather than repelled by its call to violent jihad against the West. This is more than anti-Americanism or a desire to win support in Britain’s ghettoized Muslim communities. It is the newest expression of the cultural and national self-loathing that is such a strong characteristic of much progressive opinion in Anglo-Saxon countries—and which underlies much of the multiculturalist ideology that governs this body of opinion.

As a result, many on the British left today seem to have an astonishing ability to overlook, excuse, or even celebrate reactionary and atavistic beliefs and practices ranging from the murder of blaspheming authors to female genital mutilation. Corbyn has long been at the forefront of this tendency, not least in his capacity as longtime chair of Britain’s Stop the War Coalition. STWC is a pressure group that was founded to oppose not the war in Iraq but the war in Afghanistan. It was set up on September 21, 2001, by the Socialist Workers’ Party, with the Communist Party of Great Britain and the Muslim Association of Britain as junior partners. STWC supported the “legitimate struggle” of the Iraqi resistance to the U.S.-led coalition; declines to condemn Russian intervention in Syria and Ukraine; actively opposed the efforts of democrats, liberals, and civil-society activists against the Hussein, Assad, Gaddafi, and Iranian regimes; and has a soft spot for the Taliban.

Corbyn’s career-long anti-militarism goes well beyond the enthusiasm for unilateral nuclear disarmament that was widespread in and so damaging to the Labour Party in the 1980s, and which he still advocates today. He has called for the United Kingdom to leave NATO, argued against the admission to the alliance of Poland and the former Czechoslovakia, and more recently blamed the Ukrainian crisis on NATO provocation. In 2012, he apparently endorsed the scrapping of Britain’s armed forces in the manner of Costa Rica (which has a police force but no military).

As so often with the anti-Western left, however, Corbyn’s dislike of violence and military solutions mostly applies only to America and its allies. His pacifism—and his progressive beliefs in general—tend to evaporate when he considers a particular corner of the Middle East.

Indeed, Corbyn is an enthusiastic backer of some of the most violent, oppressive, and bigoted regimes and movements in the world. Only three weeks after an IRA bombing at the Conservative Party conference in Brighton in 1984 came close to killing Prime Minister Thatcher and wiping out her entire cabinet, Corbyn invited IRA leader Gerry Adams and two convicted terrorist bombers to the House of Commons. Neil Kinnock, then the leader of Labour and himself very much a man of the left, was appalled.

Corbyn is also an ardent supporter of the Chavistas who have wrecked Venezuela and thrown dissidents in prison. It goes almost without saying that he sees no evil in the Castro-family dictatorship in Cuba, and for a progressive he seems oddly untroubled by the reactionary attitudes of Vladimir Putin’s repressive, militarist kleptocracy in Russia.

Then we come to his relationship with Palestinian extremists and terrorists. A longtime patron of Britain’s Palestine Solidarity Committee, Corbyn described it as his “honor and pleasure” to host “our friends” from Hamas and Hezbollah in the House of Commons. If that weren’t enough, he also invited Raed Salah to tea at the House of Commons, even though the Palestinian activist whom Corbyn called “an honored citizen…who represents his people very well” has promoted the blood libel that Jews drink the blood of non-Jewish children. These events prompted a condemnation by Sadiq Khan MP, the Labour candidate for London’s mayoralty and a Muslim of Pakistani origin, who said that Corbyn’s support for Arab extremists could fuel anti-Semitic attacks in the UK.

That was no unrepresentative error. As Britain's Jewish Chronicle also pointed out this year, Corbyn attended meetings of a pro-Palestinian organization called Deir Yassin Remembered, a group run by the notorious Holocaust denier Paul Eisen. Corbyn is also a public supporter of the Reverend Stephen Sizer, a Church of England vicar notorious for promoting material on social media suggesting 9/11 was a Jewish plot.

Corbyn's defense has been to say that he meets a lot of people who are concerned about the Middle East, but that doesn't mean he agrees with their views. The obvious flaw of this dishonest argument is that Corbyn doesn't make a habit of meeting either pro-Zionists or the Arab dissidents or Muslim liberals who are fighting against tyranny, terrorism, misogyny, and cruelty. And it was all too telling when, in an effort to clear the air, Corbyn addressed the Labour Friends of Israel without ever using the word Israel. It may not be the case that Corbyn himself is an anti-Semite—of course he denies being one—but he is certainly comfortable spending lots of quality time with those who are.

How could such a person become the leader of one of the world's most august political parties? It took a set of peculiar circumstances. In the first place, he only received the requisite number of nominations from his fellow MPs to make it possible for him to stand for leader after the resignation of Ed Miliband because some foolish centrists thought his inclusion in the contest would "broaden the debate" and make it more interesting. They had not thought through the implications of a new election system that Miliband had put in place. An experiment in direct democracy, the new system shifted power from the MPs to the members in the country.

The party's membership had shrunk over the years (as has that of the Tory Party), and so to boost its numbers, Miliband and his people decided to shift to a system in which new members could obtain a temporary membership in the party and take part in the vote for only £3 ($5). More than 100,000 did so. They included thousands of hard-left radicals who regard the Labour Party as a pro-capitalist sell-out. (They also included some Tories, encouraged by columnists like the Telegraph's Toby Young, who urged his readers to vote for Corbyn in order to make Labour unelectable.) The result was a landslide for Corbyn.

Labour's leadership was outplayed. The failure was in part generational. There is hardly anyone left in Labour who took part in or even remembers the bitter internal struggle in the late '40s to find and exclude Communist and pro-Soviet infiltrators; one of the last great Labour anti-Communists, Denis Healey, died this October. (That purge was so successful that the British Trotskyite movement largely abandoned any attempt to gain power in Westminster, choosing instead to focus on infiltrating the education system in order to change the entire culture.) By the time Corbyn took over, most of Labour's "modernizers"—those who had participated in the takeover of the party leadership by Tony Blair and his rival and successor Gordon Brown—had never encountered real Stalinists or Trotskyists and lacked the fortitude and ruthless skill to do battle with them.

Unfortunately for the centrists and modernizers, many of Corbyn’s people received their political education in extreme-left political circles, so brutal internal politics and fondness for purges and excommunications are (as Eliza Doolittle said) “mother’s milk” to them. For example: Corbyn’s right-hand men, John McDonnell and Ken Livingstone, were closely linked to a Trotskyite group called the Workers Revolutionary Party. The WRP was a deeply sinister political cult that included among its promoters not only the radical actors Vanessa and Corin Redgrave but also the directors of Britain’s National Theatre. Its creepy leader Gerry Healy was notorious for beating and raping female members of his party and took money from Muammar Gaddafi and Saddam Hussein.

Most people in British politics, and especially most British liberals, had fallen prey to the comforting delusion that the far left had disappeared—or that what remained of it was simply a grumpy element of Labour’s base rather than a devoted and deadly enemy of the center-left looking for an opportunity to go to war. As Nick Cohen, the author of What’s Left: How the Left Lost Its Way, has pointed out, this complacent assumption enabled the centrists to act as if they had no enemies to the left. Now they know otherwise.

Another reason for the seemingly irresistible rise of Corbyn and his comrades is what you might call Blair Derangement Syndrome. It is hard for Americans and other foreigners to understand what a toxic figure the former prime minister has become in his own country. Not only is Blair execrated in the UK more than George W. Bush is in the U.S.; he is especially hated by his own party and on the left generally. It is a hatred that is unreasoning and fervid in almost exact proportion to the adoration he once enjoyed, and it feels like the kind of loathing that grows out of betrayed love. Those in the Labour Party who can't stand Blair have accordingly rejected many if not all of the changes he wrought and the positions he took. And so, having eschewed Blairism, they were surprised when they lost two elections in a row to David Cameron—who, though a Tory, is basically Blair's heir.

Blair is detested not because he has used his time after leaving office to pursue wealth and glamour and has become a kind of fixer for corrupt Central Asian tyrants and other unsavory characters. Rather, it is because he managed to win three general elections in a row by moving his party to the center. Those victories and 12 years in office forced the left to embrace the compromises of governance without having much to show for it. This, more than Blair's enthusiasm for liberal interventionism or his role in the Iraq war or even his unwavering support of Israel during the 2008 Gaza war, drove the party first to select the more leftist of the two Miliband brothers and now to hand the reins to Corbyn.

As I write, Corbyn has been Leader of Her Majesty's loyal opposition (a position with no equivalent in the United States) for a mere 10 weeks—and those 10 weeks have been disastrous both in terms of the polls and party unity. Corbyn's own front bench has been on the verge of rebellion. Before the vote on the UK's joining the air campaign in Syria, some senior members apparently threatened to resign from their shadow cabinet positions unless Corbyn moderated his staunch opposition to any British military action against ISIS in Syria. (It worked: Rather than face open revolt, Corbyn allowed a free vote instead of a "whipped" one, and 66 Labour MPs proceeded to vote for air strikes.)

Any notion that Corbyn's elevation would prompt him to moderate his views quickly dissipated once he began recruiting his team. His shadow chancellor, John McDonnell, is one of the few people in Parliament as extreme as he is. While serving as a London councillor in the 1980s, McDonnell lambasted Neil Kinnock, the relatively hard-left Labour leader defeated by Margaret Thatcher, as a "scab." A fervent supporter of the IRA during the Northern Ireland troubles, McDonnell endorsed "the ballot, the bullet, and the bomb" and once half-joked that any MP who refused to meet with the "Provisionals" running the terror war against Great Britain should be "kneecapped" (the traditional Provo punishment involving the shattering of someone's knee with a shotgun blast). Recently he made headlines by waving a copy of Mao's Little Red Book at George Osborne, the Chancellor of the Exchequer. As Nick Cohen has written of Corbyn and his circle: "These are not decent, well-meaning men who want to take Labour back to its roots…they are genuine extremists from a foul tradition, which has never before played a significant role in Labour Party history."
 
During Corbyn's first week as leader, he refused to sing the national anthem at a service commemorating the Battle of Britain, presumably because, as a diehard anti-monarchist, he disagrees with the lyric "God save our Queen." Soon after, he declared that as a staunch opponent of Britain's nuclear arsenal, he would not push the button even if the country were attacked.

He expressed unease at the assassination by drone strike of the infamous British ISIS terrorist “Jihadi John.” Corbyn said it would have been “far better” had the beheader been arrested and tried in court. (He did not say how he envisaged Jihadi John ever being subject to arrest, let alone concede that such a thing could happen only due to military action against ISIS, which he opposes).

Corbyn’s reaction to the Paris attacks prompted fury from the right and despair in his own party. He seemed oddly unmoved and certainly not provoked to any sort of anger by the horror. Indeed, he lost his chance to score some easy points against Prime Minister Cameron’s posturing. Cameron, trying to play tough in the wake of military and policing cuts, announced that British security forces would now “shoot to kill” in the event of a terrorist attack in the UK—as if the normal procedure would be to shoot to wound. Any normal Labour leader of the last seven decades would have taken the prime minister to task for empty rhetoric while reminding the public of Labour’s traditional hard stance against terrorism in Northern Ireland and elsewhere. Instead, Corbyn bleated that he was “not happy” with a shoot-to-kill policy. It was “quite dangerous,” he declared. “And I think can often be counterproductive.”

While there is no question that Labour has suffered a titanic meltdown, and that Corbyn’s triumph may mean the end of Labour as we know it, it’s not yet clear whether Corbyn is truly as electorally toxic as the mainstream media and political class believe him to be. What some observers within Labour fear is that Corbyn could indeed become prime minister after having transformed the party into a very different organization and having shifted the balance of British politics far to the left.

They concede that there is little chance of Corbyn’s ever winning over the 2–3 million swing voters of “middle England” who have decided recent elections. But they worry that in a rerun of the leadership election, Corbyn might be able to recruit a million or more new, young voters who have no memory of the Cold War, let alone Labour’s failures in the 1970s, and who think that he is offering something fresh and new.

It might not only be naive young people who would vote for Corbyn despite his apparent lack of parliamentary or leadership skills. In Britain, there is a growing disdain for, and distrust of, slick professional politicians—and for good reason. It’s not hard to seem sincere or refreshingly possessed of genuine political convictions if you’re going up against someone like David Cameron, who even more than Tony Blair can exude cynicism, smugness, and a branding executive’s patronizing contempt for the public. The fact that Corbyn is relatively old and unglamorous might also play in his favor; the British public is tired of glib, photogenic, boyish men. Corbyn and McDonnell are “an authentic alternative to the focus-group-obsessed poll-driven policies of the Blair days,” Cohen writes—but it is an authenticity based in “authentic far-left prejudices and hypocrisies.” Those prejudices and hypocrisies could sound a death knell for Britain’s historic role in advancing the Western idea—an idea that is, in large measure, this country’s greatest achievement.

https://www.commentarymagazine.com/articles/jeremy-corbyn-end-west/

In Britain and Across the World, An Age-Old Schism Becomes Ever More Bitter (Sunday Times, 3 January 2016)
http://jonathanforeman.info/in-britain-and-across-the-world-an-age-old-schism-becomes-ever-more-bitter-sunday-times-3-jan-2016/
http://www.thesundaytimes.co.uk/sto/news/focus/article1652282.ece

Original Version:

For most Westerners the Shia-Sunni conflict has been a confusing but distant phenomenon that rumbles along in the background of Middle Eastern and South Asian politics: an obscure theological dispute within Islam that only makes headlines when a Shia mosque is destroyed in Pakistan or Hezbollah blows up a Sunni leader in Lebanon. But the rumble has been getting louder, and yesterday's execution of the prominent Shia cleric Sheikh Nimr al-Nimr by the Saudi authorities could well turn it into a roar that will echo throughout the Middle East and beyond.

            Since the souring of the Arab Spring, and especially since the beginning of the civil war in Syria, outsiders have become more aware that this ancient sectarian division is reflected in the struggle between two bitterly opposed power blocs in the region: a conservative Sunni one led by the Saudis and a Shia one led and inspired by Iran. Their various proxy militias are fighting each other not just in Iraq and Syria but also in Lebanon and Yemen. As bad as that may seem, this war may well be about to expand to include divided countries like Bahrain and even the Saudi kingdom itself. And it is far from unlikely that the sectarian struggle could spread much further, even into our own cities.

            Having misunderstood and even fostered Sunni-Shia tensions in the past, Western countries have tended to underestimate the importance of the Sunni-Shia divide in recent decades. You could see this in the poor planning for the Iraq war and in the fact that many Western media organizations covering that war initially had no idea if their translators and fixers were members of the Sunni minority and therefore likely to be supporters of the Saddam regime. 

            Sunnis of course make up the vast majority of Muslims around the world, and most of them are willing to live alongside Shia even if they don't like or respect their beliefs. But hardline Sunnis and Salafists refer to Shia Muslims as Rafidah, a strongly pejorative term that roughly translates as "the rejectors" (a reference to the Shiite rejection of the first Caliphs in favour of Ali, the prophet Muhammad's cousin and son-in-law), and see them as polytheists. In their eyes the Shia, as apostates, are even worse and more deserving of death than Jews and Christians.

            You can sense this even in the UK. Most British Muslims are Sunni. The concern expressed by British Muslims about the killing of the faithful in wars abroad never extended to the tens of thousands of Shia civilians slaughtered in mosques and market places during the Iraq war. Nor have there ever been any demonstrations against the large-scale killing of Shia Hazaras by the Taliban, or murderous attacks on Shiite places of worship in Pakistan.  

            One line of Salafist thought sees the Shia as a fifth column set up to destroy Islam (by the Jews of course), and blames Shiite traitors for every Muslim defeat from the Crusades onwards. Hardline Shia are equally hostile to the Sunnis but have rarely been in a position to persecute them.

            The mutual suspicion runs deeper than a theological dispute. It is political in that Sunni rulers fear that Shia minorities (or majorities in the case of Bahrain) may be more loyal to Tehran than the countries they live in. But it can also take bizarre forms: In Lebanon, both Sunni and Shia believe that the other community is prone to disgusting sexual immorality, and there are said to be some Sunnis who think that Shiites have little tails.

            In any case it is often hard for non-Muslim Westerners to get a sense of the depth and intensity of Sunni-Shia hostility or to understand when that hostility is likely to overwhelm or be overwhelmed by other political or ethnic concerns.

            Iraq’s Sunni Kurds were long happy to ally with the country’s Shiite Arabs against the ruling Sunni Arab minority. In Gaza Hamas is willing to accept support from Shia Iran while Islamic Jihad is not. On the other hand, senior Salafist clerics in Saudi Arabia celebrated Israel’s recent killing of a Hezbollah leader.

            Recent developments have made Sunni-Shia hostility more lethal and more dangerous. One is the massive global missionary effort funded by Salafist and Wahabi princes in Saudi Arabia and other parts of the Gulf. Another is the growing power and aggression of Iran, its remarkably successful drive to establish a “Shia Crescent” from Iran through Iraq into Syria and Lebanon. A third is the weakening of forces that used to dilute Sunni-Shia hostility, such as secular Arab nationalism and Pan-Arabism.

             A fourth is the sheer ferocity of the fighting in Syria. A leading Saudi cleric, Mohammad al-Barrak, recently tweeted that Shiites are more harmful to the faithful than the Jews "because [the Shiites'] crimes in four years have exceeded all the Jews' crimes in 60 years".

             How bad could things get? Both the Saudi and Bahraini monarchies could face genuine uprisings. Lebanon is already on the brink of another civil war. But even more frightening perhaps is the prospect of attacks on Shia targets in the many countries and cities around the world – including in Europe – where there are Shia minorities surrounded by Sunni majorities. These could and probably would lead to reprisal terror attacks, most likely by Hezbollah and Iran’s Revolutionary Guards, both of whom have carried out operations as far away from the Middle East as Argentina. Much depends on whether the Saudi monarchy can appease or control its own furious Shiite minority, and whether calmer heads will prevail in both communities around the world. 

Does US Foreign Aid Really Do Good? (Washington Examiner Magazine, 09/27/15)
http://jonathanforeman.info/does-us-foreign-aid-really-do-good-washington-examiner-magazine-092715/

About a decade ago, a five-car convoy of Toyota Land Cruisers pulled up in a cloud of dust at a remote village on the edge of a South Asian mountain range. The passengers, all of them Westerners apart from an interpreter, walked over to where a canopy had been set up by an advance team the previous day. About 25 villagers were already there, enticed by free cookies and snacks.

One of the new arrivals gave a quick talk that was translated into a local language and then the others handed leaflets to the villagers. Then the foreigners climbed into their Land Cruisers and raced back to the relative safety and comfort of the capital. The leaflets concerned a micro-finance scheme, and the men and women handing them out were part of a project sponsored by the World Bank.

Not a single person in the village ever read the leaflets, for the simple reason that no one in the village could read, a problem that had apparently not occurred to the people running the project. Nevertheless, the forms sent to Washington would, no doubt, confirm that outreach had taken place, that awareness had been raised, and that a key step had been taken in the process of helping members of an impoverished community help themselves.

This expedition took place in Afghanistan, but it could have been in any of a dozen heavily aided countries. While it would be an exaggeration to say that no local person benefited from this particular project (after all, its foreign and local employees probably contributed a good deal to the capital’s economy), its wastefulness was arguably a betrayal both of the taxpayers who funded it and of its purported beneficiaries.

If that weren't bad enough, even if this particular project had been better conceived and executed, and awareness really had been raised, it probably wouldn't have done much good. That is because micro-finance, celebrated as a development panacea, simply doesn't work in certain cultures. It can be successful, especially in quasi-matriarchal societies such as Bangladesh, where it was invented, but it has abjectly failed in violently macho cultures like those of Rajasthan or Pashtun Afghanistan.

The point of this story isn’t to imply that all aid is a similarly arrogant waste of effort and money, but to serve as a counter-anecdote: a reminder that real aid requires more than just good intentions, and a snapshot of the realities that all too often lie behind the heartwarming imagery and simplistic appeals to compassion used by aid advocates when selling the work of their vast global industry to the public.
 

To be fair, the aid industry has in the last couple of decades come to acknowledge that good intentions are not enough. Hence the conferences and academic papers on “aid effectiveness,” the shift to “evidence-based aid” and the increasingly rigorous efforts to understand what programs work with real people in specific cultures. Critics, skeptics and disillusioned practitioners such as William Easterly, an economics professor at New York University, are now given a grudging hearing rather than ignored or dismissed as apostles of heartlessness.

That is not quite the same as conceding that seven decades and trillions of dollars in development aid have had remarkably disappointing results, in stark contrast to the Marshall Plan that was its original inspiration. And you will rarely encounter any acknowledgement that those countries that have emerged from long-endured poverty and underdevelopment, for instance South Korea after the 1960s, or some of today’s booming African economies, have done so for reasons unconnected with aid.

Another awkward fact is that many of the attempts to bring accountability, transparency and value for money to the enterprise of development aid have actually made it less rather than more effective. U.S. aid efforts are especially compromised by the oversight requirements that would be comical if they didn’t do such a disservice to both the taxpayer and the theoretical beneficiaries of aid.

USAID in particular is notorious for an obsession with “metrics” strongly reminiscent of the McNamara approach to “winning” the Vietnam war; an approach that inevitably prompts managers to favor projects that produce crunchable data, no matter how useless those projects might otherwise be.

Moreover, as the anecdote above should suggest, a great deal of aid data isn’t worth the time it took to input into a spreadsheet. The more impoverished, chaotic and badly governed an aid-receiving country is, the less you can or should rely on official data or even aid agency estimates of its birth rates, population, mortality, literacy, family size, income. Most statistics from basket-case countries, those in which it is too difficult or dangerous for researchers and officials to visit villages far from the capital, are a combination of guesswork and garbage.

No one knows, for example, how many people live in countries like Afghanistan that have not had a census in decades, let alone how much they live on or how long they live. Often, statistics from even the largest and best-funded aid organizations are based on marketing needs rather than rigorous research. For instance, last year a U.N. agency claimed that malnutrition has gotten worse in Afghanistan since the overthrow of the Taliban, even though it’s almost impossible to know with any degree of accuracy how good or bad malnutrition was anywhere in the country in 2001 or how bad it is in large swaths of the country today.

In general, those who market development or humanitarian assistance to the public are still unwilling to admit that delivering effective aid is difficult in the best of circumstances, and even harder in the ill-governed, chaotic, impoverished societies where it seems most needed. They are even less likely to confront the reality that foreign aid all too often does actual harm.

This awkward fact is true of both development aid and humanitarian or emergency aid. The former accounts for more than 85 percent of American foreign aid even though it’s less visible to and much less understood by the public. And, if you take seriously the criticism coming from a growing number of African dissidents, activists and intellectuals, it has contributed massively to the corruption, misgovernment and tyranny that has kept their nations mired in misery.

Even before people such as Zambia’s Dambisa Moyo, Uganda’s Andrew Mwenda and Ghana’s George Ayittey became a public relations nightmare for the aid industry, some economists had noted a correlation between being aided on a huge scale and subsequently experiencing economic, political and social catastrophe.

It was after the great increase in aid to sub-Saharan Africa that began in 1970 that per capita income dropped and many African countries endured negative growth. Other circumstantial evidence for aid as a corrosive force includes the fact that the countries that have received the most non-military foreign aid in the last six decades have a disproportionate tendency to collapse into anarchy: Besides Afghanistan and Iraq, the countries that have received the most aid per capita include Somalia, pre-earthquake Haiti, Liberia, Nepal, Zaire and the Palestinian territories.

It’s almost as hard to measure the alleged harm inflicted by aid as it is to find reliable and truly relevant metrics for aid success. On the other hand, the evident failure of many heavily aided societies speaks volumes.

How aid feeds corruption

As Mwenda said, having such a huge source of unearned revenue allows the government to avoid accountability to the citizenry. This is true of his own Uganda, where foreign aid accounts for 50 percent of the government's budget. There, President Yoweri Museveni, once hailed as a model of modern, democratic African leadership, has responded to the generosity of the rich world not by pursuing the U.N. Millennium Development Goals, but instead by purchasing top-of-the-line Russian Su-30 warplanes for his air force and a Gulfstream private jet for himself.

Nor is it just in Uganda that foreign aid actually seems to discourage what donors would regard as good behavior. A recent study in the Lancet, the British medical journal, showed that aid funding earmarked to supplement healthcare budgets in Africa invariably prompted recipient governments to decrease their own contributions.

It also enables such governments to avoid or postpone necessary reforms, such as the establishment of a working tax system. In Pakistan, for example, a country with a significant middle class as well as a wealthy ruling elite, less than 1 percent of the population pays income taxes. Because states with little or no income from taxation cannot afford to pay decent salaries, this makes large-scale official corruption and extortion all but inevitable.

Aid feeds corruption in other ways as well. This is partly because large-scale, state-to-state aid has the same economic and political effects as the discovery of a natural resource like oil. But it is also because so many aid agencies will do almost anything to ensure that their good works can continue.

This is especially true in humanitarian intervention. Disasters such as earthquakes and tsunamis can be tremendous windfalls for ruthless officials in places such as Sri Lanka, India and Pakistan. Their people know that if they want to help the poor and vulnerable, they will have to pay bribes to government officials. And the government officials know that the agencies they are extorting will never close their offices and pull out rather than pay upfront.

Large inflows of development aid also seem to encourage political instability. This makes sense to the extent that once the state becomes the sole source of wealth and leverage, getting control of it for one’s own party or tribe becomes all the more important, certainly worth cheating, fighting and killing to secure.

Aid can also encourage a dependency that is not just morally problematic, but also dangerous. Food aid is particularly destructive. When foreign aid agencies hand out grain, it bankrupts local farmers or at least discourages them from sowing next year’s crops, all but guaranteeing future shortages.

[Photo: Afghan people carry their ration of American wheat from the United Nations High Commission for Refugees (UNHCR) food distribution center in Kabul, Afghanistan, Tuesday, July 9, 2002. AP Photo/Sergei Grits]

At the same time, governments that ought to be preparing for the next famine don't bother because they assume that the foreigners will deal with the problem. The United States is by far the worst offender in this regard. Its food aid programs are now and have always been little more than a corporate welfare program for American agribusiness: They boost the bottom line of companies such as Cargill while wreaking deadly havoc abroad.

On the other hand, the United States has pioneered aid to encourage the civil society organizations that are essential checks on “poor governance,” that is, irresponsible and corrupt government officials. Unfortunately, such efforts are often undermined by other forms of aid such as budget support. After all, it’s deeply discouraging for third world anti-corruption campaigners, civil society organizations and political dissidents when they see foreign aid agencies talk about the importance of good governance, democracy and human rights while handing over yet more money to tyrants and kleptocrats.

One of the less dramatic but no less damaging side effects of humanitarian aid is the distortion of local economy when aid agencies arrive to set up refugee camps or hand out emergency rations. Not only do prices go up for everything from water to fuel, but professionals abandon their jobs to work as interpreters and drivers. The standard aid agency/media salary of $100 per day can be more than a doctor makes in a month.

Then there are the “taxes” that the agencies routinely pay local warlords or garrison commanders to secure permission to operate or in return for “security” in dangerous regions. These payments sometimes take the form of food, radios or even vehicles. As a result, the armed payee is not only wealthier, and better able to continue the fight against his rivals, he also gains vital prestige; local people see that foreigners pay court to him.

Sometimes agencies go further and allow militias or equally vicious army units or oppressive political parties to control who gets food and water. This notoriously happened in the Hutu refugee camps in Goma and happens today in parts of Ethiopia.

Some moral compromise is inevitable in the grueling, dangerous business of emergency aid. But again and again, as critics Linda Polman, David Rieff and Michael Maren have shown, aid agencies have followed the path of “Apocalypse Now’s” Col. Kurtz in pursuit of their ideals. They have become the enablers and accomplices of murderous militias and brutal regimes, have prolonged wars, and have even collaborated in forced relocations. The refugee camps they operate have become sanctuaries for terrorists and rear bases for guerrilla armies.

The most infamous example of this was the aid complex that grew up in Goma, in what is now eastern Congo but was then Zaire, in the wake of the Rwandan genocide. There, as chronicled by Linda Polman in her devastating book War Games, the world’s aid agencies and NGOs competed fiercely to help the Hutu Power genocidaires who had fled Rwanda with their families.

As so often happens, they ran the refugee camps, taxing the population, taking vehicles and equipment when they needed it, and controlling the supply of food to civilians so as to favor their members. Even worse, they used the Goma refugee camps as bases for murderous raids into Rwanda. The massacres they carried out stopped only after the army of the new Rwandan government crossed the border and overran the camps.

As Rieff pointed out, an analogous situation would have been if, at the end of World War II, an SS brigade had fled the death camps it was administering, taken refuge along with its families in Switzerland, and then, fed by aid workers, raided into Germany in an effort to kill yet more Jews.

There are many other examples of conflict being fomented and prolonged by those housing and aiding refugees, accidentally or deliberately. Refugee warriors, as some have called them, operating from the sanctuary of camps established by the United Nations High Commissioner for Refugees and others, have created mayhem everywhere from the Thai-Cambodia border to Central America and the Middle East.

Sometimes aid agencies have allowed this to happen as a result of ignorance. Sometimes it’s a matter of Red-Cross-style humanitarian ideology taken to the edge: a conviction that even the guilty need to be fed or a belief that providing security in refugee camps would be an abandonment of neutrality. And sometimes it’s because those providing aid are supporting one side in a conflict, as the U.S. and other Western countries did from Pakistan during the Soviet-Afghan war.

For decades, Syria, Lebanon and Jordan allowed or encouraged Palestinian refugee camps to become bases for guerrilla and terrorist activity. This should make it clear that the aid world’s traditional ways of dealing with refugee flows are inadequate. Even purely civilian camps such as Zaatari, the sea of tented misery in Jordan that houses tens of thousands of Syrians, quickly became hotbeds of radicalism and sinkholes of crime and violence, not least because they are unpoliced and because they are filled with working-age men with nothing to do.

The American way of aid

A wounded Laotian soldier (wounded in action at Long Cheng mid February) walks to his cot at the USAID managed and financed hospital at Ban Xon in February 1971, during the Nixon administration. (AP Photo/Horst Faas)

American foreign assistance is carried out by a number of different agencies. USAID, founded in 1961, continues to be the largest and most important. Its priorities have shifted over the years.

During the Kennedy and Johnson administrations, USAID emphasized the development of infrastructure and embarked on large-scale projects modeled on the Tennessee Valley Authority. President Nixon took American aid in a different direction, working with Hubert Humphrey to pass the so-called “New Directions” legislation that prompted a new emphasis on health, education and rural development. It wasn’t until the Reagan administration that USAID began to emphasize democracy and governance.

The next revolution in American foreign aid took place during the administration of George W. Bush, who almost tripled USAID’s budget. The Bush administration also began two huge aid initiatives outside USAID: PEPFAR, America’s HIV/AIDS program, which has been a great success, and the Millennium Challenge Corp. But the most radical Bush administration shift in aid policy may have been its increase in aid to Africa. Among other initiatives, Bush more than quadrupled funding for education on the continent. It was subsequently cut by the Obama administration.

Although aid is traditionally divided into two main types, development aid and humanitarian aid, one can also categorize American aid in terms of the places it’s sent, bearing in mind that at least some American assistance goes to around 100 countries.

There is aid given to genuinely poor countries in an honest effort to help needy people there. Then there is aid to relatively wealthy states whose elites are too irresponsible to take care of their own people. A good example is the aid the United States sends to India, a country that can afford to send rockets to Mars and which has its own growing aid program, but whose ruling elite is content to tolerate rates of malnutrition, illiteracy and curable disease that are worse than those of sub-Saharan Africa.

A third category is aid given as a foreign policy bribe. This is not the same thing as aid used as a tool of public diplomacy, because its target is a foreign country’s ruling elite. The most obvious examples are Egypt and Pakistan. America gives Egypt money and in return the enriched Egyptian military, with its prestigious American weaponry, promises not to attack Israel. U.S. aid to Egypt has preserved peace, but it has not been successful in its secondary purpose of promoting economic development and political stability.

Aid to Pakistan is arguably less successful. Its purpose was to persuade the Pakistani military and intelligence establishment to reduce its sponsorship of Islamist terrorism in the region and in particular its murderous efforts to destabilize the U.S.-supported government in Afghanistan in favor of its Taliban clients.

A fourth category is aid given as part of what was called the war on terrorism, which has been dominated by reconstruction efforts in Iraq and Afghanistan.

A fifth, linked category is aid used for the purpose of public diplomacy. This has become increasingly controversial in the aid community.

Controversies about the utility and effectiveness of aid do not necessarily break down along conventional Left vs. Right ideological lines. Interestingly, people who identify with the Left rather than the Right have recently argued that foreign aid does not win friends for America and should not be seen as a useful tool of public diplomacy.

They often refer to Pakistan and a study that showed that American humanitarian aid after the 2005 Kashmir earthquake did not have a lasting positive effect on Pakistani attitudes toward the United States. This is a problematic argument, not least because Pakistan is a special case. It is a heavily aided country in which key state actors foster anti-Americanism and have done so for a long time. An American rescue effort in one corner of the country was never likely to win over the population, especially as the state played down that effort in order to make its own efforts seem less feeble.

Moreover, those who insist that aid does not win friends abroad or influence foreign populations may have philosophical and ideological reasons for taking such a view. Many in the aid industry prefer to see aid as something that should be given without regard to any benefit to the donor country, other than that feeling of having done the right thing that comes from an altruistic act. Others are politically hostile to efforts by Western governments to win hearts and minds as part of the war on terrorism.

My experiences in aided countries in Africa and South Asia tend to contradict this argument. In Somalia, for example, the vital but decayed highway between the capital and the coast is still referred to affectionately as “the Chinese Road” some three decades after it was built.

It’s also no secret that in many parts of Africa, you encounter positive attitudes to contemporary China thanks to more recent infrastructure projects, despite the abuses and corruption that so often accompany Chinese economic activity.

A general view of a refugee camp is seen in the city of Kabul, Afghanistan, Thursday, June 12, 2008. (AP Photo/Musadeq Sadeq)

In Afghanistan’s Panjsher valley, any local will tell you how grateful the people are for the bridges built by the U.S. Provincial Reconstruction Team before it closed. This is a localized response to a local benefit. It would be naive, though not unusual on the part of U.S. officials, to expect people in other Afghan localities to be grateful for help given to their fellow countrymen.

At the same time, there seems to be evidence that bad aid makes things worse. That could take the shape of shoddy or failed projects, projects that employ disliked outsiders, and programs that everyone knows have been commandeered or ripped off by corrupt local officials.

It is probably fair to say that effective foreign aid can win friends for America, but mainly on a local basis and only if it reflects genuine local needs and preferences, and if the beneficiary population is not already steeped in anti-American prejudice.

Aid and Afghanistan

Anyone who follows media reports about the American-led reconstruction effort in Afghanistan — the greatest aid effort since the Marshall Plan — could be forgiven for thinking it has been a total disaster. But anyone who has spent time there and seen how much has changed since 2001 knows that this is nonsense. The millions of girls in school, the physical and economic transformation of Kabul and other cities, the smooth highways that make commerce possible are only the most obvious manifestations of success. At the same time, the waste, theft, corruption and incompetence are at least as spectacular as these achievements.

It is not clear that Afghanistan is a radically more corrupt society than other countries that have been the target of major aid efforts, that its ruling elite is uniquely irresponsible, cynical and self-interested, or that foreign government agencies and NGOs working there have been especially naive and incompetent.

But it’s important to remember that aid to Afghanistan is not just on a uniquely large scale, offering vast opportunities for theft, misdirection and waste. It’s also much more closely scrutinized than any other aid effort in history.

Afghan police officers drag a sack full of blankets. Soldiers were meeting with village representatives to assess their needs, provide humanitarian aid assistance and to gain intelligence about the region. (AP Photo/Rafiq Maqbool)

Nothing like the same level of skeptical attention has ever been paid by media organizations to development or humanitarian aid programs in sub-Saharan Africa or South Asia. Nor has there been an equivalent of the Special Inspector General for Afghanistan Reconstruction turning a jaundiced eye on big, notoriously inefficient U.N. agencies such as UNHCR, or the efforts of giant nonprofits such as Oxfam and Save the Children.

On the other hand, Afghanistan may well be uniquely infertile ground for development aid, thanks to decades of brutalizing war, a historically feeble state whose primary function has been preying on those who lack access to effective armed force, and a traditional political culture in which no one expects government officials to be better than licensed bandits.

Much of the controversy that has accompanied the aid effort in Afghanistan has involved criticism of work by the Defense Department and the military.

No one who has seen how weapons systems are procured for the U.S. military would be surprised if some Defense Department-funded aid projects in Afghanistan turned out to be wasteful, ill-considered and poorly administered. But whether they are that much worse than efforts funded by other government departments such as the State Department or USAID is another question. That they have tended to attract particular opprobrium from news media and the special IG could simply reflect institutional dislike of the military or opposition to the Afghan war.

There is evidence that in many places the military did a better job of providing aid than USAID and the rest of the aid establishment. This was partly because the military wasn’t hamstrung by security concerns; unlike USAID, its employees were willing and able to go anywhere in the country. Local commanders with the ability to hand out funds may have lacked development experience, but they were there where help was needed and, unlike many aid professionals, saw no shame in asking locals what assistance they wanted.

USAID’s bureaucratic, box-ticking approach was arguably unsuitable for a country as damaged, impoverished, misgoverned, traumatized and dysfunctional as Afghanistan. Where the military decided to build schools, it did so quickly and efficiently, assuming that American or Afghan aid agencies would then find teachers, buy schoolbooks and make the projects sustainable.

USAID, by contrast, was required to get the relevant permissions from the ministry of education in Kabul and then provincial ministries, both of which were incompetent and corrupt, and was so slow in the execution of its mandates that its tardiness threatened to undermine the war for hearts and minds.

The Obama administration and aid

Despite what one might have expected from the candidate’s internationalist rhetoric, foreign aid was far from a priority for the first Obama administration. Key positions such as the head of the Office of U.S. Foreign Disaster Assistance went unfilled for an unconscionably long time, and overall aid spending fell. To the extent that the administration paid attention to foreign aid, its primary concern seems to have been to reverse or undo the priorities of the Bush administration.

Accordingly, efforts to promote democracy and civil society in third world countries were defunded. Countries that had been given more aid as an apparent reward for joining the international coalition in Iraq were now penalized for the same reason.

The second Obama administration has seen a relative normalization of aid policy and an increase in overall aid spending. Although democracy promotion is not the priority it was during the Bush years, it is still a sufficiently important part of U.S. foreign aid to cause USAID to be expelled from Bolivia and Ecuador. In both cases, USAID was targeted by authoritarian left-wing governments for supporting the kind of civil society organizations that can make a genuine difference to bad governance in poor countries.

But normalization is not necessarily a good thing. It means that the United States is still committed to the U.N.’s absurd Millennium Development Goals, a vast utopian list of targets that was always hopelessly unrealistic and whose realization would, as Rieff has put it, amount to “quite literally, the salvation of humanity.”

It also means that those guiding America’s aid efforts continue to be naively enthusiastic about cooperation with big business and to put excessive faith in the potential of high technology to solve third world problems. It has become increasingly clear that the Gates Foundation and other new philanthropic giants are influencing the overall direction of U.S. development aid in undesirable ways.

In particular, it has meant a heightened, even feverish emphasis on technological solutions for development problems, as if cheap laptops or genetically modified crops might really be the magic bullet that “ends poverty.” As David Rieff has pointed out, such “techno-messianism” has often failed in the past. If you have a high-tech cyberhammer, all problems start to look like nails.

But reducing poverty, promoting economic growth and rescuing failing states are — this is the real lesson of six decades of development aid — extremely difficult and complicated things to achieve. One gain can lead to new problems in the way that lowering infant mortality may have contributed to overpopulation and therefore malnutrition and even starvation in some African societies.

The challenge presented by the influence of Gates and other tech billionaires on American government aid policy is not just a matter of a techno-fetishism even more intense than that of the rest of American society. It’s also a matter of priorities, of which problems get the most attention. It may be that the only thing worse than aid directed by ignorant box-ticking bureaucrats or by self-serving aid industry ideologues, is aid directed by the spouses of Silicon Valley billionaires.

The way forward

Foreign aid has so far not been a key topic for presidential candidates. Those who have said anything on the subject have tended to say relatively uncontroversial things.

Hillary Clinton is especially keen on aid that benefits women and girls. Jeb Bush believes that aid is a vital instrument of U.S. foreign policy and approves of the administration’s aid boost to Central America. Marco Rubio is, perhaps surprisingly, a stalwart advocate of foreign aid, though not, of course, to Cuba while it remains in the Castro family’s grasp. His fellow Republicans Chris Christie and Mike Huckabee also see aid as key to America’s moral authority, though the latter is particularly enthusiastic about faith-based aid efforts.

On the other hand, both Rand Paul and Rick Perry have expressed a libertarian or isolationist suspicion of foreign aid, although the latter has also indicated that he thinks aid should be used more explicitly as a foreign policy lever, calling for aid to Mexico and Central American states to be withheld until those countries do more to stop the flow of immigrants to the United States. This matters less now that he has dropped out of the race.

Ted Cruz supported Paul’s 2013 proposal to withhold aid from Egypt after the military coup that ousted President Mohammed Morsi, but does not seem to be against the idea of giving aid to key allies. On the other hand, Cruz did say that same year that “we need to stop sending foreign aid to nations that hate us.”

Donation goods from U.S. military, are seen hung by parachutes over the earthquake-hit area in Pakistan on Friday Oct. 14, 2005. (AP Photo/ Musadeq Sadeq, Pool)

It’s a reasonable sentiment that could resonate with the public. Carrying out such a policy switch would entail stopping aid to Pakistan (one of the countries where American aid is not only unappreciated but seems actually to feed anti-American resentment), the Palestinian territories (whose citizens are per capita the most aided people on Earth), Turkey, China and Russia.

Whatever Republican and Democratic candidates say now, it seems unlikely that questions of cutting or boosting or reforming foreign aid will play a major role in the 2016 election. That is, unless the migrant and refugee crises in the Middle East and Europe get so much worse that there are loud and popular calls for Washington to intervene in some way.

If that does happen, then it will probably be the military that once again leads an American humanitarian effort, assisted, with the usual reservations and resentments, by USAID and other agencies. It is worth remembering that during the Asian tsunami and Philippine typhoon disasters, no aid agency rescued as many people, did as much good or could have done as much good as the United States Navy, and that this was a source of pride for most Americans.

http://www.washingtonexaminer.com/george-w.-bush-top-target-in-dem-and-gop-debates/article/2574296

The Battle of Britain (Weekly Standard March 12, 2001) http://jonathanforeman.info/the-battle-of-britain-weekly-standard-march-12-2001/

Is the Sun Setting on the United Kingdom?

The Abolition of Britain – From Winston Churchill to Princess Diana, by Peter Hitchens

A few years ago, I was hiking up to an observatory in Georgetown on the Malaysian island of Penang. On the steep, winding road to the top, I fell into conversation with a well-dressed middle-aged man, a Malaysian Chinese, who told me about the problems his daughter faced getting into university because of the regime’s nastily racist program that favored ethnic Malays and penalized the ethnic Chinese minority. It was unfair, unjust. “You’re British,” he said. “You should do something about this.”

It was touching and not a little sad that he thought British influence still counted for so much, and that he automatically associated the concept of fair play with the former colonial power. From a historical point of view, he wasn’t entirely mistaken: Over the centuries, many people — African slaves in agony in the Middle Passage, Hindu widows being burned alive, Indian travelers strangled by religious lunatics, Belgian civilians brutalized by Wilhelmine soldiery, and Jews being kicked to death by Nazi brownshirts — have all wanted the British to do something about it, and eventually they did.

But then Britain and its prestige are perceived differently abroad than at home these days — especially by the political class. When Peter Hitchens, the former Trotskyite who is now Britain’s most forthright conservative pundit, laments the “abolition of Britain,” he isn’t talking just about the Blair government’s formal destruction of the United Kingdom as a unitary state or even the modernizing Kulturkampf against such vestiges of the imperialist, racist, class-ridden past as the breeches worn by the Lord Chancellor and the popular Royal Tournament show of military pageantry.

He’s also talking about the long-term shift in national self-perception that allowed all this to happen — a shift, strangely enough, that accelerated as Britain left the strikebound malaise of the late 1970s for the prosperity of the 1980s and 1990s. Essentially, the British seem to have reacted, rather belatedly, to the loss of empire with an orgy of self-contempt. Pushed along by a middle-class minority who passionately desire the submersion of Britain in a European superstate, this peculiar self-loathing has made the British particularly vulnerable to a virulent form of PC multiculturalism and to the idea that Britain’s institutions and traditions are, at best, outmoded and absurd.

“We allowed our patriotism to be turned into a joke, wise sexual restraint to be mocked as prudery, our families to be defamed as nests of violence, loathing, and abuse, our literature to be tossed aside as so much garbage, and our church turned into a department of the Social Security system,” Hitchens writes in his concluding chapter.

We let our schools become nurseries of resentment and ignorance, and humiliated our universities by forcing them to take unqualified students in large numbers. . . . We abandoned a coinage which. . . . spoke of tradition and authority. . . . We tore up every familiar thing in our landscape, adopted a means of transport wholly unfitted to our small crowded island, demolished the hearts of hundreds of handsome towns and cities, and in the meantime we castrated our criminal law, because we no longer knew what was right or wrong.

Some of these changes were organic and others artificial (though Hitchens, to the detriment of his argument, rarely distinguishes the two). Some were initiated by Labour governments, but a surprising number were the work of Conservative administrations.

So, for instance, the Foreign Office under Margaret Thatcher pursued a relentless policy of post-imperial betrayal, beginning with hints to the Argentines that Britain no longer cared about the Falkland Islands and culminating in the selling of the people of Hong Kong to Communist China — after first removing their right to reside in the United Kingdom, so they’d have no leverage and nowhere to run.

And so, for another instance, the Tories under John Major took the country deeper into the European Union — while reciting the mantra that further integration into the emerging superstate was the only way Britain could hope to exert any influence, now that it was merely a “fourth-rate power.” (This phrase is always delivered in tones of such gloomy satisfaction that no one notices that such a “rating” ignores factors like economic strength, nuclear deterrents, seats on the U.N. Security Council, and cultural influence.)

But Tory surrenders of sovereignty pale beside the changes instituted by the “New Labour” government of Tony Blair. For the most part, the British population has been an unenthusiastic but oddly resigned witness to even more revolutionary changes. (Though the drive to abolish British currency and replace it with the Euro provoked a surprisingly vocal opposition.) The most important of these changes are the constitutional “reforms” carried out merely because the need for such changes was self-evident to the London media elite that calls the tune in British society.

The fact that the United Kingdom seemed to work — despite the oddness and antiquity and irrationalism of its constitutional arrangements — was declared irrelevant. Sure, it provided reasonable prosperity, liberty, and security at least as effectively as systems in use on the Continent (or across the Atlantic). Sure, it proved less vulnerable to economic and political storms than, say, the modern German state since 1870 or the various republics, empires, and monarchies that have ruled France since 1789. But that’s all ancient history. The key thing is that nothing about the old United Kingdom conforms to what the new British elite conceives of as “modernity.”

The idea that there might be risks in sudden, radical constitutional change, that for a constitution to be effective it needs legitimacy and the emotional allegiance of the people, is not one that Britain’s hyper-rationalist but parochial reformers have given much thought to, despite the warnings flashed from Yugoslavia. For the new public-sector middle class and the metropolitan media elite, a single idea is paramount: Britain is a musty, provincial place “held back” by dated, irrational institutions and a culture that wrongly venerates a history that is essentially a record of shame and oppression.

In its mildest form, this idea is manifested in the culturalist theory of British decline that influenced Thatcher as much as Blair: the idea that postwar economic failure is inextricably linked to the persistence in Britain of a culture of deference. Better policy might well have been found by asking instead how a pair of small islands off the coast of Europe managed to become the world’s most powerful nation for a century and a half, producing a fair number of the world’s best scientists, poets, admirals, and statesmen. But those old successes were dismissed. As the newly elected Tony Blair put it in 1997 — so memorably and tellingly, in marketing-man’s jargon — Britain desperately needs to be “rebranded” as a “young country.”

That the Blair government has been able to tear so much down in so short a time with so little effective opposition is one of the most fascinating mysteries of modern politics. After all, it’s rare for a perfectly viable system of government to be dismantled in a time of peace and prosperity. Peter Hitchens understands that Britain came to this pass because of a series of social and cultural changes, some of them inevitable results of postwar exhaustion and impoverishment, but many more of them the products of cultural and class warfare.

Unfortunately his Abolition of Britain is arranged in such a scattershot way that it conveys no real sense of either the chronology or the interplay of the various factors that broke British morale and allowed a resentful section of the population, without previous experience of power and responsibility, to make a revolution. Still, The Abolition of Britain is an entertaining and moving read that helps explain why certain key strata of the British middle classes are such enthusiasts for eliminating the things that make Britain unique. It offers a key to such mysteries as how the British state could actually prosecute merchants for using non-metric measures, jail a farmer for defending himself against brutal robbers, and arrest a man for the “racist” act of flying a flag above a pub.

There are so many effective anecdotes in Hitchens’s book that it is difficult to pick one as particularly telling. So, for symbolic concision, how about the abolition of the flag? It was in 1997, the year of Blair’s election, that British Airways removed the Union Jack flag from the tails of its aircraft and replaced it with “ethnic” designs that it hoped foreign customers would find more sympathetic.

The airline’s then-CEO, Robert Ayling, apparently feared that foreigners associated the British national flag with skinheads, soccer hooligans, and imperialism. This was not based, of course, on any polling of Africans or Asians or Europeans. But Ayling did know that the Union Jack is associated with skinheads and soccer hooligans and imperialism by the media folk and the professional middle classes who now control Britain. These are people far too well-educated and sophisticated to have any truck with anything as atavistic as national pride and who simply cannot conceive that anyone would see a Union Jack as a symbol of something positive. (Britain is not in fact a flag-waving country; its inhabitants have long been embarrassed by the kind of loud patriotism associated with their continental neighbors or the United States. But there’s a difference between this kind of reticence and actual hostility to the flag.)

Kipling once asked, “What do they know of England who only England know?” The Blairite elite, for all their vacations in French or Tuscan villages, have much less experience of the outside world than the imperial elite they replaced. It’s why they don’t know that the French, whom they worship, are utterly unembarrassed by the traditional pageantry being scourged in Britain and would not dream of deconcessioning the tricolore. Have the Blairites never seen the Communist deputies saluting, as mounted republican guardsmen in breastplates and horsehair plumes lead the Bastille Day parade, just in front of the tanks? Apparently not, which is another reason no one in the new ruling elite even questions the assumption that Britain is an embarrassingly Ruritanian society, long overdue for a thorough house-cleaning.

Still less do they doubt that a country properly cleansed of cringe-inducing vestiges of a quaint, elitist past like the changing of the guard, Oxbridge, red telephone boxes, hereditary peers, and the monarchy will be both more efficient and more popular with foreign tourists. For them it is an article of faith that new is better.

Alas for Peter Hitchens, impassioned, perceptive, and courageous though he is, the opposite is also an article of faith: For him, all change is bad. Hitchens actually laments the advent of central heating and double glazing, because families are no longer brought together by having to huddle around a single hearth. When he contrasts the Britain of Princess Diana’s funeral with the Britain of Churchill’s funeral, his case that everything has gotten worse includes the “crazed over-use of private cars” and “the disappearance of hats and the decline of coats.”

Indeed, if you were going to be harsh you might almost subtitle this book “A compendious diatribe of everything I hate about Britain today, with minor, aesthetic irritations given the same weight as the destruction of the constitution.” There’s a silly chapter in which Hitchens bemoans the famous trial of D. H. Lawrence’s Lady Chatterley’s Lover, which made it all but impossible for the British government to ban books on the grounds of obscenity. Then there’s his notion that the “American Occupation” of Britain from 1941 to 1945 introduced adultery to British womanhood — a claim that would have amused Lord Nelson and Lady Hamilton.

But the most bizarrely wrong chapter is the one that blames the satirical television and wireless programs of the late 1950s and early 1960s for destroying national unity. The idea that a culture that survived Alexander Pope and Jonathan Swift could be brought down by Dudley Moore and Alan Bennett is preposterous. And if comedy “made an entire class too ridiculous to rule,” then P. G. Wodehouse and perhaps even Charles Dickens are also to blame.

Of course many things are worse in Britain than they were during the 1950s, the decade that Hitchens takes as his paradigm for the real, lost Britain. Even people of the Left look with disgust upon Tony Blair’s “Cool Britannia” with its ubiquitous youth culture awash with drugs, its government by glib marketing men, its increasing corruption, the ever-spreading coarseness, and the startling ubiquity of violent crime (you’re now much, much more likely to be mugged or burgled in London than in New York).

But is it so terrible that the food is better, that there are sidewalk cafes, that middle- and even working-class people can afford to travel, that the state plays a smaller role in the nation’s economic life (though a far greater one in other realms)? Some of Hitchens’s nostalgia fixes on things that were not especially British, or not laudably so — like censorship, or the prosecution and blackmailing of homosexuals. Other things Hitchens sees as quintessentially British were, in fact, freakish phenomena of the postwar decades. In particular, the placidity and gentleness in those years was an artificial state, the result of exhaustion and wartime discipline.

Hitchens should know that for centuries European and other visitors were struck by the amazing pugnaciousness of the English and by their quick sentimentality. (Two enjoyable recent books, Jeremy Paxman’s The English and Paul Langford’s Englishness Identified, take up this topic.) From the eighteenth century on, Britons were seen even by their many European admirers as terrifyingly violent. That’s why small numbers of them were able to defeat large numbers of foreigners either on the continent or the battlefields of empire. The British soccer hooligan is a mere return to form. So, too, the Victorians were famous for their weeping: They kept emotional reserve for important moments, like when they were about to be tortured by Fuzzy Wuzzies.

It’s a shame The Abolition of Britain includes so much cranky fogeyism (including nostalgia for the flogging of teenage criminals). It’s a shame, because at its best this book combines superb reporting (especially about the hijacking of education by frustrated leftists) with a heartbreaking analysis of one of the strangest revolutions in history. And in many ways it is the most important of the torrent of books that have dealt with the crisis of British identity.

What Hitchens understands is that bourgeois New Labour is far more revolutionary than any government before — although, ironically, it learned just how easy it is to defy tradition and make radical constitutional changes from Margaret Thatcher, who abolished the Greater London Council merely because it was dominated by her political enemies. Hitchens rightly sees the New Labour “project” as a kind of politically correct Thatcherism with a punitive cultural agenda aimed at certain class enemies. The House of Commons’s vote to abolish fox hunting is a perfect example: an interference in British liberty enacted by our urban middle-class rulers in order to kick toffs in the teeth — one that will put thousands of rural working-class people out of work. When Labour was dominated by cloth-capped, working-class socialists, ownership of the means of production may have been at issue, but the party never threatened the structure of the kingdom. Tony Blair heads the least socialist, least redistributive Labour government ever. Yet at the same time he has used the legally unchecked powers of a House of Commons majority to enact the most revolutionary changes in the British constitution since the Civil War of the 1640s.

It still isn’t clear whether the Blair government sees its steady stream of attacks on the old order’s structure and accouterments as a clever and harmless way of distracting its genuinely socialist members and supporters from their fiscal conservatism, or whether they actually know that traditions and rituals are rather more important than marginal tax rates when it comes to destroying the old United Kingdom they despise.

Because the reforms, enacted swiftly and without serious debate, were intended mostly to proclaim the new government’s difference from the Tories, they followed no consistent theory. Scotland and Wales got separate parliaments but continue to send MPs to Westminster where they make laws for the English (some 80 percent of the population) who do not have their own separate parliament.

Of course, it never occurred to the Blairites — who see themselves as technocrats above primitive feelings of attachment to nation or any community other than their own cosmopolitan class — that by tossing bones to the Welsh and Scots nationalist minorities they might awaken the long slumbering beast of English nationalism. These people have lived so long under the protection of an inclusive British nationalism that they couldn’t imagine that English nationalism, fed by growing submission to Europe and the unfair favoring of Scotland, would of necessity be racial and resentful. When a few old souls mentioned the danger of awakening nationalisms after centuries of peace and comity, they were laughed at by the Blairites. Now you see all over England the red cross of St. George, a symbol from the medieval past that spontaneously appeared in the hands of soccer fans and on the dashboards of London taxicabs. It’s enough to make Hitchens warn of “interesting times” ahead — in the scary sense of “interesting.” As he says, “When a people cease to believe their national myths and cease to know or respect their history, it does not follow that they become blandly smiling internationalists. Far from it.”

Of course, you can detect in the Blair generation’s discomfort with Britain’s past an element of envy and insecurity. It cannot be easy for middle-aged Britons to look back on the achievements of their fathers and grandfathers (who defeated Hitler and the Kaiser), or, worse still, those of their great grandfathers (who brought peace and prosperity to millions around the globe), without wishing to denigrate those achievements.

But if you want to understand why a significant chunk of the British population loathes Britain and wants to undo it, you have to look beyond generational resentment to class. An acquaintance of mine was on his way to a party for the fiftieth anniversary of VE day in 1995 when he bumped into Jon Snow, a well-known British broadcaster and fairly typical figure of the new British establishment. He asked Snow if he too were going to a VE celebration. Snow sneered back that he was going to “an anti-VE day party.” Not for him any of that jingoistic nostalgia for World War II.

As Orwell pointed out, the English intelligentsia has always been severed from the common culture of the country. But in the 1930s, the intellectuals were joined in their alienation by a significant number of mandarins, upper- and upper-middle-class civil servants, who responded to democratization and the simultaneous decline of British influence by deciding that their country would be better off ruled by Nazi Germany or the Soviet Union.

The modern equivalent is to transfer one’s allegiance to the “European ideal,” which means, in practice, rule by the smooth bureaucrats of Brussels. For the remnants of the mandarin class, there’s something comforting in the idea that Britain and Europe can be run by a sophisticated international elite — made up of chaps not unlike themselves.

“Europe” also solves a status problem for the new public-sector middle class. Unlike the treacherous mandarins, these people have not lost position; they never had it. They therefore define themselves as being more “civilized” than the country-house toffs above them and the bigoted proles below. And they take to an extreme the retardataire notion that everything is done better on the Continent. The basic idea is that if you are the kind of person sophisticated enough to appreciate wine and cappuccino — rather than beer and tea — then, of course, you must favor the transfer of sovereignty from Britain to Brussels.

There are good reasons for Americans to study Peter Hitchens’s The Abolition of Britain. It won’t be a good thing for America if British PC multiculturalists manage to discredit the parent culture of the United States. More important, however, is the lesson about the fragility of culture that Americans should take from this book. In his famous essay “England, Your England,” George Orwell wrote, “It needs some very great disaster, such as prolonged subjugation by a foreign enemy, to destroy a national culture.” But reading Hitchens you soon realize that Orwell was wrong: A culture can be destroyed from the inside, as well.

 

American soldiers really aren’t spoilt, trigger-happy yokels (Daily Telegraph 25 July 2003) http://jonathanforeman.info/american-soldiers-really-arent-spoilt-trigger-happy-yokels-daily-telegraph-25-july-2003/

Whether the deaths of Uday and Qusay Hussein were self-inflicted or not, the military operation to capture them was immaculate. There were no American deaths, 10 minutes of warnings were given over loudspeakers, and it was the Iraqis who opened fire. So sensitive was the American approach, they even rang the bell of the house before entering.

The neat operation fits squarely with the tenor of the whole American campaign, contrary to the popular negative depiction of America’s armed forces: that they are spoilt, well-equipped, steroid-pumped, crudely patriotic yokels who are trigger-happy yet cowardly in their application of overwhelming force.

And, unlike our chaps, none of them is supposed to have the slightest clue about Northern Ireland-style “peacekeeping”: never leaving their vehicles to go on foot patrols, never attempting to win hearts and minds by engaging with local communities and, of course, never removing their helmets, sunglasses and body armour to appear more human.

As a British journalist working for an American newspaper, who was embedded with American troops before, during and after the conquest of Saddam Hussein’s Iraq, I know this is all way off the mark; a collection of myths coloured by prejudice, fed by Hollywood’s tendentious depictions of Vietnam (fought by a very different US Army to today’s) and by memories of the Second World War.

The American soldiers I met were disciplined professionals. Many of them had extensive experience of peacekeeping in Kosovo and Bosnia and had worked alongside (or even been trained by) British troops. Thoughtful, mature for their years, and astonishingly racially integrated, they bore little resemblance to the disgruntled draftees in Platoon or Apocalypse Now.

Yes, American troops wear their helmets and armour even though removing them might ease local relations. But it’s easy to forget that British troops in Northern Ireland have very often worn helmets when patrolling unfriendly areas. And the disaster that took the lives of six Royal Military Police officers in Majar may indicate that American caution – whether it means wearing body armour, or ensuring that soldiers have sufficient back-up or are always in radio contact with headquarters – isn’t so foolish.

And it’s simply not true that the Americans don’t patrol at all, patrol only in tanks or never get out of their vehicles. I accompanied foot patrols in Baghdad as early as April 13, only days after Saddam’s presidential palace was taken. The unit carrying out these patrols was also assigned to escort SAS troopers around the city. The SAS men told me how impressed they were, not just with the Americans’ willingness to learn from them, but with their training and self-control.

The idea that American troops are lavishly equipped is also a myth, a fantasy bred out of resentment of American wealth in general. The battalion in which I was first embedded came to war in creaky, Vietnam-vintage M113 armoured personnel carriers, which frequently broke down in the desert.

The battalion fought in green heavyweight fatigues because the desert camouflage ones never arrived. And, though a shipment of desert boots turned up just before the invasion, many were the wrong size, so that these GIs had to make do with black leather clompers designed for northern Europe in December. Perhaps most resented by the troops, they were not issued with bullet-resistant vests, only flak jackets, making them vulnerable to small-arms fire.

Another myth is that the Americans are softies who live and fight in amazing, air-conditioned comfort. The truth is that the GIs encamped in and outside palaces and Ba’ath party mansions lack not only air-conditioning but also running water, unlike most of the population they guard.

And, unlike their British counterparts, they have no communication with their families at home. Many British troops are able to use the “e-bluey” system to email their loved ones on a frequent basis. The only times most GIs in Iraq ever get to let their spouses know they are well is if a passing journalist lets them have a couple of minutes on the Satphone.

And I remember what a thrill it was when I got my hands on a British ration box after nearly three months on American MREs (meals ready to eat). GIs bored of endless variations upon chilli and macaroni were amazed to find that British rations included things such as chicken and herb pâté. And they were willing to trade everything from boots to whole cases of their own rations to get some.

Though the US Army lacks our regimental system, different American divisions vary greatly in culture and experience. The Third Infantry Division – the unit that reached Baghdad first and took the city in a feat of great boldness – has been kept in Iraq because its soldiers are clearly better than newcomers at the difficult task of winning hearts and minds in a newly conquered country.

You could see this in the way the tank commander, Captain Philip Wolford, broke the rules and walked around the area his company controlled, alone and bare-headed, chatting with the locals and organising food, medical care and even employment. I wish that more British reporters had gone into the streets with 3ID men such as Sgt Darren Swain, a no-nonsense soldier from Alabama who is loved in the Baghdad area his men call “Swainsville” because, off his own bat, he takes humvees out every morning to provide security at local schools.

More recently, American soldiers have been charged with the sensitive task of searching those who enter the Palace district of Baghdad. One Shi’ite mullah felt it a great dishonour to be searched. The soldier responsible, Captain Wolford, agreed to take him round the back of the building and search him in private. Once there, the mullah agreed to be searched. Captain Wolford refused then to search him – the agreement to comply was enough. The gentlemanly approach much pleased the mullah.

It is because of this kind of sensitivity that the Americans have slowly and quietly achieved the intelligence triumph that led to the discovery and killing of the sons of Saddam Hussein.

 Jonathan Foreman writes for the New York Post 

http://www.telegraph.co.uk/comment/personal-view/3594199/American-soldiers-really-arent-spoilt-trigger-happy-yokels.html

Scorsese’s “Gangs of New York” Distorts History (Daily Telegraph 15 Jan 2003) http://jonathanforeman.info/scorseses-gangs-of-new-york-distorts-history-daily-telegraph-15-jan-2003/

Scorsese’s film portrays racist mass murderers as victims

Martin Scorsese is rightly the most lauded living American film-maker – a beacon of integrity as well as a brilliant talent. But his bloody, visually gorgeous new epic, Gangs of New York, set in Civil War-era Manhattan, distorts history at least as egregiously as The Patriot, Braveheart or the recent remake of The Four Feathers. In its confused way, it puts even the revisionism of Oliver Stone to shame.

The film works so hard to make mid-19th-century Irish-American street gang members into politically correct modern heroes (and to fit them into Scorsese’s view of American history as one long ethnic rumble) that it radically distorts a great and terrible historical episode.

It treats the founding Anglo-Saxon Protestant culture of America with an ignorant contempt – where it doesn’t cleanse it from history altogether. Generally speaking, Hollywood sees that culture not as the root of treasured democratic freedoms, but as a fount of snobbery and dreary conformism. The paradoxical result of this Hollywood faux-Leftism is that the movie ends up casually glossing over the suffering of black Americans.

Gangs begins with a brutal battle in 1846 between two armies – “natives” (presumably Protestant) and immigrant Irish Catholics – for the control of the lawless Five Points area of Lower Manhattan. The leader of the Irish (Liam Neeson) is slain before the eyes of his five-year-old son by the Natives’ leader, “Bill the Butcher” (a superb Daniel Day-Lewis). The son grows up to be Leonardo DiCaprio, a tough youth who comes back to the neighbourhood 16 years later determined to avenge his father’s death.

By 1862, Bill the Butcher has now incorporated many of the Irish thugs – including DiCaprio – into his own criminal organisation. Eventually he comes to see the boy almost as the son he never had. When the time comes for the two of them to square off, with DiCaprio in charge of the reborn “Dead Rabbits” gang, the Civil War is casting its shadow over the city with the 1863 Draft Riots.

These began with assaults on police by Irish immigrants enraged by Lincoln’s conscription order on July 11, 1863. Very quickly, they turned into a monstrous pogrom, with a 50,000-strong mob murdering and mutilating every black they could find.

The Coloured Orphans’ Asylum was set on fire, followed by several black churches and the Anglican mission in Five Points. The city’s small German Jewish population was also attacked. Panicked blacks fled to the safety of British and French vessels at anchor in the East and Hudson rivers. Many drowned. Those who were caught were often tortured and castrated before they were killed.

In the film, you don’t see any of this. Instead, a voice-over quoting from telegraph reports briefly mentions some of the mob’s racist violence. What you do see is the suppression of the riot: blue-clad troops massacring crudely armed civilians of all ages and both sexes. The rioters stand almost impassive, and are cut down by gunfire and mortar shells lobbed from warships in the harbour (a bombardment wholly invented by the film-makers).

The film’s narrator claims – and it’s a flat-out lie – that the mob was a multi-ethnic uprising of the city’s poor, that Germans and Poles joined with the Irish immigrants against New York’s epicene patricians and an unjust conscription policy that allowed the wealthy to buy their way out of military service for $300. In fact the city’s 120,000 German immigrants, many of them Catholics, took no part in the riots, there were almost no Poles living in the city and the rioters were almost entirely Irish.

They were furious with the city’s blacks because its free negroes were often skilled artisans, economically and socially a rung or two above the newly arrived Irish, many of whom didn’t speak English.

Yet the film consistently portrays the “nativist” Yankees, led by Daniel Day-Lewis’s Bill the Butcher, as racists and the Irish underclass criminals, led by Leonardo DiCaprio, as multiculturalists avant la lettre.

The film’s misrepresentation of the “natives” begins early on. While the film’s Irish Catholics have a vibrant, energetic culture, the “native” Americans merely have prejudice. And you would never know that New York’s population included substantial numbers of Orange Ulstermen – a hundred people were killed in New York’s Orange-Green rioting as late as July 12, 1871.

Nor would you know from Scorsese’s depiction that Yankees – Northern Americans of English, Scottish, Welsh and Dutch extraction – increasingly thought that they were fighting the Civil War to abolish slavery. In the words of their favourite battle hymn, Jesus died to make men holy, and they would die to make men free.

The ending of slavery isn’t on Scorsese’s map, because its inclusion would be too difficult: it would require honesty and courage to reveal that his heroes – the Celtic predecessors of today’s beloved mafia – were on the wrong side of the most significant moral and political struggle in America’s history.

Nor would you know that many Irish volunteers fought with spectacular bravery on behalf of the union. Instead, everyone villainous in Gangs of New York is either a white Anglo-Saxon Protestant or an Irish Catholic who has sold out to WASPs.

There’s something bizarre about glorifying a subculture that fought to undermine Lincoln’s war to preserve the union and end slavery. Scorsese is treating racist mass murderers as heroes and victims. Yes, the Irish were cruelly abused in their adopted country. But it’s a strange modern fetish that assumes that victims cannot also be victimisers.

If, as the ad copy goes, “America was born in the streets”, it was not in the squalid, savage turf struggles of the Five Points, but in the streets of Boston and Lexington in 1775 – where the people traduced here as having no identity or qualities outside their xenophobia fought for the liberties that all modern Americans take for granted.

  • Jonathan Foreman is film critic of the New York Post

http://www.telegraph.co.uk/comment/personal-view/3586345/Scorseses-film-portrays-racist-mass-murderers-as-victims.html

“Gladiator” Review (NYPost May 5, 2000) http://jonathanforeman.info/gladiator-review-nypost-may-5-2000/

Gladiator Kicks Butt

MORE than just a welcome revival of the toga movie – a genre dead for more than 30 years, if you don’t count Bob Guccione’s gamy “Caligula” – “Gladiator” is an exhilarating, sweeping epic that begs to be seen on the largest possible screen.

At times it’s surprisingly languorous for a modern actioner. But it also boasts some of the most exciting pre-gunpowder combat sequences ever: Not only are the battles in “Gladiator” superior to – and more realistic than – anything in “Braveheart,” they’re equal in excitement to the classic arena contests in “Ben Hur” and “Spartacus.”

They’re so gripping, in fact, that they’re disturbing: Long before the final duel, you find yourself cheering as wildly as the bloodthirsty Colosseum crowd.

Directed by Ridley Scott (“Alien,” “Blade Runner”), “Gladiator” also features breathtaking photography, sets and computer-generated images.

But the real glory of the movie is Russell Crowe, who is simply magnificent as a mythical Roman general turned gladiator. Like James Mason, he is one of those actors who can make the lamest line (and like its sword-and-sandal predecessors, “Gladiator” has some clunkers) sound like Shakespeare.

“Gladiator” opens on the empire’s wintry, forested northern frontier, with Maximus (Crowe) leading his legions against the ferocious German hordes. In a stunning battle sequence, clearly influenced by “Saving Private Ryan,” Maximus routs the last threat to Rome’s domination of Europe, as the ailing Emperor Marcus Aurelius (Richard Harris) looks on.

The emperor offers him supreme power; Maximus says he would rather retire to his farm in Spain. But before he can make up his mind, Commodus (Joaquin Phoenix), the emperor’s son, who is visiting the front with his sister Lucilla (Connie Nielsen), murders Aurelius and assumes the purple.

Commodus immediately arranges to have Maximus killed. The general escapes this fate but finds disaster at home before being captured by slave traders. Taken to North Africa, Maximus is sold to the gladiatorial impresario Proximo (the late Oliver Reed, as rascally and charming as ever in his final role).

Initially reluctant to fight, Maximus proves to be an extraordinarily deadly gladiator. Accordingly, Proximo brings him to Rome to compete in games sponsored by the sports-mad Commodus.

“Gladiator” draws heavily on its ’60s ancestors, but unlike them it contains no Christian message, and, more surprisingly, no sex.

Scott fills the movie with visual allusions to his own work as well as to “Spartacus” and even “Apocalypse Now.” There are also some arty indulgences, including Maximus’ bleached-out visions of his own death, shots of speeded-up clouds scudding over the desert, and black-and-white parade scenes that are clearly intended to evoke both Nazi-era Berlin and “Triumph of the Will.”

However, there are no silly anachronisms – apart from an attempt to give the drama a modern political dimension. Periodically the characters spout historical absurdities about “a dream that was Rome” and “giving power back to the people” as if screenwriters David Franzoni, John Logan and William Nicholson were recycling Princess Leia’s lines from “Star Wars.”

Ancient-history buffs might also quarrel with military details. The Romans didn’t use artillery except in sieges, for example, and employed their swords for stabbing, not slashing. Nor could they engage in cavalry charges, because the stirrup hadn’t yet made it to Europe.

——

GLADIATOR  3½

Ridley Scott’s revival of the sword-and-sandal epic is a spectacular triumph, with sensational battle scenes and a terrific performance by Russell Crowe. First-class entertainment, it’s marred only by slow sections, occasionally leaden dialogue and some indulgently arty dream sequences. Running Time: 150 minutes. Rated R. At the Lincoln Square, the Ziegfeld, the Kips Bay, others.

RIP Candida Royalle http://jonathanforeman.info/rip-candida-royalle/ Fri, 11 Sep 2015 13:49:07 +0000

So sad to hear of the death of my friend Candace Vadala, who under the name Candida Royalle (her website is here) was a pioneering maker of erotic films for women and a feminist fighter for free expression, having been a pornographic film actress in the 1970s. For the last few years she’d been working on a documentary called “While You Were Gone” about her abandonment as a child and her search for her birth mother. She was a very brave woman with a wonderful sense of humor, and so full of life it’s hard to believe she’s gone.

The Timothy Hunt Witch Hunt (Commentary Sept. 2015) http://jonathanforeman.info/the-timothy-hunt-witch-hunt-commentary-sept-2015/ Wed, 09 Sep 2015 17:39:47 +0000

What Really Happened in the Tim Hunt Affair and Why It Matters

In 1983, the British biochemist Timothy Hunt discovered cyclins, a family of proteins that help regulate the life of cells. Eighteen years later, in 2001, he was awarded the Nobel Prize in Physiology or Medicine. Between June 8 and June 10 of this year, the 72-year-old Hunt went from being a universally respected and even beloved figure at the top of the scientific establishment to an instant pariah, condemned everywhere for antiquated opinions about women’s role in science that he does not, in fact, hold.

In only 48 hours, he found himself compelled to resign his positions at University College London and at the august Royal Society (where Isaac Newton and Robert Hooke once fought petty battles) after being told that failure to do so would lead to his outright firing.

The Timothy Hunt affair represents more than the gratuitous eye-blink ruination of a great man’s reputation and career. It demonstrates the danger of the extraordinary, almost worshipful deference that academia, government institutions, and above all the mainstream media now accord to social media. It is yet more evidence of the way moral panic and (virtual) mob rule can be accelerated and intensified by the minimalism of Twitter, with its 140-character posts and its apparently inherent tendency to encourage snap judgments, prejudice, and cruelty.

Fortunately, the story did not end on June 10. In the weeks following the initial assault, some of Hunt’s most ardent persecutors have been exposed as liars or blinkered ideologues, abetted by cynical hacks and academic rivals on a quest to bring him down or use him as grist to a political mill. Hunt’s partial rehabilitation has largely come about thanks to the dogged investigations of Louise Mensch, the British novelist and former conservative member of parliament who lives in New York City and is herself a powerful presence on Twitter. Mensch was alarmed by what she calls “the ugly combination of bullying and sanctimony” in the reaction to remarks made by “an evidently sweet and kind” older man.

She did some checking on Twitter and soon found that the two main witnesses for the prosecution contradicted each other. Then she began a more thorough investigation of Hunt’s offending comments and the lack of due process involved in his punishment by various academic and media institutions. The results of her exhaustive research, published on her blog, Unfashionista.com, encouraged an existing groundswell of support for Hunt from scientists around the world but most important from Hunt’s own female colleagues and former students.

As a result, the false picture of Hunt as a misogynist opposed to the equal participation of women in science has mostly been dispelled. Hunt, who is married to a distinguished immunologist named Mary Collins, has ceased being the science academy’s equivalent of George Orwell’s Emmanuel Goldstein—the object of the Two Minutes Hate in 1984—on Twitter. Indeed, one of Britain’s most respected female scientists, Dame Athene Donald, Master of Churchill College, Cambridge, has publicly lamented the wrecking of Hunt’s reputation by “sloppy journalism fueled by self-righteous fervor.”

Nevertheless, various senior figures continue to insist that, whether or not Hunt’s remarks were jokes or correctly reported, he is deservedly a symbol of the sexism that allegedly pervades science. At the time of this writing, moreover, he has not been restored to the positions from which he was expelled or forced to resign.

On June 8, Hunt was in Seoul to give the opening lecture at the World Conference of Science Journalists. He was also invited to give an informal toast at a luncheon sponsored by the Korea Federation of Women’s Science and Technology Associations. It was this toast—or rather the way it was reported and reacted to—that led to his disgrace.

Speaking for fewer than five minutes, Hunt praised female scientists with whom he has worked, and then he said this:

It’s strange that a chauvinist monster like me has been asked to speak to women scientists. Let me tell you about my trouble with girls. Three things happen when they are in the lab: You fall in love with them, they fall in love with you, and when you criticize them, they cry. Perhaps we should make separate labs for boys and girls.

It is not clear whether Hunt had already mentioned that he and his wife met and fell in love when they were working in his lab, or whether he assumed that everyone in the room was aware of this fact and therefore the context of the remark. Hunt continued: “Now seriously, I’m impressed by the economic development of Korea. And women scientists played, without doubt, an important role in it. Science needs women, and you should do science despite the obstacles and despite monsters like me!”

A few hours after the lunch, a British science journalist named Connie St. Louis sent out a tweet to her followers that read:

Nobel scientist Tim Hunt FRS says at Korean women lunch “I’m a chauvinist and keep ‘girls’ single lab.

Beneath the tweet was a photograph of Hunt and more text by St. Louis: “lunch today sponsored by powerful role model Korean female scientists and engineers. Utterly ruined by sexist speaker Tim Hunt FRS.” (The FRS stands for “Fellow of the Royal Society.”) She went on to give an account of the “trouble with girls” speech that left out his “now seriously” verbal transition and praise of women in science and implied that Hunt was seriously advocating sex-segregated labs.

Shared more than 600 times, the St. Louis tweet ignited a combined Internet, social-media, and then print-media firestorm with astonishing speed. Her observations were repeated in news bulletins across the world. But as has happened before when such Twitter posses gather,1 Hunt himself became aware of it only when the BBC called him as he was about to board a plane to London.

While he was on the flight, the dean of life sciences at University College, London, telephoned his wife—herself a full professor at the school—to say that if Hunt did not immediately resign, he would be fired. No one at University College had even tried to get his side of the story or any independent confirmation of the incident described by Connie St. Louis. On the contrary, two of Hunt’s colleagues had started lobbying against him as soon as they saw the tweets. One of them, Dorothy Bishop, sent this message to the Dean on June 9: “Could we ask that he not be on any appointments or promotions committee given his views.” Another, David Colquhoun, started a Twitter hashtag called #Huntgate and called for Hunt to be expelled from the Royal Society as well as University College. And in short order Hunt was indeed made to resign from the Royal Society’s awards committee and the European Research Council.

Although St. Louis was the primary author of Hunt’s destruction, she had a pair of allies with whom she apparently plotted his takedown while in Seoul.2 They were her friends Deborah Blum and Ivan Oransky. Blum, a professor of journalism at the Massachusetts Institute of Technology and occasional New York Times columnist, took to Twitter right away to back up her old friend, insisting that Hunt never praised women in science during his toast, that he was not joking when calling for segregated labs, and that his remarks had caused great offense to his hosts.

The first website stories about Hunt’s alleged faux pas appeared on June 9. All of them were based on St. Louis’s tweets; none included a response from Hunt himself or comments from the organizers of the event. A Google news headline proclaimed: “Nobel prizewinner Tim Hunt says women should be banned from labs.” Some of the most influential stories modified his reported words to make them sound worse. One such piece by Brandy Zadrozny in the Daily Beast was entitled “Nobel Prize-Winning Biologist Calls Women Love-Hungry Crybabies.” It began: “Lady scientists: they’re always falling in love and crying about it. Amiright? So says important man of science, knighted and Nobel Prize–winning biologist Sir Tim Hunt.”

According to Zadrozny, Hunt’s words were symptomatic of a wider problem: “The biologist who called female scientists ‘girls’ who fell in love with him then berated them for crying too much isn’t an outlier. For females in the science world, sexism is the norm.”

Neither Zadrozny nor her editors at the Beast seem to have noticed that Hunt had spoken of male scientists as “boys” in the same passage, rather undermining the notion that his use of the word “girls” was prima facie evidence of sexism.

Buzzfeed ran a story the same day entitled “Nobel prizewinner makes shockingly sexist remarks at journalist meeting.” The writer, Cat Ferguson, reported that Hunt had said that “labs should be segregated by sex.”

Both Ferguson and Zadrozny added a new element to the case against Hunt, claiming that he had also condescendingly thanked women scientists for “making the lunch.” St. Louis later repeated this additional charge in an interview with the BBC. But it was eventually revealed, thanks to the efforts of Louise Mensch, that Hunt never said anything of the kind. In fact the allegedly offensive expression of gratitude had been delivered by a leading Korean—female—politician who stood up before Hunt.

Like most of the science journalists who covered Hunt’s solecism, Zadrozny and Ferguson were content to rely on a handful of tweets as the only evidence in an obviously controversial story. Sadly, the Hunt affair provides ample ammunition for those who believe Internet reporters are a tribe of third-raters with little in the way of ethical standards or training in Journalism 101.

But there’s another explanation for the fact that reporters such as Zadrozny and Ferguson felt no obligation to verify the facts of the case or do any old-fashioned reporting. In their cases, the temptation to cut journalistic corners may have been overwhelming. That’s because for anyone with an ax to grind about gender equality or sexism in science, this was one of those stories that the tabloids used to label (jestingly for the most part) “too good to check.”

For politically committed editors and reporters, a story that is too good to check is one that perfectly confirms their suspicions and prejudices about those they consider the enemy. It’s a phenomenon that exists on the right as well as the left—as evidenced by the bizarre stories in the 1990s claiming that the Clintons were drug smugglers and murderers. Last year’s invented story in Rolling Stone about a nonexistent gang rape at a University of Virginia fraternity was a particularly troubling modern version of it. That’s because in that case the story that editors didn’t want to check had been fabricated by a writer-activist who believed it was OK to collaborate with an obviously unreliable source on a story if that story “proved” the existence of a social ill she believed existed.

It’s possible that similar motivations inspired St. Louis’s misreporting of Hunt’s case. It is very likely that they were behind the subsequent trumpeting of her claims by the New York Times science columnist Deborah Blum and a chorus of committed journalists and academics. Such motivations explain the general refusal by members of that chorus to admit they were wrong even when confronted by the evidence.

Blum actually wrote: “The real point isn’t about individuals, isn’t about Tim Hunt…The real point is that telling a roomful of female scientists that they aren’t really welcome in a male-run laboratory is the sound of a slamming door. The real point is that to pry open that door means change. And change is hard, uncomfortable, and necessary.” Quite apart from Blum’s dishonesty—Hunt didn’t say anything of the sort, and the room was full of science writers, not scientists—the ruthlessness of the statement is astonishing. For Blum, Hunt is a necessary sacrifice, an egg that needs to be broken for the cause.

But you still have to wonder, why did so many academics, as well as journalists and activists, believe her? In some cases it may have reflected a kind of confirmation bias. The bien-pensant are convinced that out there are many unenlightened people, the worst of whom are older white men, who brim with appalling reactionary prejudices.

You might have expected St. Louis and Blum, or the online journalists who took up their inflammatory reports, or the Twitterati who went into a frenzy of condemnation, to do a bit of research, to dig into Hunt’s history and find more evidence of his supposed misogyny. After all, someone capable of calling for sex-segregated labs had presumably given other hostages to fortune. Apparently none of them did so; a single “sexist” remark was evidently sufficient for conviction in the court of social media. But had they done so, they would have found that Hunt’s actions throughout his career don’t match the profile of a misogynist or even a sexist. Quite the contrary.

It’s not just that Hunt is married to a senior female biologist who is also a leading advocate for more opportunities for women in the sciences. He is also well known in the scientific community as a lifelong supporter and mentor of female scientists.

If that weren’t enough, for the past five years Hunt has actually been helping the European Research Council develop its “gender-equity plan.” He’s been such a devoted and longtime supporter of women in science that, according to Mensch, he had a day-care nursery installed at the Okinawa Institute and tried unsuccessfully to do the same thing at London’s Crick Institute—one of the institutions that quickly distanced itself from his supposedly sexist remarks.

Indeed, by the end of June, Hunt’s lifelong support for women in science was evident from a stream of tweets, blog posts, and letters to the press from female colleagues and former students. Maria Leptin, of the European Molecular Biology Organization, tweeted: “Tim Hunt was in charge as council chair and member of selection board that appointed the first female EMBO director (full disclosure: me).” Oxford’s Dr. Trisha Greenhalgh: “People who know and have worked with Tim are behind him, those who went on hearsay concluded ‘sexist.’”

One of the most powerful defenses came from Professor Hyunsook Lee of Seoul University, who wrote to the Times of London: “I have known Tim Hunt for more than 15 years, ever since he examined my thesis for a PhD. During those years he could not have been more supportive…he never treated me as a ‘female scientist’ but as a ‘scientist.’ In the scientific community…you sometimes get the feeling that you are being treated as a female. I never had this uneasy feeling from him. I learned a great deal from him and his attitude to science and he will continue to be my mentor.”

Some of this might have emerged at the outset if major mainstream news organizations, such as the BBC and the New York Times, had been more professional than the Buzzfeed and Daily Beast reporters. They weren’t.

The report by the Times’s London correspondent Dan Bilefsky repeated, as fact, Connie St. Louis’s claim that Hunt’s remarks were received in “stony silence.” Not only was this disputed very quickly by other guests at the lunch, it was disproved beyond reasonable doubt by a recording of the event that was released in July on Mensch’s website. Bilefsky also stated as fact that Hunt argued that “female scientists should be segregated from male colleagues.”

The BBC played an especially important role in turning Hunt into a hated figure. Anyone who still thinks that the BBC’s reportorial standards have not precipitately declined should read Mensch’s detailed accounting of the misquotes and falsehoods about Hunt that the network broadcast on radio and television around the world. This was especially true, and especially damaging, in the case of the flagship Radio 4 Today program, the All Things Considered of Great Britain.

It is telling that none of these mainstream organizations reported or seem to have known that, on the day after Hunt’s talk, the president of the European Research Council (the delightfully named Jean-Pierre Bourguignon) had issued a statement in defense of Hunt and his record as a supporter of gender balance. He would have been an obvious person to ask for a comment.

Any of these organizations, moreover, if they had bothered, could have found convincing evidence on Twitter itself that contradicted St. Louis’s claims. Mensch quickly discovered that someone had actually tweeted about Tim Hunt during the luncheon—a science reporter from the Philippines named Shai Panela. She wrote: “Nobel Laureate Tim Hunt acknowledging the contribution of female science journalists.”

Mensch also turned up a shocked response to St. Louis’s claims by another science writer at the luncheon: Russia’s Nataliya Demina. “Everybody who heard T. Hunt’s speech yesterday knew he was joking,” she tweeted. “For those who not: guys where is u sense of humor?”

A third participant, a female assistant editor from Malaysia named Tan Shiow Chin, flatly contradicted claims by St. Louis and Blum that Hunt had advocated single-sex labs. She recalled Hunt’s remarking in his toast that “men would be the worse off for it.”

Eight other Nobel laureates came out in his support. Twenty-nine colleagues wrote a joint letter to the London Times calling for him to be reinstated at both UCL and the European Research Council; the signatories said they had been “shocked to witness the attacks made by commentators who have never met Tim.” From the United States, Cornell’s David Collum and NYU’s Nassim Nicholas Taleb added their voices, prompting Sir Paul Nurse, the president of the Royal Society, to come out in defense of Hunt and tell the BBC that he should never have been sacked.

By the end of June, with St. Louis’s claims in tatters but with the refusal of University College, London, to change its stance, the broadcaster Jonathan Dimbleby resigned from his honorary fellowship there, and a well-known author publicly dropped it from his will.

Almost immediately St. Louis protected herself and her dishonesty against investigation by asserting her victimhood. “Women are vulnerable to vicious trolling on Twitter,” she told Scientific American, “and black women doubly so.”

St. Louis also went on the offensive. This included an article for the Guardian entitled “Stop Defending Tim Hunt.” In it she insisted that Hunt never said “now seriously” after his segregated-lab joke. “Nor did he praise the role of women in science and Korean society.” She even repeated the allegation that Hunt thanked the women journalists present for making lunch. Multiple witnesses have now come forward to confirm that St. Louis lied about these just as she lied to the BBC about Hunt’s speech being greeted with “stony silence.”

One of the final nails in the coffin of her credibility came when it turned out that a European Commission official had been at the luncheon and taken notes. His report was suppressed by the commission (in traditional Brussels fashion), but a leaked copy made it clear that Hunt had been joking and that the final words of his toast had been these: “Science needs women, and you should do science despite all the obstacles, and despite monsters like me.”

The unnamed EU official not only says that he did not detect “any awkwardness in the room as reported on social and then mainstream media” but that one of the Korean organizers of the conference spontaneously told him how impressed she was that “Sir Tim could improvise such a warm and funny speech” at such short notice.

Bourguignon confirmed the validity of the leaked report, adding that he himself had spoken to other Korean hosts who confirmed the warm reception given to Hunt’s speech and his praise of female scientists.

As Richard Dawkins, one of Hunt’s early supporters, put it, the leak proved that Hunt’s remark “was lighthearted banter against himself, his irony clearly (not clearly enough, alas) indicating that he is really the reverse of a ‘chauvinist monster.’” Dawkins also expressed the hope that Hunt would receive apologies from UCL, Nature magazine, and “other quarters where they should know better.” At the time of this writing, however, only one of the many publications and science writers who tore into Hunt’s alleged sexism has recanted and apologized: David Kroll of Forbes.

The coup de grâce came in July with Mensch’s release of a short recording from the luncheon. One can clearly hear applause and laughter in the room as Hunt ends his speech. Apparently out of a hundred guests from around the world, most of them women, the only people who were offended by Hunt’s remarks were a handful of British and American science writers, all of whom happen to be diversity obsessives.

The most generous interpretation of Connie St. Louis’s bizarre behavior is that she was too intellectually limited to recognize irony that was somehow obvious to an audience composed mostly of people who spoke English as a second language. A leak of the unedited version of her “Stop Defending Tim Hunt” piece for the Guardian is so garbled and incoherent that this actually seems plausible, though it also makes you wonder how and why she came to be teaching journalism even at a third-rate institution like London’s City University.

That’s a question that began to be asked quite widely a few weeks after St. Louis sent her tweets and became a celebrity on the back of her denunciations of Hunt. The Daily Mail discovered that St. Louis had lied on the curriculum vitae she had supplied for the City University website. The CV claims that she is “an award-winning freelance broadcaster, journalist, writer, and scientist” who “writes for numerous outlets, including the Independent, Daily Mail, the Guardian, the Sunday Times…” But when the Mail’s Guy Adams went through 20 years of digital archives for the Independent, the Sunday Times, and the Mail, he could find no articles carrying her byline. Before the current scandal, her work for the Guardian had been limited to a single piece in 2013. Oddly, the BBC and the Guardian have yet to report either this evidence bearing on her credibility or the abundant evidence contradicting her claims about Hunt’s speech in Seoul.

Hunt himself was manifestly ill equipped to deal with the onslaught. In an apparent state of confusion and demoralization, he apologized as soon as the BBC contacted him, and again in subsequent interviews and statements. But he worded that first apology in such a way that allowed malevolent, dishonest critics to claim that he had admitted to all of St. Louis’s charges. The key phrase in the apology that Deborah Blum and others used against him was “I was trying to be honest.” This was clearly a reference to the fact that he himself had fallen in love with his wife while they were working together. Hunt later explained in a statement to the Guardian, “I certainly did not mean to demean women but rather be honest about my own shortcomings.” Nevertheless the sentence was cited to claim that Hunt really is in favor of sex-segregated labs.

But even if Hunt had been more media-savvy, he still would have faced a heavily stacked deck. After that first apology, one BBC headline read, “Scientist Tim Hunt responds to criticism of ‘girls in labs’ comments,” even though he had never used the phrase “girls in labs.”

In another interview, Hunt regretted his “stupid and ill-judged remarks.” His remarks were indeed ill-judged but mainly because of where and to whom they were delivered. Hunt was presumably used to talking to friendly audiences of scientists, academics, and protégés. His attitude to public speaking and toast-giving was formed in environments where one could assume discretion and the liberty to speak freely. But anyone who speaks to an audience composed largely of journalists, let alone bloggers or “journalism professors,” needs to choose his words carefully lest he fall victim to someone looking for a high-profile scalp.

Still, things might have been even worse if Hunt and his wife had not given an extensive joint interview to the Observer, the Sunday paper now owned by the Guardian. Conducted the weekend after the incident, it gave him the chance to point out the injustice of his situation. He said UCL had “hung me out to dry” and noted that his accusers at the university “haven’t even asked for my side of affairs.” More important, the interview gave his wife, the less unworldly Professor Collins, an opportunity to rebut the caricature of Hunt as a male chauvinist. “He is certainly not an old dinosaur,” she told the paper. “He just says silly things now and again….I’m a feminist; I would not have put up with him if he were a sexist.” She added, for good measure, that he does all the shopping and cooking for the family.

For all his naiveté, and despite the support he was beginning to receive, Hunt knew that his professional life was over. “I’m finished,” he said in the interview. “I had hoped to do a lot more to help promote science…but I cannot see how that can happen. I have become toxic.” Unfortunately, this is indeed the case. Since his comments came to light, Hunt has been disinvited from major scientific and medical conferences. As Dame Athene Donald wrote: “His ability to go and inspire the young has been unnecessarily destroyed.”

At the time of this writing, Hunt has not been reinstated as an honorary professor at University College, London. Nor is he likely to be. Provost Michael Arthur, as if keen to demonstrate the cowardice and lack of intellectual integrity he and so many others confuse with political virtue and good public relations, recently told the press that to reinstate Hunt would send out “entirely the wrong signal.”

It’s worth remembering that University College, London, an institution founded in 1826 by the philosopher Jeremy Bentham, has become notorious in recent years for its craven attitude to Islamic radicalism (the underwear bomber Farouk Abdulmutallab was radicalized while a student there), its toleration of fundamentalist guest speakers who advocate the murder of Jews and homosexuals, and its willingness to let extremist student groups censor free speech.

One of the more depressing aspects of the affair has been the number of clever and influential people, not all of them women, who have stated that even if Hunt was joking, he still deserved to be punished. These people genuinely believe that jokes about alleged differences between the sexes are beyond the pale—the cause of anti-sexism, like that of anti-racism, being simply too important or too fragile to tolerate subversive humor.

Uta Frith, the chair of the Royal Society’s diversity committee, has written that “as the case of Tim Hunt has shown, prejudice is unacceptable even if meant in jest.” She actually celebrated the Twitter lynching of Hunt, even after it was clear that Hunt had been dishonestly maligned for making fun of himself, as “a catalyst for a deep seated bitterness to pour out of people…bitterness about injustice, pure and simple.” As an example of intellectual and moral degradation in an elite institution, you can’t do much better than this.

Hunt experienced in less than two months’ time something similar to the process of denunciation, destruction, and rehabilitation that the main character in Milan Kundera’s autobiographical novel The Joke (1967) endured over a period of many years. Set in Stalinist Czechoslovakia, The Joke tells the story of Ludvik, a student who sends a jesting postcard to his girlfriend that concludes with the words “Long Live Trotsky.” Ludvik is actually an enthusiastic supporter of the relatively new Communist regime, but that doesn’t prevent him from being denounced, expelled from college, expelled from the Party, and then sent off to a labor battalion. Ludvik is too young and naive to understand that totalitarian systems have very limited tolerance for humor and see it as dangerous and subversive. Perhaps Hunt was too old and naive to realize that the worlds of science, education, and “science journalism” are policed by people who are not exactly totalitarians but whose obsession with “correct” language and thought is incompatible with humor and intellectual freedom.

It is a phenomenon that combines modern ideology with quasi-Victorian notions of “respectable” behavior and feminine fragility. For these witch-hunters, there can be no toleration of “inappropriate” speech by the contemporary equivalent of “Society.” The wrong kind of joke, breed of joke-teller, or even the wrong political opinion, moreover, creates a “hostile environment” that supposedly intimidates the sensitive victim to such a degree that she cannot function on an equal level. The Hunt affair shows that this way of thinking doesn’t hold sway on American campuses alone. It has crossed the Atlantic and spread outward and upward.

On the other hand, believers in free inquiry and freedom of speech can perhaps take comfort in one aspect of the Hunt case: Perhaps this brand of intolerance and academic McCarthyism is a specifically Anglo-Saxon or at least Anglophone affliction. Granted, there are signs of it in Scandinavia and other parts of Europe where British and American intellectual fashion is especially strong, but other cultures—perhaps those that have wrestled with or are still wrestling with real misogyny and actual limitations on free expression—might be immune. If Korean, Russian, and Malaysian female scientists like those in Hunt’s audience are able and willing to appreciate the ironic, self-deprecating jokes of an old Brit, there is still hope.


Footnotes

1 Readers may remember the story of Justine Sacco, the public-relations consultant in New York who made an ironic joke about AIDS on Twitter as she boarded a plane to Johannesburg. She was quickly lambasted online as a racist by tens of thousands of people around the world, and her career was over by the time her plane landed. All of it happened while she was entirely ignorant of the mob that had pursued and bagged her.
The writer Jon Ronson, who has taken part in such campaigns, has pointed out “the disconnect between the severity of the crime and the gleeful savagery of the punishment.” His excellent book “So You’ve Been Publicly Shamed” is very much worth reading on the subject.

2 As St. Louis blogged for Scientific American, “I discussed [Hunt’s comments] with a couple of colleagues I’d been sitting next to. . . . We decided that I should publish the story on Twitter since it had a British angle.” Later, when challenged on her version of Hunt’s words, St. Louis claimed that the trio of them took notes. But in one of his messages sent in her support, Oransky admitted that none of them had written anything down.

 

https://www.commentarymagazine.com/article/the-timothy-hunt-witch-hunt/
