Tie what on?

Unraveling a euphemism for ‘getting drunk’

Etymological sleuthing isn’t one of my usual pursuits, but last month, as I browsed an old slang dictionary, I stumbled onto a clue that seemed too promising to ignore. A nightcap, said John Russell Bartlett’s 1848 “Dictionary of Americanisms,” is “A glass of hot toddy or gin-sling taken before going to bed at night.”

That we knew, even if gin-sling is no longer our nightcap of choice. But the definition goes on: “When a second glass is taken, it is called ‘a string to tie it [the nightcap] with.’ ” And there’s an example, too, from an 1843 work of fiction: “Come, now, Squire, before we turn in, let us tie the nightcap.”

“Tie the nightcap” was the phrase that hooked me. Could it be related to “tie one on,” that mysterious slang expression meaning “get drunk”?

Nightcap itself, used since the early 19th century, is no mystery at all: Since tying on a nightcap (in the pre-central heating era) was the last thing you did before sleep, the word “nightcap” was applied to the pre-bedtime drink as well. And the author of the “tie the nightcap” quote, a Canadian named Thomas Chandler Haliburton, extended the metaphor in loving detail in the pages of “The Attache; or, Sam Slick in England.” On the eve of a departure, Sam wants to “tie a night-cap,” but his nightcap, he explains, requires embellishments:

“ ‘What a dreadful awful looking thing a night-cap is without a tassel, ain’t it? Oh! you must put a tassel on it, and that is another glass. Well then, what is the use of a night-cap, if it has a tassel on it, but has no string, it will slip off your head the very first turn you take; and that is another glass you know. But one string won’t tie a cap;…you must have two strings to it, and that brings one glass more. Well then, what is the use of two strings if they ain’t fastened? If you want to keep the cap on, it must be tied…

“Mr. Slick ordered materials for brewing, namely: whisky, hot water, sugar and lemon; and having duly prepared in regular succession the cap, the tassel, and the two strings, filled his tumbler again, and said, ‘Come now, Squire, before we turn in, let us tie the night-cap.’ ”

In a tidy linguistic universe, we would find that “tie one on” was the legitimate offspring of “tie the nightcap” – what could be more obvious? But in fact, “tie one on,” recorded only since the 1940s, is something of a puzzle. It may be an offshoot of the “bun” family, a group of expressions for being drunk – “have a bun on,” “get a bun on,” eventually “tie a bun on” – dating to the turn of the 20th century. Unfortunately, this trail is another dead end, since nobody seems to have any idea what the “bun” might have been.

Then there’s the “bag” branch of the drinker’s vocabulary. Farmer and Henley’s 1890 slang dictionary lists “To PUT or GET ONE’S HEAD IN A Bag” as printers’ and sailors’ slang for “drink,” with a quote from an 1887 issue of the Saturday Review: “It is slang, and yet purely trade slang, when one printer says of another that he has GOT HIS HEAD IN THE BAG.”

The “bag” in question may well be the “bag o’ beer” cited in James Redding Ware’s 1909 dictionary of Victorian slang, shorthand for a quart of a blended brew – “half of fourpenny porter and half of fourpenny ale.” By the 1940s, we have “in the bag” (and “half in the bag”), “bagged,” and, yes, “tie a bag on.” The last phrase could have been influenced by similar slang for “eat” – “to put on/tie on the nosebag” – but that gets us no closer to Haliburton’s nightcap strings.

A connection could still turn up, of course – some short-lived slang phrase that links the tied-on nightcap and the baffling “tie on a bun,” say. But from the evidence at hand, it looks as if Haliburton’s elaborate metaphor was just an amusing exercise, not the inspiration for a family of drinking idioms.

So “tie one on” is back to “origin unknown,” a familiar neighborhood for slang historians. Michael Quinion, who writes the weekly World Wide Words newsletter, is a frequent visitor: He’s had to put “malarkey,” “zilch,” “kibosh,” and even “the full Monty” (despite several appealing explanations) into the “unknown” file. We still don’t know why an early edition of a morning paper, published the night before, is called a “bulldog edition”; the truth about “the whole nine yards” eludes us. And Haliburton’s “tie the nightcap,” tempting though it is, can’t yet be considered the foundation for the much later and still elusive “tie one on.”

Jan Freeman, Boston Globe


Full article: http://www.boston.com/bostonglobe/ideas/articles/2010/02/28/tie_what_on/

The week ahead

An election in deeply divided Iraq

• IRAQIS finally go to the polls on Sunday March 7th to vote in their long-delayed national elections. But with many candidates barred from standing because of former ties to Saddam Hussein’s Baath party, sectarian rivalries have once more come to the fore. It is far from certain that any single group will win enough votes to form a government, prompting fears that relations between Sunnis and Shias will deteriorate even further and threaten the country’s fragile recovery.

• A VOTE by the House Foreign Affairs Committee on Thursday March 4th threatens to sour relations between America and Turkey. The congressional committee will consider whether to label the mass slaughter of Ottoman Armenians by Turkish forces in 1915 as a genocide. Previous similar resolutions never made it to a vote in the House of Representatives for fear of damaging relations with an important ally in the Middle East. But a House vote is more likely this time after Barack Obama’s election pledge to recognise the episode as genocide.

• SHAREHOLDERS of Yukos, a Russian oil company that was dismembered by the Russian state and the country’s tax authorities, will take their grievances to the European Court of Human Rights on Thursday March 4th. Yukos’s boss still languishes in jail after the politically motivated attack on his company but investors are hoping that the court will award them some $100 billion in compensation from Russia’s government. The case may drag on for several years. But if shareholders are successful and Russia refuses to pay they could be entitled to seize state assets abroad.

• ICELANDERS are set to hold a referendum on Saturday March 6th to decide whether the country should repay $5.1 billion to the British and Dutch governments. The money was paid out to savers in those countries who had lost money when an Icelandic bank collapsed in 2008. A no vote, the likeliest outcome for a deeply unpopular measure, would be damaging for Iceland’s government and may set back talks on EU membership. Negotiations between the three governments aimed at a compromise over repayment terms broke down on February 25th.


Full article and photo: http://www.economist.com/opinion/displayStory.cfm?story_id=15582437&source=features_box2

And the Orchestra Played On

The other day, I found myself rummaging through a closet, searching for my old viola. This wasn’t how I’d planned to spend the afternoon. I hadn’t given a thought to the instrument in years. I barely remembered where it was, much less how to play it. But I had just gotten word that my childhood music teacher, Jerry Kupchynsky — “Mr. K.” to his students — had died.

In East Brunswick, N.J., where I grew up, nobody was feared more than Mr. K. He ran the town’s music department with a ferocity never before seen in our quiet corner of suburbia. In his impenetrably thick Ukrainian accent, he would berate us for being out of tune, our elbows in the wrong position, our counting out of sync.

“Cellos sound like hippopotamus rising from bottom of river,” he would yell during orchestra rehearsals. Wayward violinists played “like mahnyiak,” while hapless gum chewers “look like cow chewing cud.” He would rehearse us until our fingers were callused, then interrupt us with “Stop that cheekin plocking!”

Mr. K. pushed us harder than our parents, harder than our other teachers, and through sheer force of will made us better than we had any right to be. He scared the daylights out of us.

I doubt any of us realized how much we loved him for it.

Which is why, decades later, I was frantically searching for an instrument whose case still bore the address of my college dorm. After almost a half-century of teaching, at the age of 81, Mr. K. had died of Parkinson’s disease. And across the generations, through Facebook and e-mail messages and Web sites, came the call: it was time for one last concert for Mr. K. — performed by us, his old students and friends.

Now, I used to be a serious student. I played for years in a string quartet with Mr. K.’s violin-prodigy daughters, Melanie and Stephanie. One of my first stories as a Wall Street Journal reporter was a first-person account of being a street musician.

But I had given it up 20 years ago. Work and motherhood intervened; with two children and long hours as an editor, there wasn’t time for music any more. It seemed kind of frivolous. Besides, I wasn’t even sure I would know how.

The hinges creaked when I opened the decrepit case. I was greeted by a cascade of loose horsehair — my bow a victim of mites, the repairman later explained. It was pure agony to twist my fingers into position. But to my astonishment and that of my teenage children — who had never heard me play — I could still manage a sound.

It turned out, a few days later, that there were 100 people just like me. When I showed up at a local school for rehearsal, there they were: five decades’ worth of former students. There were doctors and accountants, engineers and college professors. There were people who hadn’t played in decades, sitting alongside professionals like Mr. K.’s daughter Melanie, now a violinist with the Chicago Symphony Orchestra. There were generations of music teachers.

They flew in from California and Oregon, from Virginia and Boston. They came with siblings and children; our old quartet’s cellist, Miriam, took her seat with 13 other family members.

They came because Mr. K. understood better than anyone the bond music creates among people who play it together. Behind his bluster — and behind his wicked sense of humor and taste for Black Russians — that was his lesson all along.

He certainly learned it the hard way. As a teenager during World War II, he endured two years in a German internment camp. His wife died after a long battle with multiple sclerosis. All those years while we whined that he was riding us too hard, he was raising his daughters and caring for his sick wife on his own. Then his younger daughter Stephanie, a violin teacher, was murdered. After she vanished in 1991, he spent seven years searching for her, never giving up hope until the day her remains were found.

Yet the legacy he had left behind was pure joy. You could see it in the faces of the audience when the curtain rose for the performance that afternoon. You could hear it as his older daughter Melanie, her husband and their violinist children performed as a family. You could feel it when the full orchestra, led by one of Mr. K.’s protégés, poured itself into Tchaikovsky and Bach. It powered us through the lost years, the lack of rehearsal time — less than two hours — and the stray notes from us rustier alums.

Afterward, Melanie took the stage to describe the proud father who waved like a maniac from a balcony in Carnegie Hall the first time she played there. At the end of his life, when he was too ill to talk, she would bring her violin to his bedside and play for hours, letting the melodies speak for them both. The bonds of music were as strong as ever.

In a way, this was Mr. K.’s most enduring lesson — and one he had been teaching us since we were children. Back when we were in high school, Mr. K. had arranged for Melanie and our quartet to play at the funeral of a classmate killed in a horrific car crash. The boy had doted on his little sister, a violinist. We were a reminder of how much he loved to listen to her play.

As the far-flung orchestra members arrived for Mr. K.’s final concert, suddenly we saw her, that little girl, now grown, a professional musician herself. She had never stopped thinking about her brother’s funeral, she told me, and when she heard about this concert, she flew from Denver in the hope that she might find the musicians who played in his honor. For 30 years, she had just wanted the chance to say, “Thank you.”

As did we all.

Joanne Lipman, a former deputy managing editor at The Wall Street Journal, was the founding editor in chief of Condé Nast Portfolio magazine.


Full article: http://www.nytimes.com/2010/02/28/opinion/28lipman.html

Muslims Won’t Play Together

WE may scoff at the idea that the Olympic Games have anything to do with the “endeavor to place sport at the service of humanity and thereby to promote peace,” as the Olympic charter enshrines as its ideal. But at least nations across the world were able to put aside differences for two weeks of friendly competition in Vancouver.

A mundane achievement, perhaps, but it’s one that’s beyond the grasp of the Islamic world. The Islamic Solidarity Games, the Olympics of the Muslim world, which were to be held in Iran in April, have been called off by the Arab states because Tehran inscribed “Persian Gulf” on the tournament’s official logo and medals.

It’s a small but telling controversy. It puts the lie to the idea of the Islamic world as a bloc united by religious values that are hostile to the West. It also gives clues as to how the United States and its allies should handle two of their most urgent foreign policy matters: the Iranian nuclear program and the Israeli-Palestinian conflict.

This is not the first time that Arabs have challenged the internationally accepted name of the waterway that separates Persia (or Iran, as it has been called since 1935) from the Arabian Peninsula. Pan-Arabist thought — which dominated Arab political life for most of the 20th century — insisted on the creation of a unified vast empire “from the Atlantic Ocean to the Arab Gulf,” provoking sharp confrontations with Iran since the late 1960s.

The Islamic regime in Tehran, which came to power in 1979 dismissing nationalism as an imperialist plot aimed at weakening the worldwide Muslim community (or umma), initially displayed less interest in the gulf’s Persian identity than in the spread of its Islamist message. “The Iranian revolution is not exclusively that of Iran, because Islam does not belong to any particular people,” insisted Ayatollah Ruhollah Khomeini. “The struggle will continue until the calls ‘there is no god but Allah and Muhammad is the messenger of Allah’ are echoed all over the world.”

Yet like Stalin, who responded to the Nazi invasion of the Soviet Union in 1941 by urging his people to fight for the motherland rather than for the Communist ideals with which they had been indoctrinated, Khomeini reverted to nationalist rhetoric to rally his subjects after the Iraqi invasion of 1980. He also used the war to justify a string of military and diplomatic actions against the smaller Arab states like Qatar and Kuwait aimed at asserting Iran’s supremacy in the gulf.

In this history of a single body of water, one sees a perfect example of the so-called Islamic Paradox that dates from the seventh century. For although the Prophet Muhammad took great pains to underscore the equality of all believers regardless of ethnicity, categorically forbidding any fighting among the believers, his precepts have been constantly and blatantly violated.

It took a mere 24 years after the Prophet’s death for the head of the universal Islamic community, the caliph Uthman, to be murdered by political rivals. This opened the floodgates to incessant infighting within the House of Islam, which has never ceased. Likewise, there has been no overarching Islamic solidarity transcending the multitude of parochial loyalties — to one’s clan, tribe, village, family or nation. Thus, for example, not only do Arabs consider themselves superior to all other Muslims, but inhabitants of Hijaz, the northwestern part of the Arabian Peninsula and Islam’s birthplace, regard themselves as the only true Arabs, and tend to be highly disparaging of all other Arabic-speaking communities.

Nor, for that matter, has the House of Islam ever formed a unified front vis-à-vis the House of War (as Muslims call the rest of the world). Even during the Crusades, the supposed height of the “clash of civilizations,” Christian and Muslim rulers freely collaborated across the religious divide, often finding themselves aligned with members of the rival religion against their co-religionists. While the legendary Saladin himself was busy eradicating the Latin Kingdom of Jerusalem, for example, he was closely aligned with the Byzantine Empire, the foremost representative of Christendom’s claim to universalism.

This pattern of pragmatic cooperation reached its peak during the 19th century, when the Ottoman Empire relied on Western economic and military support to survive. (The Charge of the Light Brigade of 1854 was, at its heart, part of a French-British effort to keep the Ottomans from falling under Russian hegemony.) It has also become a central feature of 20th- and 21st-century Middle Eastern politics.

Muslim and Arab rulers have always, in their intrigues, sought the support and protection of the “infidel” powers they so vilify. President Gamal Abdel Nasser of Egypt, the champion of pan-Arabism who had built his reputation on standing up to “Western imperialism,” imported more than 10,000 Soviet troops into Egypt when his “War of Attrition” against Israel in the late 1960s went sour.

Similarly, Ayatollah Khomeini bought weapons from even the “Great Satan,” the United States. Saddam Hussein used Western support to survive his war against Iran in the 1980s. And Osama bin Laden and the rest of the Afghan mujahedeen accepted weapons and money from the United States, with the Islamic state of Pakistan as the middleman, in their struggle against the Soviet occupation.

Yet, since it is far easier to unite people through a common hatred than through a shared loyalty, Islamic solidarity has been repeatedly invoked as an instrument for achieving the self-interested ends of those who proclaimed it. Little wonder the covenant of Hamas insists, “When our enemies usurp some Islamic lands, jihad becomes a duty binding on all Muslims.”

So, if the Muslim bloc is just as fractious as any other group of seemingly aligned nations, what does it mean for United States policy in the Islamic world?

For one, it should give us more impetus to take a harder line with Iran. Just as the Muslim governments couldn’t muster the minimum sense of commonality for holding an all-Islamic sports tournament, so they would be unlikely to rush to Iran’s aid in the event of sanctions, or even a military strike.

Beyond the customary lip service about Western imperialism and “Crusaderism,” most other Muslim countries would be quietly relieved to see the extremist regime checked. It’s worth noting that the two dominant Arab states, Egypt and Saudi Arabia, have been at the forefront of recent international efforts to contain Iran’s nuclear ambitions.

As for the Palestinian-Israeli conflict, the idea that bringing peace between the two parties will bring about a flowering of cooperation in the region and take away one of Al Qaeda’s primary gripes against the West totally misreads history and present-day politics. Muslim states threaten Israel’s existence not so much out of concern for the Palestinians, but rather as part of a holy war to prevent the loss of a part of the House of Islam.

In these circumstances, one can only welcome the latest changes in the Obama administration’s Middle Eastern policy, which combine a tougher stance on Iran’s nuclear subterfuge with a less imperious approach to the Arab-Israeli conflict.

Secretary of State Hillary Clinton’s two-track plan — discussion with Tehran while at the same time lining up meaningful sanctions — is fine as far as it goes. But a military strike must remain a serious option: there is no peaceful way to curb Iran’s nuclear ambitions, stemming as they do from its imperialist brand of national-Islamism.

Likewise, there is no way for the Obama administration to resolve the 100-year war between Arabs and Jews unless all sides are convinced that peace is in each of their best interests. Any agreement between Israel and the Palestinians is far less important than a regional agreement in which every Islamic nation can make peace with the idea of Jewish statehood in the House of Islam.

And that, depressingly, is going to be a lot harder to pull off than even the Islamic Solidarity Games.

Efraim Karsh, the head of Middle East and Mediterranean studies at King’s College London, is the author of “Islamic Imperialism: A History” and the forthcoming “Palestine Betrayed.”


Full article and photo: http://www.nytimes.com/2010/02/28/opinion/28karsh.html

Learning From the Sin of Sodom

For most of the last century, save-the-worlders were primarily Democrats and liberals. In contrast, many Republicans and religious conservatives denounced government aid programs, with Senator Jesse Helms calling them “money down a rat hole.”

Over the last decade, however, that divide has dissolved, in ways that many Americans haven’t noticed or appreciated. Evangelicals have become the new internationalists, pushing successfully for new American programs against AIDS and malaria, and doing superb work on issues from human trafficking in India to mass rape in Congo.

A pop quiz: What’s the largest U.S.-based international relief and development organization?

It’s not Save the Children, and it’s not CARE — both terrific secular organizations. Rather, it’s World Vision, a Seattle-based Christian organization (with strong evangelical roots) whose budget has roughly tripled over the last decade.

World Vision now has 40,000 staff members in nearly 100 countries. That’s more staff members than CARE, Save the Children and the worldwide operations of the United States Agency for International Development — combined.

A growing number of conservative Christians are explicitly and self-critically acknowledging that to be “pro-life” must mean more than opposing abortion. The head of World Vision in the United States, Richard Stearns, begins his fascinating book, “The Hole in Our Gospel,” with an account of a visit a decade ago to Uganda, where he met a 13-year-old AIDS orphan who was raising his younger brothers by himself.

“What sickened me most was this question: where was the Church?” he writes. “Where were the followers of Jesus Christ in the midst of perhaps the greatest humanitarian crisis of our time? Surely the Church should have been caring for these ‘orphans and widows in their distress.’ (James 1:27). Shouldn’t the pulpits across America have flamed with exhortations to rush to the front lines of compassion?

“How have we missed it so tragically, when even rock stars and Hollywood actors seem to understand?”

Mr. Stearns argues that evangelicals were often so focused on sexual morality and a personal relationship with God that they ignored the needy. He writes laceratingly about “a Church that had the wealth to build great sanctuaries but lacked the will to build schools, hospitals, and clinics.”

In one striking passage, Mr. Stearns quotes the prophet Ezekiel as saying that the great sin of the people of Sodom wasn’t so much that they were promiscuous or gay as that they were “arrogant, overfed and unconcerned; they did not help the poor and needy.” (Ezekiel 16:49.)

Hmm. Imagine if sodomy laws could be used to punish the stingy, unconcerned rich!

The American view of evangelicals is still shaped by preening television blowhards and hypocrites who seem obsessed with gays and fetuses. One study cited in the book found that even among churchgoers ages 16 to 29, the descriptions most associated with Christianity were “antihomosexual,” “judgmental,” “too involved in politics,” and “hypocritical.”

Some conservative Christians reinforced the worst view of themselves by inspiring Ugandan homophobes who backed a bill that would punish gays with life imprisonment or execution. Ditto for the Vatican, whose hostility to condoms contributes to the AIDS epidemic. But there’s more to the picture: I’ve also seen many Catholic nuns and priests heroically caring for AIDS patients — even quietly handing out condoms.

One of the most inspiring figures I’ve met while covering Congo’s brutal civil war is a determined Polish nun in the terrifying hinterland, feeding orphans, standing up to drunken soldiers and comforting survivors — all in a war zone. I came back and decided: I want to grow up and become a Polish nun.

Some Americans assume that religious groups offer aid to entice converts. That’s incorrect. Today, groups like World Vision ban the use of aid to lure anyone into a religious conversation.

Some liberals are pushing to end the longtime practice (it’s a myth that this started with President George W. Bush) of channeling American aid through faith-based organizations. That change would be a catastrophe. In Haiti, more than half of food distributions go through religious groups like World Vision that have indispensable networks on the ground. We mustn’t make Haitians the casualties in our culture wars.

A root problem is a liberal snobbishness toward faith-based organizations. Those doing the sneering typically give away far less money than evangelicals. They’re also less likely to spend vacations volunteering at, say, a school or a clinic in Rwanda.

If secular liberals can give up some of their snootiness, and if evangelicals can retire some of their sanctimony, then we all might succeed together in making greater progress against common enemies of humanity, like illiteracy, human trafficking and maternal mortality.

Nicholas D. Kristof, New York Times


Full article: http://www.nytimes.com/2010/02/28/opinion/28kristof.html

Handbook suggests that deviations from ‘normality’ are disorders

Peter De Vries, America’s wittiest novelist, died 17 years ago, but his discernment of this country’s cultural foibles still amazes. In a 1983 novel, he spotted the tendency of America’s therapeutic culture to medicalize character flaws:

“Once terms like identity doubts and midlife crisis become current,” De Vries wrote, “the reported cases of them increase by leaps and bounds.” And: “Rapid-fire means of communication have brought psychic dilapidation within the reach of the most provincial backwaters, so that large metropolitan centers and educated circles need no longer consider it their exclusive property, nor preen themselves on their special malaises.”

Life is about to imitate De Vries’s literature, again. The fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM), psychiatry’s encyclopedia of supposed mental “disorders,” is being revised. The 16 years since the last revision evidently were prolific in producing new afflictions. The revision may aggravate the confusion of moral categories.

Today’s DSM defines “oppositional defiant disorder” as a pattern of “negativistic, defiant, disobedient and hostile behavior toward authority figures.” Symptoms include “often loses temper,” “often deliberately annoys people” or “is often touchy.” DSM omits this symptom: “is a teenager.”

This DSM defines as “personality disorders” attributes that once were considered character flaws. “Antisocial personality disorder” is “a pervasive pattern of disregard for . . . the rights of others . . . callous, cynical . . . an inflated and arrogant self-appraisal.” “Histrionic personality disorder” is “excessive emotionality and attention-seeking.” “Narcissistic personality disorder” involves “grandiosity, need for admiration . . . boastful and pretentious.” And so on.

If every character blemish or emotional turbulence is a “disorder” akin to a physical disability, legal accommodations are mandatory. Under federal law, “disabilities” include any “mental impairment that substantially limits one or more major life activities”; “mental impairments” include “emotional or mental illness.” So there might be a legal entitlement to be a jerk. (See above, “antisocial personality disorder.”)

The revised DSM reportedly may include “binge eating disorder” and “hypersexual disorder” (“a great deal of time” devoted to “sexual fantasies and urges” and “planning for and engaging in sexual behavior”). Concerning children, there might be “temper dysregulation disorder with dysphoria.”

This last categorization illustrates the serious stakes in the categorization of behaviors. Extremely irritable or aggressive children are frequently diagnosed as bipolar and treated with powerful antipsychotic drugs. This can be a damaging mistake if behavioral modification treatment can mitigate the problem.

Another danger is that childhood eccentricities, sometimes inextricable from creativity, might be labeled “disorders” to be “cured.” If 7-year-old Mozart tried composing his concertos today, he might be diagnosed with attention-deficit hyperactivity disorder and medicated into barren normality.

Furthermore, intellectual chaos can result from medicalizing the assessment of character. Today’s therapeutic ethos, which celebrates curing and disparages judging, expresses the liberal disposition to assume that crime and other problematic behaviors reflect social or biological causation. While this absolves the individual of responsibility, it also strips the individual of personhood and moral dignity.

James Q. Wilson, America’s preeminent social scientist, has noted how the “abuse excuse” threatens the legal system and society’s moral equilibrium. Writing in National Affairs quarterly (“The Future of Blame”), Wilson notes that genetics and neuroscience seem to suggest that self-control is more attenuated — perhaps to the vanishing point — than our legal and ethical traditions assume.

The part of the brain that stimulates anger and aggression is larger in men than in women, and the part that restrains anger is smaller in men than in women. “Men,” Wilson writes, “by no choice of their own, are far more prone to violence and far less capable of self-restraint than women.” That does not, however, absolve violent men of blame. As Wilson says, biology and environment interact. And the social environment includes moral assumptions, sometimes codified in law, concerning expectations about our duty to desire what we ought to desire.

It is scientifically sensible to say that all behavior is in some sense caused. But a society that thinks scientific determinism renders personal responsibility a chimera must consider it absurd not only to condemn depravity but also to praise nobility. Such moral derangement can flow from exaggerated notions of what science teaches, or can teach, about the biological and environmental roots of behavior.

Or — revisers of the DSM, please note — confusion can flow from the notion that normality is always obvious and normative, meaning preferable. And the notion that deviations from it should be considered “disorders” to be “cured” rather than stigmatized as offenses against valid moral norms.

George F. Will, Washington Post


Full article: http://www.washingtonpost.com/wp-dyn/content/article/2010/02/26/AR2010022603369.html

What’s Wrong With ‘Eating Animals’

Somewhere along the way, Jonathan Safran Foer or his publishers must have realized that the case he makes against American animal farming doesn’t apply tidily to Britain (or to most of Europe). So he’s added a “Preface to the U.K. Edition,” in which he claims that “a remarkably similar story could be told about animal farming in the United Kingdom.” In the next sentence, however, he admits that “there are some important differences: sow stalls (gestation crates) and veal creates [sic, and this is not the only sign of haste in the writing of this preface] are banned in the U.K., whereas they are the norm in America; poultry slaughter is almost certainly less cruel.”

Of course Mr. Foer says the similarities between U.K. and U.S. food-animal welfare standards are “far more, and more important” than the differences. But he has already ceded the ethical high ground to British livestock farmers; and you have to wonder why he’s bothered to publish here a book whose “statistics refer to American agriculture.” Let us deconstruct.

“Approximately 800 million chickens, turkeys and pigs are factory farmed in the United Kingdom every year,” he asserts. There are even more intensively farmed animals if he “were to include cows and fish.” The diet Mr. Foer advocates shuns fish as well as meat. The most recent guess (by vegetarian organizations) is that 10% of U.K. residents are “meat-avoiders,” but my own experience suggests to me that most British “vegetarians” (as opposed to vegans) are fish-eaters. Mr. Foer excludes cows, I suspect, because he’s realized, a little late in the enterprise, that British cattle are largely raised on grass, not cereals as in America. Like most Europeans, Britons prefer the taste of grass-fed beef. I’d have a little more confidence in the universal relevance of “Eating Animals” if the writer showed that he’d taken the trouble to find out what affluent people living outside America actually eat. Figures for vegetarianism in continental Europe are hard to obtain, because the concept of principled abstention from meat is alien to most of its cultures.

We English-speakers are more squeamish, but contemporary Britons eat much more like Europeans than like Americans. For example, a good deal of “Eating Animals” talks about turkey. But, despite the 800 million figure above, in 2006 there were 173 million table birds produced in the UK; 64% were table chickens, 27% laying hens; and only 17 million were turkeys, according to the Farm Business Survey. Why so few turkeys? Because the British, like the French, really only eat turkey once a year, at Christmas. (That should leave a very large balance of pigs, but the latest estimate I could find of the pig population was 4.55 million at the end of 2008. A little “first-person research” on my own part has led me to wonder whether the 800 million figure does not include imports of factory-farmed meat, and depend on a confusion in labeling requirements.)

If he had found out a little more about the eating habits of the natives, Mr. Foer would have discovered that the growth areas are in the sort of farming that places a premium on animal welfare as well as on improving the texture and taste of animals bound for the table. A few hours spent in a supermarket would have convinced him, I feel certain, to scrap the many pages of American statistical analysis and slaughterhouse narrative in this book.

However, the facts alone wouldn’t make him change his mind. Mr. Foer is an imaginative writer, and a very good one; and “this book is the record of a very personal inquiry. Facing the prospect of fatherhood, I wanted to make informed decisions about what to feed my son.” Why is it so difficult for people who give up eating meat to admit that, in the end, it’s a question of sentiment? Why do they feel the need to prop up their feelings with facts? Why can’t they just say: “I don’t like the idea of what happens in the abattoir, so I shall abstain from eating its products”? What’s wrong with saying “I won’t eat anything that had a face or a mother”? Above all, why do “vegetarians” (or pescetarians, as most British vegetarians should call themselves) feel the need to proselytize?

We’ve seen that the application of the animal-welfare arguments is not universal; we can discount the environmental arguments, as it is possible to farm livestock in a non-intensive, eco-friendly fashion, and ethical consumers (such as the readers of this review) source their food carefully. If we take the absolutist position that it is wrong to kill for food, it is not sufficient to be a vegetarian who eats dairy produce and eggs. Milk production entails the destruction of male calves shortly after birth (especially since the sentimentalists have killed off the U.K. veal industry), and egg production requires disposing of male chicks. Vegans eschew all animal products, including shoe leather and honey, and so at least have the virtue of consistency. Oddly, though Mr. Foer is aware that there is a difference between the more radical vegan diet and the diet of those who (illogically, but who cares?) allow themselves the high-grade proteins to be had from eating milk, cheese and eggs, he nowhere makes the distinction.

“Late in the book,” says Mr. Foer, “I note that in a different time or place, I might have made different decisions about eating animals. The United Kingdom is not the different place I was imagining.” It is possible to be a meat-eater in the U.K. without being party to the horrors Mr. Foer lovingly chronicles in the U.S. If only he would take the trouble to find out what British people are really eating, he might change his mind.

Mr. Levy, who writes about the arts for the Journal, is co-chair of the Oxford Symposium on Food and Cookery.


Full article: http://online.wsj.com/article/SB126715497682651975.html