New England’s hidden history

More than we like to think, the North was built on slavery.

In the year 1755, a black slave named Mark Codman plotted to kill his abusive master. A God-fearing man, Codman had resolved to use poison, reasoning that if he could kill without shedding blood, it would be no sin. Arsenic in hand, he and two female slaves poisoned the tea and porridge of John Codman repeatedly. The plan worked — but like so many stories of slave rebellion, this one ended in brutal death for the slaves as well. After a trial by jury, Mark Codman was hanged, tarred, and then suspended in a metal gibbet on the main road to town, where his body remained for more than 20 years.

It sounds like a classic account of Southern slavery. But Codman’s body didn’t hang in Savannah, Ga.; it hung in present-day Somerville, Mass. And the reason we know just how long Mark the slave was left on view is that Paul Revere passed it on his midnight ride. In a fleeting mention from Revere’s account, the horseman described galloping past “Charlestown Neck, and got nearly opposite where Mark was hung in chains.”

When it comes to slavery, the story that New England has long told itself goes like this: Slavery happened in the South, and it ended thanks to the North. Maybe we had a little slavery, early on. But it wasn’t real slavery. We never had many slaves, and the ones we did have were practically family. We let them marry, we taught them to read, and soon enough, we freed them. New England is the home of abolitionists and underground railroads. In the story of slavery — and by extension, the story of race and racism in modern-day America — we’re the heroes. Aren’t we?

As the nation prepares to mark the 150th anniversary of the American Civil War in 2011, with commemorations that reinforce the North/South divide, researchers are offering uncomfortable answers to that question, unearthing more and more of the hidden stories of New England slavery — its brutality, its staying power, and its silent presence in the very places that have become synonymous with freedom. With the markers of slavery forgotten even as they lurk beneath our feet — from graveyards to historic homes, from Lexington and Concord to the halls of Harvard University — historians say it is time to radically rewrite America’s slavery story to include its buried history in New England.

“The story of slavery in New England is like a landscape that you learn to see,” said Anne Farrow, who co-wrote “Complicity: How the North Promoted, Prolonged, and Profited From Slavery” and who is researching a new book about slavery and memory. “Once you begin to see these great seaports and these great historic houses, everywhere you look, you can follow it back to the agricultural trade of the West Indies, to the trade of bodies in Africa, to the unpaid labor of black people.”

It was the 1991 discovery of an African burial ground in New York City that first revived the study of Northern slavery. Since then, fueled by educators, preservationists, and others, momentum has been building to recognize histories hidden in plain sight. Last year, Connecticut became the first New England state to formally apologize for slavery. In classrooms across the country, popularity has soared for educational programs on New England slavery designed at Brown University. In February, Emory University will hold a major conference on the role slavery’s profits played in establishing American colleges and universities, including in New England. And in Brookline, Mass., a program called Hidden Brookline is designing a virtual walking tour to illuminate its little-known slavery history: At one time, nearly half the town’s land was held by slave owners.

“What people need to understand is that, here in the North, while there were not the large plantations of the South or the Caribbean islands, there were families who owned slaves,” said Stephen Bressler, director of Brookline’s Human Relations-Youth Resources Commission. “There were businesses actively involved in the slave trade, either directly in the importation or selling of slaves on our shores, or in the shipbuilding, insurance, manufacturing of shackles, processing of sugar into rum, and so on. Slavery was a major stimulus to the Northern economy.”

Turning over the stones to find those histories isn’t just a matter of correcting the record, he and others say. It’s crucial to our understanding of the New England we live in now.

“The absolute amnesia about slavery here on the one hand, and the gradualness of slavery ending on the other, work together to make race a very distinctive thing in New England,” said Joanne Pope Melish, who teaches history at the University of Kentucky and wrote the book “Disowning Slavery: Gradual Emancipation and ‘Race’ in New England, 1780-1860.” “If you have obliterated the historical memory of actual slavery — because we’re the free states, right? — that makes it possible to turn around and look at a population that is disproportionately poor and say, it must be their own inferiority. That is where New England’s particular brand of racism comes from.”

Dismantling the myths of slavery doesn’t mean ignoring New England’s role in ending it. In the 1830s and ’40s, an entire network of white Connecticut abolitionists emerged to house, feed, clothe, and aid in the legal defense of Africans from the slave ship Amistad, a legendary case that went all the way to the US Supreme Court and helped mobilize the fight against slavery. Perhaps nowhere were abolition leaders more diehard than in Massachusetts: Pacifist William Lloyd Garrison and writer Henry David Thoreau were engines of the antislavery movement. Thoreau famously refused to pay his taxes in protest of slavery, part of a philosophy of civil disobedience that would later influence Martin Luther King Jr. But Thoreau was tame compared to Garrison, a flame-thrower known for shocking audiences. Founder of the New England Anti-Slavery Society and the newspaper The Liberator, Garrison once burned a copy of the US Constitution at a July Fourth rally, calling it “a covenant with death.” His cry for total, immediate emancipation made him a target of death threats and kept the slavery question at a perpetual boil, fueling the moral argument that, in time, would come to frame the Civil War.

But to focus on crusaders like Garrison is to ignore ugly truths about how unwillingly New England as a whole turned the page on slavery. Across the region, scholars have found, slavery here died a painfully gradual death, with emancipation laws and judicial rulings that either were unclear, poorly enforced, or written with provisions that kept slaves and the children born to them in bondage for years.

Meanwhile, whites who had trained slaves to do skilled work refused to hire the same blacks who were now free, driving an emerging class of skilled workers back to the lowest rungs of unskilled labor. Many whites, driven by reward money and racial hatred, continued to capture and return runaway Southern slaves; some even sent free New England blacks south, knowing no questions about identity would be asked at the other end. And as surely as there was abolition, there was “bobalition” — the mocking name given to graphic, racist broadsides printed through the 1830s, ridiculing free blacks with characters like Cezar Blubberlip and Mungo Mufflechops. Plastered around Boston, the posters had a subtext that seemed to boil down to this: Who do these people think they are? Citizens?

“Is Garrison important? Yes. Is it dangerous to be an abolitionist at that time? Absolutely,” said Melish. “What is conveniently forgotten is the number of people making a living snagging free black people in a dark alley and shipping them south.”

Growing up in Lincoln, Mass., historian Elise Lemire vividly remembers learning of the horrors of a slaveocracy far, far away. “You knew, for example, that families were split up, that people were broken psychologically and kept compliant by the fear of your husband or wife being sold away, or your children being sold away,” said Lemire, author of the 2009 book “Black Walden,” who became fascinated with former slaves banished to squatter communities in Walden Woods.

As she peeled back the layers, Lemire discovered a history rarely seen by the generations of tourists and schoolchildren who have learned to see Concord as a hotbed of antislavery activism. “Slaves [here] were split up in the same way,” she said. “You didn’t have any rights over your children. Slave children were given away all the time, sometimes when they were very young.”

In Lemire’s Concord, slave owners once filled half of town government seats, and in one episode town residents rose up to chase down a runaway slave. Some women remained enslaved into the 1820s, more than 30 years after census figures recorded no existing slaves in Massachusetts. According to one account, a former slave named Brister Freeman, for whom Brister’s Hill in Walden Woods is named, was locked inside a slaughterhouse shed with an enraged bull as his white tormentors laughed outside the door. And in Concord, Lemire argues, black families were not so much liberated as they were abandoned to their freedom, released by masters increasingly fearful their slaves would side with the British enemy. With freedom, she said, came immediate poverty: Blacks were forced to squat on small plots of the town’s least arable land, and eventually pushed out of Concord altogether — a precursor to the geographic segregation that continues to divide black and white in New England.

“This may be the birthplace of a certain kind of liberty,” Lemire said, “but Concord was a slave town. That’s what it was.”

If Concord was a slave town, historians say, Connecticut was a slave state. It didn’t abolish slavery until 1848, a little more than a decade before the Civil War. (A judge’s ruling ended legal slavery in Massachusetts in 1783, though the date is still hotly debated by historians.) It’s a history Connecticut author and former Hartford Courant journalist Anne Farrow knew nothing about — until she got drawn into an assignment to find the untold story of one local slave.

Once she started pulling the thread, Farrow said, countless histories unfurled: accounts of thousand-acre slave plantations and a livestock industry that bred the horses that turned the giant turnstiles of West Indian sugar mills. Each discovery punctured another slavery myth. “A mentor of mine has said New England really democratized slavery,” said Farrow. “Where in the South a few people owned so many slaves, here in the North, many people owned a few. There was a widespread ownership of black people.”

Perhaps no New England colony or state profited more from the unpaid labor of blacks than Rhode Island: Following the Revolution, scholars estimate, slave traders in the tiny Ocean State controlled between two-thirds and 90 percent of America’s trade in enslaved Africans. On the rolling farms of Narragansett, nearly one-third of the population was black — a proportion not much different from Southern plantations. In 2003, the push to reckon with that legacy hit a turning point when Brown University, led by its first African-American president, launched a highly controversial effort to account for its ties to Rhode Island’s slave trade. Today, that ongoing effort includes the CHOICES program, an education initiative whose curriculum on New England slavery is now taught in over 2,000 classrooms.

As Brown’s decision made national headlines, Katrina Browne, a Boston filmmaker, was on a more private journey through New England slavery, tracing her bloodlines back to her Rhode Island forebears, the DeWolf family. As it turned out, the DeWolfs were the biggest slave-trading family in the nation’s biggest slave-trading state. Browne’s journey, which she chronicled in the acclaimed documentary “Traces of the Trade: A Story from the Deep North,” led her to a trove of records of the family’s business at every point in slavery’s triangle trade. Interspersed among the canceled checks and ship logs, Browne said, she caught glimpses into everyday life under slavery, like the diary entry by an overseer in Cuba that began, “I hit my first Negro today for laughing at prayers.” Today, Browne runs the Tracing Center, a nonprofit to foster education about the North’s complicity in slavery.

“I recently picked up a middle school textbook at an independent school in Philadelphia, and it had sub-chapter headings for the Colonial period that said ‘New England,’ and then ‘The South and Slavery,’ ” said Browne, who has trained park rangers to talk about Northern complicity in tours of sites like Philadelphia’s Liberty Bell. “Since learning about my family and the whole North’s role in slavery, I now consider these things to be my problem in a way that I didn’t before.”

If New England’s amnesia has been pervasive, it has also been willful, argues C.S. Manegold, author of the new book “Ten Hills Farm: The Forgotten History of Slavery in the North.” That’s because many of slavery’s markers aren’t hidden or buried. In New England, one need look no further than a symbol that graces welcome mats, door knockers, bedposts, and all manner of household decor: the pineapple. That exotic fruit, said Manegold, is as intertwined with slavery as the Confederate flag: When New England ships came to port, captains would impale pineapples on a fence post, a sign to everyone that they were home and open for business, bearing the bounty of slave labor and sometimes slaves themselves.

“It’s a symbol everyone knows the benign version of — the happy story that pineapples signify hospitality and welcome,” said Manegold, whose book centers on five generations of slaveholders tied to one Colonial era estate, the Royall House and Slave Quarters in Medford, Mass., now a museum. The house features two carved pineapples at its gateposts.

By Manegold’s account, pineapples were just the beginning at this particular Massachusetts farm: Generation after generation, history at the Royall House collides with myths of freedom in New England — starting with one of the most mythical figures of all, John Winthrop. Author of the celebrated “City Upon a Hill” sermon and first governor of the Massachusetts Bay Colony, Winthrop not only owned slaves at Ten Hills Farm, but in 1641, he helped pass one of the first laws making chattel slavery legal in North America.

When the house passed to the Royalls, Manegold said, it entered a family line whose massive fortune came from slave plantations in Antigua. Members of the Royall family would eventually give land and money that helped establish Harvard Law School. To this day, the law school bears a seal borrowed from the Royall family crest, and for years the Royall Professorship of Law remained the school’s most prestigious faculty post, almost always occupied by the law school dean. It wasn’t until 2003 that an incoming dean — now Supreme Court Justice Elena Kagan — quietly turned the title down.

Kagan didn’t publicly explain her decision. But her actions speak to something Manegold and others say could happen more broadly: not just inserting footnotes to New England heritage tours and history books, but truly recasting that heritage in all its painful complexity.

“In Concord,” Lemire said, “the Minutemen clashed with the British at the Old North Bridge within sight of a man enslaved in the local minister’s house. The fact that there was slavery in the town that helped birth American liberty doesn’t mean we shouldn’t celebrate the sacrifices made by the Minutemen. But it does mean New England has to catch up with the rest of the country, in much of which residents have already wrestled with their dual legacies of freedom and slavery.”

Francie Latour is an associate editor at Wellesley magazine and a former Globe reporter.

____________

Full article and photo: http://www.boston.com/bostonglobe/ideas/articles/2010/09/26/new_englands_hidden_history/

A short history of presidential primaries

Although a Niagara of vitriol is drenching politics, the two parties are acting sensibly and in tandem about something once considered a matter of constitutional significance — the process by which presidential nominations are won.

The 2012 process will begin 17 months from now — in February rather than January. Under rules adopted by both parties’ national committees, no delegates to the national conventions shall be selected before the first Tuesday in March — except for delegates from New Hampshire, South Carolina and Nevada. Iowa may still conduct its caucuses, which do not select delegates, in February.

It is not graven on the heart of man by the finger of God that the Entitled Four shall go first, but it might as well be. Although they have just 3.8 percent of the nation’s population, they do represent four regions. Anyway, they shall have the spotlight to themselves until the deluge of delegate selections begins — perhaps in March but preferably in April.

Any Republican delegate-selection event held before the first day of April shall be penalized: The result cannot be, as many Republicans prefer, a winner-take-all allocation of delegates. March events “shall provide for the allocation of delegates on a proportional basis.” This means only that some of the delegates must be allocated in proportion to the total vote.

Because Democrats are severe democrats, they have no winner-take-all events, so they do not have this stick with which to discipline disobedient states. Instead, they brandish — they are, after all, liberals — a carrot: States will be offered bonus delegates for moving their nominating events deeper into the nominating season, and for clustering their contests with those of neighboring states.

Each party wants to maximize its chance of nominating a strong candidate and — this is sometimes an afterthought — one who would not embarrass it as president. So both parties have equal interests in lengthening the nominating process to reduce the likelihood that a cascade of early victories will settle nomination contests before they have performed their proper testing-and-winnowing function.

With states jockeying for early positions, the danger has been that the process will become compressed into something similar to an early national primary. This would heavily favor well-known and well-funded candidates and would virtually exclude everyone else.

There have been other proposals. One would divide the nation into four regions voting on monthly intervals, with the order of voting rotating every four years. Another would spread voting over 10 two-week intervals, with the largest states voting last, thereby giving lesser-known candidates a chance to build strength.

Such plans, however, require cooperation approaching altruism among the states, which should not be counted on. Instead, the two parties are in a Madisonian mood, understanding that incentives are more reliable than moral exhortations in changing behavior.

Speaking of the sainted Madison, the parties’ reforms are a small step back toward what the Constitution envisioned: settled rules for something important. The nation’s Founders considered the selection of presidential candidates so crucial that they wanted the process to be controlled by the Constitution. So they devised a system under which the nomination of presidential candidates and the election of a president occurred simultaneously:

Electors meeting in their respective states, in numbers equal to their states’ senators and representatives, would vote for two candidates for president. When Congress counted the votes, the one with the most would become president, the runner-up vice president.

This did not survive the quick emergence of parties. After the presidential election of 1800, which was settled in the House after 36 votes, the 12th Amendment was adopted, and suddenly the nation had what it has had ever since — a process of paramount importance but without settled rules. The process has been a political version of the “tragedy of the commons” — by everyone acting self-interestedly, everyone’s interests are injured.

In 1952, Sen. Estes Kefauver of Tennessee won every Democratic primary he entered except Florida’s, which was won by Sen. Richard Russell of Georgia. So the nominee was . . . Illinois Gov. Adlai Stevenson. Party bosses, a species as dead as the dinosaurs, disliked Kefauver.

Today, the parties’ modest reforms — the best kind — have somewhat reduced the risks inherent in thorough democratization of the nomination process. Certainly the democratization has not correlated with dramatic improvements in the caliber of nominees. And the current president, whose campaign was his qualification for the office, is proof that even a protracted and shrewd campaign is not an infallible predictor of skillful governance.

George F. Will, Washington Post

__________

Full article and photo: http://www.washingtonpost.com/wp-dyn/content/article/2010/09/24/AR2010092402649.html

How to Raise Boys Who Read

Hint: Not with gross-out books and video-game bribes.

When I was a young boy, America’s elite schools and universities were almost entirely reserved for males. That seems incredible now, in an era when headlines suggest that boys are largely unfit for the classroom. In particular, they can’t read.

According to a recent report from the Center on Education Policy, for example, substantially more boys than girls score below the proficiency level on the annual National Assessment of Educational Progress reading test. This disparity goes back to 1992, and in some states the percentage of boys proficient in reading is now more than ten points below that of girls. The male-female reading gap is found in every socio-economic and ethnic category, including the children of white, college-educated parents.

The good news is that influential people have noticed this problem. The bad news is that many of them have perfectly awful ideas for solving it.

Everyone agrees that if boys don’t read well, it’s because they don’t read enough. But why don’t they read? A considerable number of teachers and librarians believe that boys are simply bored by the “stuffy” literature they encounter in school. According to a revealing Associated Press story in July, these experts insist that we must “meet them where they are”—that is, pander to boys’ untutored tastes.

For elementary- and middle-school boys, that means “books that exploit [their] love of bodily functions and gross-out humor.” AP reported that one school librarian treats her pupils to “grossology” parties. “Just get ’em reading,” she counsels cheerily. “Worry about what they’re reading later.”

There certainly is no shortage of publishers ready to meet boys where they are. Scholastic has profitably catered to the gross-out market for years with its “Goosebumps” and “Captain Underpants” series. Its latest bestsellers are the “Butt Books,” a series that began with “The Day My Butt Went Psycho.”

The more venerable houses are just as willing to aim low. Penguin, which once used the slogan, “the library of every educated person,” has its own “Gross Out” line for boys, including such new classics as “Sir Fartsalot Hunts the Booger.”

Workman Publishing made its name telling women “What to Expect When You’re Expecting.” How many of them expected they’d be buying “Oh, Yuck! The Encyclopedia of Everything Nasty” a few years later from the same publisher? Even a self-published author like Raymond Bean—nom de plume of the fourth-grade teacher who wrote “SweetFarts”—can make it big in this genre. His flatulence-themed opus hit no. 3 in children’s humor on Amazon. The sequel debuts this fall.

Education was once understood as training for freedom. Not merely the transmission of information, education entailed the formation of manners and taste. Aristotle thought we should be raised “so as both to delight in and to be pained by the things that we ought; this is the right education.”

“Plato before him,” writes C. S. Lewis, “had said the same. The little human animal will not at first have the right responses. It must be trained to feel pleasure, liking, disgust, and hatred at those things which really are pleasant, likeable, disgusting, and hateful.”

This kind of training goes against the grain, and who has time for that? How much easier to meet children where they are.

One obvious problem with the SweetFarts philosophy of education is that it is more suited to producing a generation of barbarians and morons than to raising the sort of men who make good husbands, fathers and professionals. If you keep meeting a boy where he is, he doesn’t go very far.

The other problem is that pandering doesn’t address the real reason boys won’t read. My own experience with six sons is that even the squirmiest boy does not require lurid or vulgar material to sustain his interest in a book.

So why won’t boys read? The AP story drops a clue when it describes the efforts of one frustrated couple with their 13-year-old unlettered son: “They’ve tried bribing him with new video games.” Good grief.

The appearance of the boy-girl literacy gap happens to coincide with the proliferation of video games and other electronic forms of entertainment over the last decade or two. Boys spend far more time “plugged in” than girls do. Could the reading gap have more to do with competition for boys’ attention than with their supposed inability to focus on anything other than outhouse humor?

Dr. Robert Weis, a psychology professor at Denison University, confirmed this suspicion in a randomized controlled trial of the effect of video games on academic ability. Boys with video games at home, he found, spend more time playing them than reading, and their academic performance suffers substantially. Hard to believe, isn’t it? But Science has spoken.

The secret to raising boys who read, I submit, is pretty simple—keep electronic media, especially video games and recreational Internet, under control (that is to say, almost completely absent). Then fill your shelves with good books.

People who think that a book—even R.L. Stine’s grossest masterpiece—can compete with the powerful stimulation of an electronic screen are kidding themselves. But on the level playing field of a quiet den or bedroom, a good book like “Treasure Island” will hold a boy’s attention quite as well as “Zombie Butts from Uranus.” Who knows—a boy deprived of electronic stimulation might even become desperate enough to read Jane Austen.

Most importantly, a boy raised on great literature is more likely to grow up to think, to speak, and to write like a civilized man. Whom would you prefer to have shaped the boyhood imagination of your daughter’s husband—Raymond Bean or Robert Louis Stevenson?

I offer a final piece of evidence that is perhaps unanswerable: There is no literacy gap between home-schooled boys and girls. How many of these families, do you suppose, have thrown grossology parties?

Mr. Spence is president of Spence Publishing Company in Dallas.

__________

Full article and photo: http://online.wsj.com/article/SB10001424052748704271804575405511702112290.html

Visigoths at the gate?

When facing a tsunami, what do you do? Pray, and tell yourself stories. I am not privy to the Democrats’ private prayers, but I do hear the stories they’re telling themselves. The new meme is that there’s a civil war raging in the Republican Party. The Tea Party will wreck it from within and prove to be the Democrats’ salvation.

I don’t blame anyone for seeking a deus ex machina when about to be swept out to sea. But this salvation du jour is flimsier than most.

In fact, the big political story of the year is the contrary: that a spontaneous and quite anarchic movement with no recognized leadership or discernible organization has been merged with such relative ease into the Republican Party.

The Tea Party could have become Perot ’92, an anti-government movement that spurned the Republicans, went third-party and cost George H.W. Bush reelection, ending 12 years of Republican rule. Had the Tea Party gone that route, it would have drained the Republican Party of its most mobilized supporters and deprived Republicans of the sweeping victory that awaits them on Nov. 2.

Instead, it planted its flag within the party and, with its remarkable energy, created the enthusiasm gap. Such gaps are measurable. This one is a chasm. This year’s turnout for the Democratic primaries (as a percentage of eligible voters) was the lowest ever recorded. Republican turnout was the highest since 1970.

True, Christine O’Donnell’s nomination in Delaware may cost the Republicans an otherwise safe seat (and possibly control of the Senate), and Sharron Angle in Nevada is running only neck-and-neck with an unpopular Harry Reid. On balance, however, the Tea Party contribution is a large net plus, with its support for such strong candidates as Marco Rubio of Florida, Pat Toomey of Pennsylvania, Joe Miller of Alaska, and Mike Lee of Utah. Even Rand Paul, he of the shaky start in Kentucky, sports an eight-point lead. All this in addition to the significant Tea Party contribution to the tide that will carry dozens of Republicans into the House.

Nonetheless, some Democrats have convinced themselves that they have found the issue with which to salvage 2010. “President Obama’s political advisers,” reports the New York Times, “are considering a range of ideas, including national advertisements, to cast the Republican Party as all but taken over by Tea Party extremists.”

Sweet irony. Fear-over-hope rides again, this time with Democrats in the saddle warning darkly about “the Republican Tea Party” (Joe Biden). Message: Vote Democratic and save the nation from a Visigoth mob with a barely concealed tinge of racism.

First, this is so at variance with reality that it’s hard to believe even liberals believe it. The largest Tea Party event yet was the recent Glenn Beck rally on the Mall. The hordes descending turned out to be several hundred thousand cheerful folks in what, by all accounts, had the feel of a church picnic. And they left the place nearly spotless — the first revolution in recorded history that collected its own trash.

Second, the general public is fairly evenly split in its views of the Tea Party. It experiences none of the horror that liberals do — and think others should. Moreover, the electorate supports by 2-to-1 the Tea Party signature issues of smaller government and lower taxes.

Third, you would hardly vote against the Republican in your state just because there might be a (perceived) too-conservative Republican running somewhere else. How would, say, Paul running in Kentucky deter someone from voting for Mark Kirk in Illinois? Or, to flip the parties, will anyone in Nevada refuse to vote for Harry Reid because Chris Coons, a once self-described “bearded Marxist,” is running as a Democrat in Delaware?

Fourth, what sane Democrat wants to nationalize an election at a time of 9.6 percent unemployment and such disappointment with Obama that just this week several of his own dreamy 2008 supporters turned on him at a cozy town hall? The Democrats’ only hope is to run local campaigns on local issues. That’s how John Murtha’s former district director hung on to his boss’s seat in a special election in Pennsylvania.

Newt Gingrich had to work hard — getting Republican candidates to sign the Contract with America — to nationalize the election that swept Republicans to victory in 1994. A Democratic anti-Tea Party campaign would do that for the Republicans — nationalize the election, gratis — in 2010. As a very recent former president — now preferred (Public Policy Polling, Sept. 1) in bellwether Ohio over the current one by 50 percent to 42 percent — once said: Bring ’em on.

Charles Krauthammer, Washington Post

__________

Full article: http://www.washingtonpost.com/wp-dyn/content/article/2010/09/23/AR2010092304746.html

Can a president lead with Woodward watching?

Question of the day: Why do presidents give the White House keys to Bob Woodward?

I ask this with all due deference, respect, hat in hand, cape over puddle and other sundry gestures owed by ink-stained wretches like me to the Most Famous Journalist on the Planet.

Through several administrations, Woodward has become president ex officio — or at least reporter in chief, a human tape recorder who issues history’s first draft even as history is still tying its shoes.

For years he’s been the best-selling first read on a president’s inner struggles. His latest, “Obama’s Wars,” exposes infighting in the West Wing over how to handle Afghanistan.

The suggestion that there was discord in the Oval Office over whether to increase troop numbers in a brutal war theater is, frankly, of great consolation. If we didn’t worry ourselves sick about putting lives on the line, what exactly would we concern ourselves with? Who’s dancing next with the stars?

What is of some concern — at least based on those excerpts that have leaked thus far — is that the president gets pushed around by the generals. And that impression feeds into the larger one that Barack Obama is not quite commander in chief. He seems far more concerned with being politically savvy than with winning what he has called the good war.

Cognitive dissonance sets in when Obama declares that “it’s time to turn the page” in the war that he didn’t like — Iraq — and that is not in fact over. Fifty thousand troops remain in Iraq, while the surge in Afghanistan seems to be not enough — or too much for too long, already.

Whatever one’s view of circumstances on the ground, whether in the wars abroad or in domestic skirmishes on Wall Street, Obama seems not to be the man in charge. Nor does it seem that he is even sure of his own intentions. One telling exchange reported by Woodward took place with Sen. Lindsey Graham (R-S.C.). In explaining his July 2011 deadline to begin withdrawing troops from Afghanistan, Obama told Graham:

“I have to say that. I can’t let this be a war without end, and I can’t lose the whole Democratic Party.”

How’s that? We tell the enemy when we’re leaving so the party base doesn’t get upset? Well, of course, public opinion matters in war, as in all things. As we’ve seen before, wars can’t be won without the will of the people at home. But a commander in chief at least ought to know what he’s fighting for and why he’s asking Americans to risk their lives. If it’s not a good enough reason to warrant victory, then maybe it isn’t any longer a good war.

In another telling anecdote, the president asked his aides for a plan “about how we’re going to hand it off and get out of Afghanistan.” Apparently, he didn’t get such a plan. Whose presidency is this anyway?

The White House reportedly isn’t upset with the way the president comes across. His portrayal is consistent with what they consider a positive profile: Obama as thoughtful and reflective. To the list might we add ponderous?

We all want a thoughtful president. As few Democrats tire of reminding us, America and the world have had quite enough of cowboys. But surely we can discard the caricatures and settle on a thoughtful commander who is neither a gunslinger nor a chalk-dusted harrumpher. Surely the twain can meet.

The Woodward Syndrome, meanwhile, presents a dilemma for all presidents. By his presence, events are affected. By our knowledge of what he witnesses, even as history is being created in real time, we can also affect these same events. Is it fair to Obama to critique him as he navigates his own thoughts? Or are we interfering with outcomes by inserting ourselves into conversations to which we were never supposed to be privy?

It’s a conundrum unlikely to be resolved. If anything, in our tell-all, see-all political culture, no struggle will go unrecorded or un-critiqued. The need for strong leadership is, therefore, all the greater.

There’s a saying that seems applicable here: Work like you don’t need money, love like you’ve never been hurt, dance like no one’s watching.

Note to President Obama: Lead like there’s no tomorrow. No midterm election, no presidential reelection, no party base. Liberate yourself from the Woodward Syndrome, figure out what you think, and lead.

You are commander in chief, after all. Half the country may disagree with you, but they’ll respect you in the morning.

Kathleen Parker, Washington Post

__________

Full article: http://www.washingtonpost.com/wp-dyn/content/article/2010/09/24/AR2010092404221_pf.html

Homo administrans

The biology of business

Biologists have brought rigour to psychology, sociology and even economics. Now they are turning their attention to the softest science of all: management

SCURRYING around the corridors of the business school at the National University of Singapore (NUS) in his white lab coat last year, Michael Zyphur must have made an incongruous sight. Visitors to management schools usually expect the staff to sport suits and ties. Dr Zyphur’s garb was, however, no provocative fashion statement. It is de rigueur for anyone dealing with biological samples, and he routinely collects such samples as part of his research on, of all things, organisational hierarchies. He uses them to look for biological markers, in the form of hormones, that might either cause or reflect patterns of behaviour that are relevant to business.

Since its inception in the early 20th century, management science has been dominated by what Leda Cosmides and John Tooby, two evolutionary psychologists, refer to disparagingly as the standard social science model (SSSM). This assumes that most behavioural differences between individuals are explicable by culture and socialisation, with biology playing at best the softest of second fiddles. Dr Zyphur is part of an insurgency against this idea. What Dr Cosmides and Dr Tooby have done to psychology and sociology, and others have done to economics, he wants to do to management. Consultants often talk of the idea of “scientific” management. He, and others like him, want to make that term meaningful, by applying the rigour of biology.

To do so, they will need to weave together several disparate strands of the subject—genetics, endocrinology, molecular biology and even psychology. If that works, the resulting mixture may provide a new set of tools for the hard-pressed business manager.

To the management born

Say “biology” and “behaviour” in the same sentence, and most minds think of genetics and the vexed question of nature and nurture. In a business context such questions of heredity and environment are the realm of Scott Shane, a professor of management at Case Western Reserve University in Ohio. In a recent book*, Dr Shane proffers a review of the field. Many of his data come from studies of twins—a traditional tool of human geneticists, who are denied the possibility of experimental breeding enjoyed by their confrères who study other species, such as flies and mice.

Identical twins share all of their DNA. Non-identical twins share only half (like all other siblings). Despite a murky past involving the probable fabrication of data by one of the field’s pioneers, Sir Cyril Burt, the science of comparing identical with non-identical twins is still seen as a good way of distinguishing the effects of genes from those of upbringing.
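
To make the twin logic concrete (an illustration, not spelled out in the article): the classic estimator from such designs, Falconer’s formula, compares the trait correlation among identical twins (r_MZ) with that among fraternal twins (r_DZ). Since identical pairs share all their DNA and fraternal pairs on average half, the gap between the two correlations reflects half the genetic variance, giving roughly

h^2 = 2(r_MZ − r_DZ),   c^2 = 2·r_DZ − r_MZ,   e^2 = 1 − r_MZ

where h^2 is the heritability, c^2 the share of variation due to shared upbringing, and e^2 the share due to unique experience. With made-up numbers: if job satisfaction correlated at 0.5 among identical pairs and 0.3 among fraternal pairs, then h^2 = 2(0.5 − 0.3) = 0.4, meaning about 40% of the variation is genetic, the same order as the income figure cited in the next paragraph.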

The consensus from twin studies is that genes really do account for a substantial proportion of the differences between individuals—and that applies to business as much as it does to the rest of life. Dr Shane observes genetic influence over which jobs people choose, how satisfied they are with those jobs, how frequently they change jobs, how important work is to them and how well they perform (or strictly speaking, how poorly: genes account for over a third of variation between individuals in “censured job performance”, a measure that incorporates reprimands, probation and performance-related firings). Salary also depends on DNA. Around 40% of the variation between people’s incomes is attributable to genetics. Genes do not, however, operate in isolation. Environment is important, too. Part of the mistake made by supporters of the SSSM was to treat the two as independent variables when, in reality, they interact in subtle ways.

Richard Arvey, the head of the NUS business school’s department of management and organisation, has been looking into precisely how genes interact with different types of environment to create such things as entrepreneurial zeal and the ability to lead others. Previous research had shown that people exhibiting personality traits like sensation-seeking are more likely to become entrepreneurs than their less outgoing and more level-headed peers. Dr Arvey and his colleagues found the same effect for extroversion (of which sensation-seeking is but one facet). There was, however, an interesting twist. Their study—of 1,285 pairs of identical twins and 849 pairs of same-sex fraternal ones—suggests that genes help explain extroversion only in women. In men, this trait is instilled environmentally. Businesswomen, it seems, are born. But businessmen are made.

In a second twin study, this time just on men, Dr Arvey asked to what extent leaders are born, and to what extent they are made. Inborn leadership traits certainly do exist, but upbringing, he found, matters too. The influence of genes on leadership potential is weakest in boys brought up in rich, supportive families and strongest in those raised in harsher circumstances. The quip that the battle of Waterloo was won on the playing fields of Eton thus seems to have some truth.

Pathways to success

Twin studies such as these point the way, but they provide only superficial explanations of what is going on. To get at the nitty-gritty it is necessary to dive into molecular biology. And that is the province of people like Song Zhaoli, who is also at the NUS.

One way genes affect behaviour is through the agency of neurotransmitters, the chemicals that carry messages between nerve cells. Among these chemicals, two of the most important are dopamine and serotonin. Dopamine controls feelings of pleasure and reward. Serotonin regulates mood. Some personality traits have been shown to depend on the amounts of these neurotransmitters that slosh around the junctions between nerve cells. Novelty-seeking, for example, is associated with lots of dopamine. A tendency to depression may mean too little serotonin. And the levels of both are regulated by genes, with different variants of the same underlying gene having different effects.

Recent years have seen a surge of research into the links between particular versions of neurotransmitter-related genes and behavioural outcomes, such as voter turnout, risk-aversion, personal popularity and sexual promiscuity. However, studies of work-related traits have hitherto been conspicuous by their absence.

Dr Song has tried to fill this gap. His team have gathered and analysed DNA from 123 Singaporean couples to see if it can be matched with a host of work-related variables, starting with job satisfaction.

In this case Dr Song first checked how prone each participant in the study was to the doldrums, in order to establish a baseline. He also asked whether they had experienced any particularly stressful events, like sustaining serious injury, getting the sack or losing a lot of money, within the previous year. Then he told participants to report moments of negative mood (anger, guilt, sadness or worry) and job satisfaction (measured on a seven-point scale) four times a day for a week, using a survey app installed on their mobile phones.

He knew from previous research that some forms of melancholia, such as seasonal affective disorder (or winter blues), have been linked to particular versions of a serotonin-receptor gene called HTR2A. When he collated the DNA and survey data from his volunteers, he found those with a particular variant of HTR2A were less likely than those carrying one of its two other possible variants to experience momentary negative mood, even if they had had a more stress-ridden year. Dr Song also found that when carriers of that same variant reported lower negative mood, they also tended to report higher job satisfaction—an effect which was absent among people who had inherited the remaining two versions of the gene.

This suggests that for people fortunate enough to come equipped with the pertinent version of HTR2A, stressful events are less likely to have a negative effect on transient mood. What is more, for these optimists, better mood turns out to be directly related to contentment with their job. In other words, it may be a particular variant of a serotonin-receptor gene, and not the employer’s incentives, say, that is making people happier with their work.
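
To see the shape of the analysis just described, here is a deliberately simplified sketch in Python; the study itself would presumably use multilevel models, and the file name and column names below are hypothetical. It computes each participant’s within-person correlation between momentary negative mood and job satisfaction, then averages those correlations within each HTR2A genotype group.

import pandas as pd

# Hypothetical input: one row per survey prompt, with columns
# participant, genotype (HTR2A variant), negative_mood, and
# job_satisfaction (the seven-point scale described above).
df = pd.read_csv("esm_responses.csv")

# Within-person correlation of momentary negative mood with
# momentary job satisfaction, one value per participant.
per_person = (
    df.groupby(["participant", "genotype"])
      .apply(lambda g: g["negative_mood"].corr(g["job_satisfaction"]))
      .rename("mood_satisfaction_r")
      .reset_index()
)

# Average those within-person correlations for each genotype group;
# the effect reported above would appear as a stronger (more negative)
# mean correlation among carriers of the protective variant.
print(per_person.groupby("genotype")["mood_satisfaction_r"].mean())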

The hormonal balance-sheet

Neurotransmitters are not the only way an individual’s genetic make-up is translated into action. Hormones also play a part. For example, oxytocin, which is secreted by part of the brain called the hypothalamus, has been shown to promote trust—a crucial factor in all manner of business dealings. The stress hormone cortisol, meanwhile, affects the assessment of the time value of money.

That, at least, was the conclusion of a study by Taiki Takahashi of Hokkaido University in Japan. After taking saliva samples from 18 volunteers, Dr Takahashi asked them what minimum amount of money they would accept in a year’s time in order to forgo an immediate payout of ¥10,000 (around $90 at the time). He found those with a lower base level of the hormone tended to prefer immediate payment, even when the sum in question was piffling compared with the promised future compensation.
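
What Dr Takahashi was measuring is, in effect, an implied discount rate. With illustrative numbers (not from the study): a volunteer who will not wait a year for anything less than ¥13,000 in place of ¥10,000 today is discounting the future at

r = 13,000 / 10,000 − 1 = 0.30,

that is, 30% a year, while a demand of only ¥10,500 implies a rate of 5%. The cortisol finding can then be restated simply: volunteers with lower baseline levels of the hormone showed the higher implied rates.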

Then there is testosterone, the principal male sex hormone (though women make it too). The literature on this hormone’s behavioural effects is vast. High levels of the stuff have been correlated with risk tolerance, creativity and the creation of new ventures. But testosterone is principally about dominance and hierarchy. This is where Dr Zyphur’s mouth swabs come in.

When Dr Zyphur (who is now at the University of Melbourne) was at the NUS, he led a study of how testosterone is related to status and collective effectiveness in groups. He and his colleagues examined levels of the hormone in 92 mixed-sex groups of about half a dozen individuals. Surprisingly, a group member’s testosterone level did not predict his or her status within the group. What the researchers did discover, though, is that the greater the mismatch between testosterone and status, the less effectively a group’s members co-operate. In a corporate setting that lower productivity translates into lower income.

Testosterone crops up in another part of the business equation, too: sales. It appears, for instance, to be a by-product of conspicuous consumption. In an oft-cited study Gad Saad and John Vongas of Concordia University in Montreal found that men’s testosterone levels responded precisely to changes in how they perceived their status. Testosterone shot up, for example, when they got behind the wheel of a sexy sports car and fell when they were made to drive a clunky family saloon car. The researchers also reported that when a man’s status was threatened in the presence of a female by a display of wealth by a male acquaintance, his testosterone levels surged.

As Dr Saad and Dr Vongas point out, a better understanding of this mechanism could help explain many aspects both of marketing and of who makes a successful salesman. Car salesmen, for example, are stereotypically male and aggressive, which tends to indicate high levels of testosterone. Whether that is really the right approach with male customers is, in light of this research, a moot point.

Natural selection

Results such as these are preliminary. But they do offer the possibility of turning aspects of management science into a real science—and an applied science, to boot. Decisions based on an accurate picture of human nature have a better chance of succeeding than those that are not. For instance, if job satisfaction and leadership turn out to have large genetic components, greater emphasis might be placed on selection than on training.

Not everyone is convinced. One quibble is that many investigations of genetics and behaviour have relied on participants’ retrospective reports of their earlier psychological states, which are often inaccurate. This concern, however, is being allayed with the advent of techniques such as Dr Song’s mobile-sampling method.

Another worry is that, despite the fact that most twin studies have been extensively replicated, they may be subject to systematic flaws. If parents exhibit a tendency to treat identical twins more similarly than fraternal ones, for instance, then what researchers see as genetic factors could turn out to be environmental ones.

That particular problem can be examined by looking at twins who have been fostered or adopted apart, and thus raised in separate households. A more serious one, though, has emerged recently. This is that identical twins may not be as identical as appears at first sight. A process called epigenesis, which shuts down genes in response to environmental prompts, may make their effective genomes different from their actual ones.

Statistically, that would not matter too much if the amount of epigenesis were the same in identical and fraternal twins, but research published last year by Art Petronis of the Centre for Addiction and Mental Health in Toronto and his colleagues suggests it is not. Instead, identical twins are epigenetically closer to each other than the fraternal sort. That means environmentally induced effects that are translated into action by this sort of epigenesis might be confused by researchers with inherited ones.

Still, this and other concerns about the effectiveness of the new science should pass as more data are gathered. But a separate set of concerns may be increased by better data. These are those of an ethical nature, which pop up whenever scientists broach the nature-nurture nexus. Broadly, such concerns divide into three sorts.

The first involves the fear that genetic determinism cheapens human volition. But as Dr Shane is at pains to stress, researchers like him are by no means genetic fatalists. He draws an analogy with sports wagers. Knowing that you have the favourable version of a gene may shift the odds somewhat, but it no more guarantees that you will be satisfied with your job than knowing of a player’s injury ensures that you will cash in on his team’s loss. Indeed, it might be argued that a better understanding of humanity can help direct efforts to counteract those propensities viewed as detrimental or undesirable, thus ensuring people are less, rather than more, in thrall to their biology.

The second set of ethical worries involves those who fret that biological knowledge may be used to serve nefarious ends. Whenever biology meets behaviour the spectre of social Darwinism and eugenics looms menacingly in the background. Yet, just because genetic information can serve evil ends need not mean that it has to. Dr Shane observes that pretending DNA has no bearing on working life does not make those influences go away; it just makes everyone ignorant of what they are. “Everyone, that is, except those who want to misuse the information.”

The third ethical qualm involves the thorny issue of fairness. Ought employers to use genetic testing to select their workers? Will this not lead down a slippery slope to genetic segregation of the sort depicted in the dystopias beloved of science fiction?

This pass, however, has already been sold. Workers are already sometimes hired on the basis of personality tests that try to tease out the very genetic predispositions that biologists are looking for. The difference is that the hiring methods do this indirectly, and probably clumsily. Moreover, in a rare example of legislative foresight, politicians in many countries have anticipated the problem. In 2008, for example, America’s Congress passed the Genetic Information Nondiscrimination Act, banning the use of genetic information in job recruitment. Similar measures had previously been adopted in several European countries, including Denmark, Finland, France and Sweden.

Biohazard?

There is one other group of critics. These are those who worry that applying biology to business is dangerous not because it is powerful, but because it isn’t. To the extent they are genetic at all, behavioural outcomes are probably the result of the interaction of myriad genes in ways that are decades from being fully understood. That applies as much to business-related behaviour as to behaviour in any other facet of life.

Still, as Dr Zyphur is keen to note, not all academic work has to be about hard-nosed application in the here and now. Often, the practical applications of science are serendipitous—and may take a long time to arrive. And even if they never arrive, understanding human behaviour is just plain interesting for its own sake. “We in business schools often act like technicians in the way we conceptualise and teach our topics of study,” he laments. “This owes much to the fact that a business school is more like a trade school than it is a part of classic academia.” Now, largely as a result of efforts by Dr Zyphur and others like him, management science looks set for a thorough, biology-inspired overhaul. Expect plenty more lab coats in business-school corridors.

*“Born Entrepreneurs, Born Leaders: How Your Genes Affect Your Work Life”. Oxford University Press. $29.95

__________

Full article and photos: http://www.economist.com/node/17090697