Five Best Groundbreaking Memoirs

The Education of Henry Adams

By Henry Adams (1918)

With its narrator dolefully pointing the way toward modernism, insistently (and convincingly) writing in the third person, “The Education of Henry Adams” is a one-man kaleidoscope of American history: its politics and pretenses, its turn from a patrician, Victorian society toward the unknowable chaos of the 20th century. Adams regarded his efforts at education as a lifelong exercise in passionate failure. Though 100 copies of the book were printed privately in 1907, he withheld general publication until after his death in 1918. What he didn’t write revealed an intimate truth: Adams omitted the story of his wife’s depression and suicide in 1885. Here was a seminal memoir, required reading for every student of intellectual history, in which the Rubicon of a life had been left out! Adams lifts the veil just twice: once when describing his sister’s death, and again when he returns to America and visits the bronze statue at Rock Creek Cemetery in Washington, commissioned from Augustus Saint-Gaudens in his wife’s honor.

Survival in Auschwitz

By Primo Levi (1958)

From the opening sentence—“I was captured by the Fascist Militia on 13 December 1943”—this searingly quiet account by Primo Levi, an Italian chemist, of his 10 months in Auschwitz is a monument of dignity. First published in Italy in 1947 with a title that translates as “If This Is a Man,” the book became a blueprint for every such story that followed, not only as a portrait of the camp’s atrocities but also as a testament to the moments when humanity prevailed. On a mile-long trip with a fellow prisoner to retrieve a 100-pound soup ration, Levi begins to teach his friend “The Canto of Ulysses” from Dante. Completing the lesson becomes urgent, then vital: “It is late, it is late,” Levi realizes, “we have reached the kitchen, I must finish.” No candle has ever shone more brilliantly from within the caverns of evil.

Slouching Towards Bethlehem

By Joan Didion (1968)

If Joan Didion’s first nonfiction collection now seems tethered to the 1960s, it’s partly because so many writers would try to imitate her style: The tenor and cadence were as precise as an atomic clock. She mapped a prevailing culture from the badlands of Southern California and the streets of Haight-Ashbury to the province of her own paranoia, all of it cloaked in jasmine-scented doom. As both background character and prevailing sensibility, Didion brings the reader into her lair: “You see the point. I want to tell you the truth, and already I have told you about the wide rivers.” “Slouching Towards Bethlehem” suggested that memoir was about voice as well as facts. Didion didn’t just intimate a decade of upheaval; she announced it with a starter pistol’s report.

Dispatches

By Michael Herr (1977)

Every war has its Stephen Crane, its Robert Graves—and Vietnam had Michael Herr. He spent a year in-country in 1967, then nearly a decade turning what he saw there into a surreal narrative of the war’s geography, from its napalmed landscape to the craters of a soldier’s mind. Soldiers talked to Herr—told him things they hadn’t said before or maybe even known. “I should have had ‘Born to Listen’ written on my helmet,” he told me in London in 1988. What Herr dared to write about was war’s primal allure: “the death space and the life you found inside it.” That he created this gunmetal narrative with a blend of fact and creative memory was acknowledged from the first; his netherland of “truth” mirrored the dream-like quality of the war and influenced its literature for a decade to come.

Darkness Visible

By William Styron (1990)

Certainly there have been other literary memoirs of personal anguish, but Styron’s brutal account of his cliffwalk with suicidal despair blew the door open on the subject. Depression and alcoholism in writers had too often been viewed through a lens of romantic ruin—the destiny-ridden price of creative genius. “Darkness Visible” put an end to all that. Literary lion, second lieutenant during World War II, Styron was brought to his knees in his own brooding woods. His story hauled plenty of ideas about clinical depression out of the 19th century and into the light of day, where they belonged.

Ms. Caldwell is the author of “Let’s Take the Long Way Home: A Memoir of Friendship.” The former chief book critic of the Boston Globe, she was awarded the 2001 Pulitzer Prize for Distinguished Criticism.

__________

Full article and photo: http://online.wsj.com/article/SB10001424052748703989304575503950688851086.html

What Ahmadinejad Knows

Iran’s president appeals to 9/11 Truthers.

Let’s put a few facts on the table.

• The recent floods in Pakistan are acts neither of God nor of nature. Rather, they are the result of a secret U.S. military project called HAARP, based out of Fairbanks, Alaska, which controls the weather by sending electromagnetic waves into the upper atmosphere. HAARP may also be responsible for the recent spate of tsunamis and earthquakes.

• Not only did the U.S. invade Iraq for its oil, but also to harvest the organs of dead Iraqis, in which it does a thriving trade.

• Faisal Shahzad was not the perpetrator of the May 1 Times Square bombing, notwithstanding his own guilty plea. Rather, the bombing was orchestrated by an American think tank, though its exact identity has yet to be established.

• Oh, and 9/11 was an inside job. Just ask Mahmoud Ahmadinejad.

The U.S. and its European allies were quick to walk out on the Iranian president after he mounted the podium at the U.N. last week to air his three “theories” on the attacks, each a conspiratorial shade of the other. But somebody should give him his due: He is a provocateur with a purpose. Like any expert manipulator, he knew exactly what he was doing when he pushed those most sensitive of buttons.

He knew, for instance, that the Obama administration and its allies are desperate to resume negotiations over Iran’s nuclear programs. What better way to set the diplomatic mood than to spit in their eye when, as he sees it, they are already coming to him on bended knee?

He also knew that the more outrageous his remarks, the more grateful the West would be for whatever crumbs of reasonableness Iran might scatter on the table. This is what foreign ministers are for.

Finally, he knew that the Muslim world would be paying attention to his speech. That’s a world in which his view of 9/11 isn’t on the fringe but in the mainstream. Crackpots the world over—some of whom are reading this column now—want a voice. Ahmadinejad’s speech was a bid to become theirs.

This is the ideological component of Ahmadinejad’s grand strategy: To overcome the limitations imposed on Iran by its culture, geography, religion and sect, he seeks to become the champion of radical anti-Americans everywhere. That’s why so much of his speech last week was devoted to denouncing capitalism, the hardy perennial of the anti-American playbook. But that playbook needs an update, which is where 9/11 “Truth” fits in.

Could it work? Like any politician, Ahmadinejad knows his demographic. The University of Maryland’s World Public Opinion surveys have found that just 2% of Pakistanis believe al Qaeda perpetrated the attacks, whereas 27% believe it was the U.S. government. (Most respondents say they don’t know.)

Among Egyptians, 43% say Israel is the culprit, while another 12% blame the U.S. Just 16% of Egyptians think al Qaeda did it. In Turkey, opinion is evenly split: 39% blame al Qaeda, another 39% blame the U.S. or Israel. Even in Europe, Ahmadinejad has his corner. Fifteen percent of Italians and 23% of Germans finger the U.S. for the attacks.

Deeper than the polling data are the circumstances from which they arise. There’s always the temptation to argue that the problem is lack of education, which on the margins might be true. But the conspiracy theories cited earlier are retailed throughout the Muslim world by its most literate classes, journalists in particular. Irrationalism is not solely, or even mainly, the province of the illiterate.

Nor is it especially persuasive to suggest that the Muslim world needs more abundant proofs of American goodwill: The HAARP fantasy, for example, is being peddled at precisely the moment when Pakistanis are being fed and airlifted to safety by U.S. Marine helicopters operating off the USS Peleliu.

What Ahmadinejad knows is that there will always be a political place for what Michel Foucault called “the sovereign enterprise of Unreason.” This is an enterprise whose domain encompasses the politics of identity, of religious zeal, of race or class or national resentment, of victimization, of cheek and self-assertion. It is the politics that uses conspiracy theory not just because it sells, which it surely does, or because it manipulates and controls, which it does also, but because it offends. It is politics as a revolt against empiricism, logic, utility, pragmatism. It is the proverbial rage against the machine.

Chances are you know people to whom this kind of politics appeals in some way, large or small. They are Ahmadinejad’s constituency. They may be irrational; he isn’t crazy.

Bret Stephens, Wall Street Journal

__________

Full article: http://online.wsj.com/article/SB10001424052748704654004575517632476603268.html

So wrong it’s right

The ‘eggcorn’ has its day

Over the past 10 days, language bloggers have been exchanging virtual high-fives at the news of an honor bestowed on one of their coinages. In its most recent quarterly update, the Oxford English Dictionary Online announced that its word-hoard now includes the shiny new term eggcorn.

An eggcorn, as regular readers of this column may recall, is — well, here’s the official new definition: “an alteration of a word or phrase through the mishearing or reinterpretation of one or more of its elements as a similar-sounding word.” If you write “let’s nip it in the butt” (instead of “bud”) or “to the manor born” (instead of “manner”), you’re using an eggcorn.

The term derives from “egg corn” as a substitution for “acorn,” whose earliest appearance comes in an 1844 letter from an American frontiersman: “I hope you are as harty as you ust to be and that you have plenty of egg corn bread which I can not get her and I hop to help you eat some of it soon.”

Why would eggcorn (as we now spell it) replace acorn in the writer’s lexicon? As the OED editors comment, “acorns are, after all, seeds which are somewhat egg-shaped, and in many dialects the formations acorn and eggcorn sound very similar.” (And, like corn kernels, acorns can be ground into meal or flour.) This coinage came to the attention of the linguists blogging at Language Log in 2003, and at the suggestion of Geoffrey Pullum, one of the site’s founders, it was adopted as the term for all such expressions.

Eggcorns needed their own label, the Language Loggers decided, because they were mistakes of a distinct sort — variants on the traditional phrasing, but ones that still made at least a bit of sense. “Nip it in the bud,” for instance, is a horticultural metaphor, perhaps not so widely understood as it once was; the newer “nip it in the butt” describes a different strategy for getting rid of some unwelcome visitation, but it’s not illogical. Hamlet said he was “to the manner born,” but the modern alteration, “to the manor born,” is also a useful formula.

And because they make sense, eggcorns are interesting in a way that mere disfluencies and malapropisms are not: They show our minds at work on the language, reshaping an opaque phrase into something more plausible. They’re tiny linguistic treasures, pearls of imagination created by clothing an unfamiliar usage in a more recognizable costume.

Even before the eggcorn era, most of us had heard (or experienced) pop-song versions of the phenomenon, like “’Scuse me while I kiss this guy” (for Jimi Hendrix’s “kiss the sky” line), but these have had their own label, mondegreen, for more than half a century. The word was coined in 1954 by Sylvia Wright, in commemoration of her mishearing of a Scottish ballad: “They have slain the Earl o’ Moray/ And laid him on the green,” went the lament, but Wright thought the villains had slain the earl “and Lady Mondegreen.”

Then there are malapropisms, word substitutions that sound similar but make no sense at all. They’re named for Mrs. Malaprop, a character in the 1775 play “The Rivals,” whose childrearing philosophy illustrates her vocabulary problem: “I would by no means wish a daughter of mine to be a progeny of learning….I would have her instructed in geometry, that she might know something of the contagious countries.”

And when the misconceived word or expression has spread so widely that we all use it, it’s a folk etymology — or, to most of us, just another word. Bridegroom, hangnail, Jerusalem artichoke — all started out as mistakes.

But we no longer beat ourselves up because our forebears substituted groom for the Old English guma (“man”), or modified agnail (“painful nail”) into hangnail, or reshaped girasole (“sunflower” in Italian) into the more familiar Jerusalem.

The border between these folk-etymologized words, blessed by history and usage, and the newer eggcorns is fuzzy, and there’s been some debate already at the American Dialect Society’s listserv, ADS-L, about whether the distinction is real. Probably there is no bright line; to me, “you’ve got another thing coming” and “wile away the hours” are eggcorns — recent reshapings of expressions I learned as “another think” and “while away” — but to you they may be normal.

But we face the same problem in deciding which senses are valid for everyday, non-eggcornish words. When does nonplussed for “unfazed” or enormity for “hugeness” become the standard sense? We can only wait and see; the variants may duke it out for decades, but if a change takes hold, the battle will one day be forgotten.

The little eggcorn is in the same situation: It’s struggling to overcome its mixed-up heritage and grow into the kind of respectable adulthood enjoyed by the Jerusalem artichoke. We’re not obliged to help it along, but while it’s here, we might as well enjoy its wacky poetry.

Jan Freeman’s e-mail address is mailtheword@gmail.com; she blogs about language at Throw Grammar from the Train (throwgrammarfromthetrain.blogspot.com).  

__________

Full article and photo: http://www.boston.com/bostonglobe/ideas/articles/2010/09/26/so_wrong_its_right

The Non-Economist’s Economist

John Kenneth Galbraith avoided technical jargon and wrote witty prose—too bad he got so much wrong

The Dow Jones Industrials spent 25 years in the wilderness after the 1929 Crash. Not until 1954 did the disgraced 30-stock average regain its Sept. 3, 1929, high. And then, its penance complete, it soared. In March 1955, the U.S. Senate Banking and Currency Committee, J. William Fulbright of Arkansas, presiding, opened hearings to determine what dangers lurked in this new bull market. Was it 1929 all over again?

One of the witnesses, John Kenneth Galbraith, a 46-year-old Harvard economics professor, seemed especially well-credentialed. His new history of the event that still transfixed America, “The Great Crash, 1929” was on its way to the bookstores and to what would prove to be a commercial triumph. An alumnus of Ontario Agricultural College and the holder of a doctorate in agricultural economics from the University of California at Berkeley, Galbraith had written articles for Fortune magazine and speeches for Adlai Stevenson, the defeated 1952 Democratic presidential candidate. He was a World War II price controller and the author of “American Capitalism: The Concept of Countervailing Power.” When he stepped into a crowded elevator, strangers tried not to stare: he stood 6 feet 8 inches tall.

On the one hand, Galbraith observed, the stock market was not so speculatively charged in 1955 as it had been in 1929. On the other, he insisted, there were worrying signs of excess. Stocks were not so cheap as they had been in the slack and demoralized market of 1953 (though, at 4%, they still outyielded corporate bonds). “The relation of share prices to book value is showing some of the same tendencies as in 1929,” Galbraith went on. “And while it would be a gross exaggeration to say that there has been the same escape from reality that there was in 1929, it does seem to me that enough has happened to indicate that we haven’t yet lost our capacity for speculative self-delusion.”

__________

Reading List: If Not Galbraith, Who?

Maury Klein tells a great story in “Rainbow’s End: The Crash of 1929” (Oxford, 2001), but he also attempts to answer the great question: What went wrong? For the financial specialist in search of a tree-by-tree history of the forest of the Depression, look no further than Barrie A. Wigmore’s “The Crash and Its Aftermath: A History of the Securities Markets in the United States, 1929-33” (Greenwood Press, 1985).

In the quality of certitude, the libertarian Murray Rothbard yielded to no economist. His revisionist history, “America’s Great Depression” (available through the website of the Mises Institute), contends that it was the meddling Hoover administration that turned recession into calamity. Amity Shlaes draws up a persuasive indictment of the New Deal in her “The Forgotten Man” (HarperCollins, 2007).

“Economics and the Public Welfare” by Benjamin Anderson (Liberty Press, 1979) is in strong contention for the lamest title ever fastened by a publisher on a deserving book. Better, the subtitle: “A Financial and Economic History of the United States: 1914-1946.”

“Where Are the Customers’ Yachts? Or A Good Hard Look at Wall Street,” by Fred Schwed Jr. (Simon & Schuster, 1940) is the perfect antidote for any who imagine that the reduced salaries and status of today’s financiers are anything new. Page for page, Schwed’s unassuming survey of the financial field might be the best investment book ever written. Hands-down, it’s the funniest.

An unfunny but essential contribution to the literature of the Federal Reserve is the long-neglected “Theory and Practice of Central Banking” (Harper, 1936) by Henry Parker Willis, the first secretary of the Federal Reserve Board. Willis wrote to protest against the central bank’s reinvention of itself, quite against the intentions of its founders, as a kind of infernal economic planning machine. He should see it now.

Freeman Tilden’s “A World in Debt” (privately printed, 1983) is a quirky, elegant, long out-of-print treatise by a non-economist on an all-too-timely subject. “The world,” wrote Tilden in 1936, “has several times, and perhaps many times, squandered itself into a position where a total deflation of debt was imperative and unavoidable. We may be entering one more such receivership of civilization.”

If the Obama economic program leaves you cold, puzzled or hot under the collar, turn to Hunter Lewis’s “Where Keynes Went Wrong” (Axios Press, 2009) or “The Critics of Keynesian Economics,” edited by Henry Hazlitt (Arlington House, 1977).

—James Grant

__________

Re-reading Galbraith is like watching black-and-white footage of the 1955 World Series. The Brooklyn Dodgers are gone—and so is much of the economy over which Galbraith lavished so much of his eviscerating wit. In 1955, “globalization” was a word yet uncoined. Imports and exports each represented only about 4% of GDP, compared with 16.1% and 12.5%, respectively, today. In 1955, regulation was constricting (this feature of the Eisenhower-era economy seems to be making a reappearance) and unions were powerful. There was a lingering, Depression-era suspicion of business and, especially, of Wall Street. The sleep of corporate managements was yet undisturbed by the threat of a hostile takeover financed with junk bonds.

Half a century ago, the “conventional wisdom,” in Galbraith’s familiar phrase, was statism. In “American Capitalism,” the professor heaped scorn on the CEOs and Chamber of Commerce presidents and Republican statesmen who protested against federal regimentation. “In the United States at this time,” noted the critic Lionel Trilling in 1950, “liberalism is not only the dominant but even the sole intellectual tradition.” William F. Buckley’s upstart conservative magazine, National Review, made its debut in 1955 with the now-famous opening line that it “stands athwart history, yelling Stop.” Galbraith seemed not to have noticed that history and he were arm in arm. His was the conventional wisdom.

Concerning the emphatic Milton Friedman, someone once borrowed the Victorian-era quip, “I wish I was as sure of anything as he is of everything.” Galbraith and the author of “Capitalism and Freedom” were oil and water, but they did share certitude. To Galbraith, “free-market capitalism” was an empty Rotary slogan. It didn’t exist and, in Eisenhower-era America, couldn’t. Industrial oligopolies had rendered it obsolete.

Only in the introductory economics textbooks, he believed, did the free interplay between supply and demand determine price. Fortune 500 companies set their own prices. They chaffered with their vendors and customers, who themselves were big enough to throw their weight around in the market. As a system of decentralized decision-making, there was something to be said for capitalism, Galbraith allowed. As a network of oligopolistic fiefdoms, however, it needed federal direction. The day of Adam Smith’s “invisible hand” was over or ending. “Countervailing power,” in the Galbraith formulation, was the new idea.

Corporate bureaucrats—collectively, the “technostructure”—had pushed aside the entrepreneurs, proposed Galbraith, channeling Thorstein Veblen. While, under the robber baron model, the firm existed to make profits, the modern behemoth exists to perpetuate itself in power while incidentally earning a profit. Planning is what the technostructure does best—it seems to hate surprises. “This planning,” wrote Galbraith, in “The New Industrial State,” “replaces prices that are established by the market with prices that are established by the firm. The firm, in tacit collaboration with the other firms in the industry, has wholly sufficient power to set and maintain minimum prices.” What was to be done? “The market having been abandoned in favor of planning of prices and demand,” he prescribed, “there is no hope that it will supply [the] last missing element of restraint. All that remains is the state.” It was fine with the former price controller of the Office of Price Administration.

As for the stockholder, he or she was as much a cipher as the manipulated consumer. “He (or she) is a passive and functionless figure, remarkable only in his capacity to share, without effort or even without appreciable risk, in the gains from the growth by which the technostructure measures its success,” according to Galbraith. “No grant of feudal privilege has ever equaled, for effortless return, that of the grandparents who bought and endowed his descendants with a thousand shares of General Motors or General Electric or IBM.” Galbraith was writing near the top of the bull market he had failed to anticipate in 1955. Shareholders were about to re-learn (if they had forgotten) the lessons of “risk.”

In its way, “The New Industrial State” was as mistimed as “The Great Crash.” In 1968, a year after the appearance of the first edition, the planning wheels started to turn at Leasco Data Processing Corp., Great Neck, N.Y. But Leasco’s “planning” took the distinctly un-Galbraithian turn of an unsolicited bid for control of the blue-blooded Chemical Bank of New York. Here was something new under the sun. Saul Steinberg, would-be revolutionary at the head of Leasco, ultimately surrendered before the massed opposition of the New York banking community. (“I always knew there was an Establishment,” Mr. Steinberg mused—“I just used to think I was a part of it.”) But the important thing was the example Mr. Steinberg had set by trying. The barbarians were beginning to form at the corporate gates.

The cosseted, self-perpetuating corporate bureaucracy that Galbraith described in “The New Industrial State” was in for a rude awakening. Deregulation became a Washington watchword under President Carter, capitalism got back its good name under President Reagan and trade barriers fell under President Clinton. Presently came the junk-bond revolution and the growth of an American market for corporate control. Hedge funds and private equity funds prowled for under- and mismanaged public companies to take over, resuscitate and—to be sure, all too often—to overload with debt. The collapse of communism and the rise of digital technology opened up vast new fields of competitive enterprise. Hundreds of millions of eager new hands joined the world labor force, putting downward pressure on costs, prices and profit margins. Wal-Mart delivered everyday low, and lower, prices, and MCI knocked AT&T off its monopolistic pedestal. The technostructure must have been astounded.

Here are the opening lines of “American Capitalism”: “It is told that such are the aerodynamics and wing-loading of the bumblebee that, in principle, it cannot fly. It does, and the knowledge that it defied the august authority of Isaac Newton and Orville Wright must keep the bee in constant fear of a crack-up.” You keep reading because of the promise of more in the same delightful vein. And, indeed, there is much more, including a charming annotated chronology of Galbraith’s life by his son and the editor of this volume, James K. Galbraith.

John F. Kennedy’s ambassador to India, muse to the Democratic left, two-time recipient of the Presidential Medal of Freedom, celebrity author, Galbraith in life was even larger than his towering height. His “A Theory of Price Control,” which was published in 1952 to favorable reviews but infinitesimal sales, was his one and only contribution to the purely professional economics literature. Thereafter this most acerbic critic of free markets prospered by giving the market what it wanted.

Now comes the test of whether his popular writings will endure longer than the memory of his celebrity and the pleasure of his prose. “The Great Crash” has a fighting chance, because of its very lack of analytical pretense. “History that reads like a poem,” raved Mark Van Doren in his review of the 1929 book. Or, he might have judged, that eats like whipped cream.

But the other books in this volume seem destined for only that kind of immortality conferred on amusing period pieces. When, for example, Galbraith complains in “The Affluent Society” that governments can’t borrow enough, or that the Federal Reserve is powerless to resist inflation, you wonder what country he was writing about, or even what planet he was living on.

Not that the professor refused to learn. In the first edition of “The New Industrial State,” for instance, he writes confidently: “While there may be difficulties, and interim failures or retreats are possible and indeed probable, a system of wage and price restraint is inevitable in the industrial system.” A decade or so later, in the edition selected for this volume, that sentence is gone. In its place is another not quite so confident: “The history of controls, in some form or other and by some nomenclature, is still incomplete.”

At the 1955 stock-market hearings, Galbraith was followed at the witness table by the aging speculator and “adviser to presidents” Bernard M. Baruch. The committee wanted to know what the Wall Street legend thought of the learned economist. “I know nothing about him to his detriment,” Baruch replied. “I think economists as a rule—and it is not personal to him—take for granted they know a lot of things. If they really knew so much, they would have all of the money, and we would have none.”

Mr. Grant, the editor of Grant’s Interest Rate Observer, is the author, most recently, of “Mr. Market Miscalculates” (Axios, 2009).

__________

Full article and photos: http://online.wsj.com/article/SB10001424052748703556604575501883282762648.html

Uncommon knowledge

A surprise benefit of minimum wage

The minimum wage has been politically controversial for most of the last century, even though it affects a marginal share of the labor force and evidence of significant job loss is inconclusive. Now one economist would like us to consider another effect of the minimum wage: finishing high school. By curtailing low-wage/low-skill jobs, the minimum wage motivates young people to stay in school and become skilled. This effect then generates what the author calls an “educational cascade” by setting an example for the upcoming class of students. He estimates that the average male born in 1951 gained 0.2 years — and the average male born in 1986 gained 0.7 years — of high school due to the cumulative effect of the minimum wage.

Sutch, R., “The Unexpected Long-Run Impact of the Minimum Wage: An Educational Cascade,” National Bureau of Economic Research (September 2010).

Bearing false witness

False confessions and false eyewitness testimony are never-ending challenges for the judicial process. Although coercive interrogation is blamed in many of these situations, new research illustrates just how little coercion is needed. In an experiment, people played a quiz game for money. Later, they were told that the person who had sat next to them during the game was suspected of cheating. They were shown a 15-second video clip of the person sitting next to them cheating, even though the video clip was doctored and no cheating actually happened. They were asked to sign a witness statement against the cheater, but they were explicitly told not to sign if they hadn’t directly witnessed the cheating, aside from seeing it in the video. Nevertheless, almost half of those who saw the video signed the statement. Some of those who signed the statement even volunteered additional incriminating information.

Wade, K. et al., “Can Fabricated Evidence Induce False Eyewitness Testimony?” Applied Cognitive Psychology (October 2010).

The cure for sadness: pain

For most people, pain is not fun. However, a recent study finds that, when you’re not having fun, pain can help. Several hundred people were tested to see how much pain — in the form of increasing pressure or heat applied to their hands — they could tolerate. Not surprisingly, people reported being less happy after the experiment. But less happy is not necessarily the same as more unhappy. Indeed, negative emotions were also attenuated after the experiment, especially for women and people with more sensitive emotions. In other words, physical pain helped dull emotional pain.

Bresin, K. et al., “No Pain, No Change: Reductions in Prior Negative Affect following Physical Pain,” Motivation and Emotion (September 2010).

That reminds me of…me!

In a series of experiments, researchers have transformed Descartes’s famous phrase (“I think, therefore I am”) into something like this: “I am reminded of myself, therefore I will think.” People presented with a resume or product paid more attention to it if it happened to have a name similar to their own. As a result of this increased attention, a high-quality resume or product got a boost, while a low-quality resume or product was further handicapped. However, in a strange twist, people who sat in front of a mirror while evaluating a product exhibited the opposite effect: Quality didn’t matter for a product with a similar name but did matter otherwise. The authors speculate that too much self-referential thinking overloads one’s ability to think objectively.

Howard, D. & Kerin, R., “The Effects of Name Similarity on Message Processing and Persuasion,” Journal of Experimental Social Psychology (forthcoming).

Defensive sleeping

The odds that you’ll need to fend off an attacker entering your bedroom at night are pretty small. Yet, according to a recent study, our evolutionary heritage — formed when we had to survive sleeping outdoors — instills a strong preference for bedrooms designed less by the principles of Architectural Digest than by those of “Home Alone” or “Panic Room.” When shown a floor plan for a simple rectangular bedroom and asked to arrange the furniture, most people positioned the bed so that it faced the door. They also positioned the bed on the side of the room that the door would conceal as it opened, and as far back from the door as possible, a position that would seem to give the occupant the most time to respond. If the floor plan included a window on the opposite side of the room from the door, people were inclined to move the bed away from the window, too.

Spörrle, M. & Stich, J., “Sleeping in Safe Places: An Experimental Investigation of Human Sleeping Place Preferences from an Evolutionary Perspective,” Evolutionary Psychology (August 2010).

Kevin Lewis is an Ideas columnist.

__________

Full article and photo: http://www.boston.com/bostonglobe/ideas/articles/2010/09/26/a_surprise_benefit_of_minimum_wage/

‘Busted’

Randy Britton e-mails: “I’ve noticed in much of the coverage of the BP oil spill that the press has taken to calling the oil well ‘busted.’ Since when is ‘busted’ the proper way to describe a broken oil well?  It seems very colloquial and not a form I would expect to see in proper journalistic forums.”

Even now that BP’s troubled oil well in the Gulf of Mexico is being permanently sealed, news reports continue to refer to the “busted well,” particularly wire services like the Associated Press and AFP. Reuters was an early adopter, reporting on efforts to contain the “busted well” on May 3. Alternatively, busted has modified oil rig, or just plain rig. A database search of coverage of the BP spill finds the first recorded use of busted came nine days into the crisis on April 29, when the MSNBC host Ed Schultz said, “The busted rig is leaking — get this — 200,000 gallons of oil a day.”

Is busted overly informal for journalists? The verb bust certainly has colloquial roots, beginning its life on the American scene as a folksy variant of burst. (The same dropping of the “r” turned curse into cuss, horse into hoss and parcel into passel.) Building on earlier use as a noun, bust busted out as a verb as early as 1806, when Meriwether Lewis, while on his famous expedition with William Clark, wrote in his journal, “Windsor busted his rifle near the muzzle.” Since then, bust has worked its way into a wide variety of American expressions.

“Bust runs the gamut from slang to standard,” explain David K. Barnhart and Allan A. Metcalf in their book “America in So Many Words.” “When it is used to mean ‘to explode or fall apart or be arrested,’ bust is generally slang. In the sense of failing (especially financially) it is informal, as busting the bank in gambling lingo, while in the specialized sense of taming a horse it is standard, the only way to say busting a bronco.”

Despite its potential slanginess, busted is “not actually forbidden” in the news media, as the Boston Globe language columnist Jan Freeman wrote in August. Indeed, reporters often latch onto the occasional colloquialism that seems particularly expressive, and in this case, Freeman surmises they were drawn to the term’s “criminal-cowboy-macho connotations.”

Regardless of the reasons for its current vogue, it’s notable that busted was rarely relied on by the press to describe stricken oil wells before the BP disaster — even in incidents that were highly similar, such as the 1979 blowout of the Ixtoc I well in the Gulf of Mexico. Most of the precursors I found come from more literary sources. It was appropriate, for instance, in some light verse by J.W. Foley published in The New York Times in 1904:

Dear friend, there’s a question I’d like to ask you,
(Your pardon I crave if it vexes)
Have you ever invested a hundred or two
In an oil well somewhere down in Texas?
Have you ridden in autos (I mean in your mind),
With the profits you honestly trusted
Would flow from your venture in oil stocks — to find
That the oil well was hopelessly busted?

I can’t find fault with reporters drawing on the rich history of bust and busted in American English to add a little extra oomph to their dispatches from the Gulf. Calling the well busted does evoke a looser, wilder state of disrepair than broken, or the more technically accurate blown-out. But after many months of news coverage, the phrase “busted well” has now turned into little more than a cliché. That’s a far worse journalistic offense than a bit of well-placed slang.

Ben Zimmer will answer one reader question every other week.

__________

Full article: http://www.nytimes.com/2010/09/26/magazine/26onlanguage.html

Unpacking Imagination

In an age of childhood obesity and children tethered to electronic consoles, playgrounds have rarely been more important. In an age of constrained government budgets, playgrounds have rarely been a harder sell. Fortunately, the cost of play doesn’t have to be prohibitive. In creating the Imagination Playground in Lower Manhattan — a playground with lots of loose parts for children to create their own play spaces — we realized that many of the elements with the greatest value to children were inexpensive and portable. Although traditional playgrounds can easily cost in the millions to build, boxed imagination playgrounds can be put together for under $10,000. (Land costs not included!) The design below is one that my architecture firm has done in collaboration with the New York City Parks Department and KaBoom, a nonprofit organization. But it needn’t be the only one out there. There are a lot of ways to build a playground — and a lot of communities in need of one. Let a thousand portable playgrounds bloom.

David Rockwell, New York Times

__________

Full article and photo: http://www.nytimes.com/interactive/2010/09/25/opinion/20100925_opchart.html