Monday, July 31, 2006

In vino somnium

Simon Blackburn in the Financial Times:

Postmodernism emerged out of the idea that we see things through filters coloured by hidden dark forces of class, gender, culture or language. From this it travelled to the ironic, detached attitude of treating truth as no more than a narrative, facts as indefinitely elastic, and the world as a text open to multiple interpretations. It held that our cherished standards of reason were just a contingent historical deposit, that some kind of relativism was intellectually forced on us, or that truth itself was only a social construction.

I write in the past tense, since many argue that the events of September 11 2001 demolished these ideas. Those events reminded people that there are beliefs we need to affirm. We need truth, reason and objectivity - and we need them to be on our side. The academy has become less friendly to relaxed pluralisms. Whereas in the 1980s and 1990s a playful attitude that anything goes seized the weaker parts of the humanities, now a stern rectitude calls the tune. Ideas often reflect politics - and politically, postmodernism can be seen as a reflection of postcolonial guilt, meaning not only that we were wrong to force western rationality or western science down other people’s throats, but that their rationality or their science was every bit as good as ours. If that is right, then the current backlash may be the philosopher’s version of neo-conservatism, parallel to current historians’ tasteless celebrations of the virtues of empire.


Postmodernism is often billed as attacking truth and science. This is how it is presented in the valuable little book Why Truth Matters, by the editors of the sceptical website, Ophelia Benson and Jeremy Stangroom. They mount a spirited counterattack, reminding us - in the way that Cambridge philosopher GE Moore was famous for doing - that if it comes to a battle for hearts and minds, basic convictions of common sense and science beat philosophical subtleties hands down. … Benson and Stangroom reveal a rogues’ gallery of social constructivists, who look at how individuals and groups participate in the creation of their own perceived reality. These “rogues” include the feminist Sandra Harding and the neo-pragmatist Richard Rorty, but the doyen must surely be the French philosopher of science Bruno Latour. Latour’s confusion of words and things led him to the precipice of denying that there could have been dinosaurs before the term was invented. Presumably a similar argument would show that nobody before Crick and Watson had DNA.

I have noticed that if the topic of ‘philosophy’ comes up in conversation, particularly in slightly inebriated conversation, a surprisingly large number of people are keen to leap in with their own personal Big Theory.

Depressingly often, this amounts to a statement of absurd truth relativism. It is usually illustrated by an uttered ‘proof’ such as “But when everybody believed that the Earth was flat, then it was the truth that the Earth was flat”, while a knowing smile and raised eyebrow indicate the profundity of the insight.

Sunday, July 30, 2006

The Singularity is Near

So Big and Healthy Grandpa Wouldn’t Even Know You

New York Times
July 30, 2006

Valentin Keller enlisted in an all-German unit of the Union Army in Hamilton, Ohio, in 1862. He was 26, a small, slender man, 5 feet 4 inches tall, who had just become a naturalized citizen. He listed his occupation as tailor.

A year later, Keller was honorably discharged, sick and broken. He had a lung ailment and was so crippled from arthritis in his hips that he could barely walk.

His pension record tells of his suffering. “His rheumatism is so that he is unable to walk without the aid of crutches and then only with great pain,” it says. His lungs and his joints never got better, and Keller never worked again.

He died at age 41. [...]

People of Valentin Keller’s era, like those before and after them, expected to develop chronic diseases by their 40’s or 50’s. Keller’s descendants had lung problems, they had heart problems, they had liver problems. They died in their 50’s or 60’s.

Now, though, life has changed. The family’s baby boomers are reaching middle age and beyond and are doing fine.

“I feel good,” says Keller’s great-great-great-grandson Craig Keller. At 45, Mr. Keller says he has no health problems, nor does his 45-year-old wife, Sandy.

The Keller family illustrates what may prove to be one of the most striking shifts in human existence — a change from small, relatively weak and sickly people to humans who are so big and robust that their ancestors seem almost unrecognizable.

New research from around the world has begun to reveal a picture of humans today that is so different from what it was in the past that scientists say they are startled. Over the past 100 years, says one researcher, Robert W. Fogel of the University of Chicago, humans in the industrialized world have [changed in a way] "that is unique not only to humankind, but unique among the 7,000 or so generations of humans who have ever inhabited the earth.” [...]

The biggest surprise emerging from the new studies is that many chronic ailments like heart disease, lung disease and arthritis are occurring an average of 10 to 25 years later than they used to. There is also less disability among older people today, according to a federal study that directly measures it. And that is not just because medical treatments like cataract surgery keep people functioning. Human bodies are simply not breaking down the way they did before.

Even the human mind seems improved. The average I.Q. has been increasing for decades, and at least one study found that a person’s chances of having dementia in old age appeared to have fallen in recent years.

The proposed reasons are as unexpected as the changes themselves. Improved medical care is only part of the explanation; studies suggest that the effects seem to have been set in motion by events early in life, even in the womb, that show up in middle and old age.

“What happens before the age of 2 has a permanent, lasting effect on your health, and that includes aging,” said Dr. David J. P. Barker, a professor of medicine at Oregon Health and Science University in Portland and a professor of epidemiology at the University of Southampton in England.

Each event can touch off others. Less cardiovascular disease, for example, can mean less dementia in old age. The reason is that cardiovascular disease can precipitate mini-strokes, which can cause dementia. Cardiovascular disease is also a suspected risk factor for Alzheimer’s disease.

The effects are not just in the United States. Large and careful studies from Finland, Britain, France, Sweden and the Netherlands all confirm that the same things have happened there; they are also beginning to show up in the underdeveloped world. [...]

In 1900, 13 percent of people who were 65 could expect to see 85. Now, nearly half of 65-year-olds can expect to live that long.

People even look different today. American men, for example, are nearly 3 inches taller than they were 100 years ago and about 50 pounds heavier. [...]

Today’s middle-aged people are the first generation to grow up with childhood vaccines and with antibiotics. Early life for them was much better than it was for their parents, whose early life, in turn, was much better than it was for their parents.

And if good health and nutrition early in life are major factors in determining health in middle and old age, that bodes well for middle-aged people today. Investigators predict that they may live longer and with less pain and misery than any previous generation.

“Will old age for today’s baby boomers be anything like the old age we think we know?” Dr. Barker asked. “The answer is no.” [...]

Scientists used to say that the reason people are living so long these days is that medicine is keeping them alive, though debilitated. But studies like one Dr. Fogel directs, using records of Union Army veterans, have led many to rethink that notion.

The study involves a random sample of about 50,000 Union Army veterans. Dr. Fogel compared those men, the first generation to reach age 65 in the 20th century, with people born more recently.

The researchers focused on common diseases that are diagnosed in pretty much the same way now as they were in the last century. So they looked at ailments like arthritis, back pain and various kinds of heart disease that can be detected by listening to the heart.

The first surprise was just how sick people were, and for how long.

Instead of inferring health from causes of death on death certificates, Dr. Fogel and his colleagues looked at health throughout life. They used the daily military history of each regiment in which each veteran served, which showed who was sick and for how long; census manuscripts; public health records; pension records; doctors’ certificates showing the results of periodic examinations of the pensioners; and death certificates.

They discovered that almost everyone of the Civil War generation was plagued by life-sapping illnesses, suffering for decades. And these were not some unusual subset of American men — 65 percent of the male population ages 18 to 25 signed up to serve in the Union Army. “They presumably thought they were fit enough to serve,” Dr. Fogel said.

Even teenagers were ill. Eighty percent of the male population ages 16 to 19 tried to sign up for the Union Army in 1861, but one out of six was rejected because he was deemed disabled.

And the Union Army was not very picky. “Incontinence of urine alone is not grounds for dismissal,” said Dora Costa, an M.I.T. economist who works with Dr. Fogel, quoting from the regulations. A man who was blind in his right eye was disqualified from serving because that was his musket eye. But, Dr. Costa said, “blindness in the left eye was O.K.”

After the war ended, as the veterans entered middle age, they were rarely spared chronic ailments.

“In the pension records there were descriptions of hernias as big as grapefruits,” Dr. Costa said. “They were held in by a truss. These guys were continuing to work although they clearly were in a lot of pain. They just had to cope.”

Eighty percent had heart disease by the time they were 60, compared with less than 50 percent today. By ages 65 to 74, 55 percent of the Union Army veterans had back problems. The comparable figure today is 35 percent.

The steadily improving health of recent generations shows up in population after population and country after country. But these findings raise a fundamental question, Dr. Costa said.

“The question is, O.K., there are these differences, and yes, they are big. But why?” she said.

“That’s the million-dollar question,” said David M. Cutler, a health economist at Harvard. “Maybe it’s the trillion-dollar question. And there is not a received answer that everybody agrees with.” [...]

Men living in the Civil War era had an average height of 5-foot-7 and weighed an average of 147 pounds. That translates into a body mass index of 23, well within the range deemed “normal.” Today, men average 5-foot-9½ and weigh an average of 191 pounds, giving them an average body mass index of 28.2, overweight and edging toward obesity.
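As a rough check of the figures quoted above: BMI is weight divided by height squared, which in imperial units works out to 703 × pounds / inches². A minimal sketch, using the average heights and weights from the article:

```python
def bmi(pounds: float, inches: float) -> float:
    """Body mass index from imperial units: 703 * lb / in^2."""
    return 703.0 * pounds / inches ** 2

# Civil War-era average: 5-foot-7 (67 in), 147 pounds
print(round(bmi(147, 67), 1))    # ~23.0 -- "normal" (18.5-24.9)

# Modern average: 5-foot-9.5 (69.5 in), 191 pounds
print(round(bmi(191, 69.5), 1))  # ~27.8 -- "overweight" (25-29.9)
```

The first figure matches the article's 23 exactly; the quoted 28.2 for modern men corresponds to a height of 5-foot-9 even, so the extra half inch shaves a few tenths off. Either way, the qualitative point stands: the modern average sits well inside the overweight band.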

Those changes, along with the great improvements in general health and life expectancy in recent years, intrigued Dr. Costa. Common chronic diseases — respiratory problems, valvular heart disease, arteriosclerosis, and joint and back problems — have been declining by about 0.7 percent a year since the turn of the 20th century. And when they do occur, they emerge at older ages and are less severe.

The reasons, she and others are finding, seem to have a lot to do with conditions early in life. Poor nutrition in early years is associated with short stature and lifelong ill health, and until recently, food was expensive in the United States and Europe.

Dr. Fogel and Dr. Costa looked at data on height and body mass index among Union Army veterans who were 65 and older in 1910 and veterans of World War II who were that age in the 1980’s. Their data relating size to health led them to a prediction: the World War II veterans should have had 35 percent less chronic disease than the Union Army veterans. That, they said, is exactly what happened.

They also found that diseases early in life left people predisposed to chronic illnesses when they grew older.

“Suppose you were a survivor of typhoid or tuberculosis,” Dr. Fogel said. “What would that do to aging?” It turned out, he said, that the number of chronic illnesses at age 50 was much higher in that group. “Something is being undermined,” he said. “Even the cancer rates were higher. Ye gods. We never would have suspected that.”

Men who had respiratory infections or measles tended to develop chronic lung disease decades later. Malaria often led to arthritis. Men who survived rheumatic fever later developed diseased heart valves.

And stressful occupations added to the burden on the body.

People would work until they died or were so disabled that they could not continue, Dr. Fogel said. “In 1890, nearly everyone died on the job, and if they lived long enough not to die on the job, the average age of retirement was 85,” he said. Now the average age is 62.

A century ago, most people were farmers, laborers or artisans who were exposed constantly to dust and fumes, Dr. Costa said. “I think there is just this long-term scarring.”

Dr. Barker of Oregon Health and Science University is intrigued by the puzzle of who gets what illness, and when.

“Why do some people get heart disease and strokes and others don’t?” he said. [...]

Animal studies and data that he and others have been gathering have convinced him that health in middle age can be determined in fetal life and in the first two years after birth.

His work has been controversial. Some say that other factors, like poverty, may really be responsible. But Dr. Barker has also won over many scientists.

In one study, he examined health records of 8,760 people born in Helsinki from 1933 to 1944. Those whose birth weight was below about six and a half pounds and who were thin for the first two years of life, with a body mass index of 17 or less, had more heart disease as adults.

Another study, of 15,000 Swedish men and women born from 1915 to 1929, found the same thing. So did a study of babies born to women who were pregnant during the Dutch famine, known as the Hunger Winter, in World War II.

That famine lasted from November 1944 until May 1945. Women were eating as little as 400 to 800 calories a day, and a sixth of their babies died before birth or shortly afterward. But those who survived seemed fine, says Tessa J. Roseboom, an epidemiologist at the University of Amsterdam, who studied 2,254 people born at one Dutch hospital before, during and after the famine. Even their birth weights were normal.

But now those babies are reaching late middle age, and they are starting to get chronic diseases at a much higher rate than normal, Dr. Roseboom is finding. Their heart disease rate is almost triple that of people born before or after the famine. They have more diabetes. They have more kidney disease.

That is no surprise, Dr. Barker says. Much of the body is complete before birth, he explains, so a baby born to a pregnant woman who is starved or ill may start life with a predisposition to diseases that do not emerge until middle age.

The middle-aged people born during the famine also say they just do not feel well. Twice as many rated their health as poor, 10 percent compared with 5 percent of those born before or after the famine.

“We asked them whether they felt healthy,” Dr. Roseboom said. “The answer to that tends to be highly predictive of future mortality.”

But not everyone was convinced by what has come to be known as the Barker hypothesis, the idea that events very early in life affect health and well-being in middle and old age. One who looked askance was Douglas V. Almond, an economist at Columbia University.

Dr. Almond had a problem with the studies. They were not of randomly selected populations, he said, making it hard to know if other factors had contributed to the health effects. He wanted to see a rigorous test — a sickness or a deprivation that affected everyone, rich and poor, educated and not, and then went away. Then he realized there had been such an event: the 1918 flu.

The flu pandemic arrived in the United States in October 1918 and was gone by January 1919, afflicting a third of the pregnant women in the United States. What happened to their children? Dr. Almond asked.

He compared two populations: those whose mothers were pregnant during the flu epidemic and those whose mothers were pregnant shortly before or shortly after the epidemic.

To his astonishment, Dr. Almond found that the children of women who were pregnant during the influenza epidemic had more illness, especially diabetes, for which the incidence was 20 percent higher by age 61. They also got less education — they were 15 percent less likely to graduate from high school. The men’s incomes were 5 percent to 7 percent lower, and the families were more likely to receive public assistance.

The effects, Dr. Almond said, occurred in whites and nonwhites, in rich and poor, in men and women. He convinced himself, he said, that there was something to the Barker hypothesis.

Living Large and Healthy, but How Long Can It Go On?

New York Times
July 30, 2006

Longer life. Less disease. Less disability. The trends have continued for more than a century as humans have become bigger, stronger and healthier. But can they — will they — keep going? Or is there some countertrend, obesity or an overuse of medications, perhaps, that will turn the statistics around?

The questions are serious, but, researchers say, for now there are no easy answers, only lessons in humility as, over and over again in recent years, scientists have seen their best predictions overthrown.

Life expectancy, for example, has been a real surprise, says Eileen M. Crimmins, a professor of gerontology and demographic research at the University of Southern California. “When I came of age as a professional, 25 years ago, basically the idea was three score years and 10 is what you get,” Dr. Crimmins said. Life span was “this rock, and you can’t touch it.”

“But,” she added, “then we started noticing that in fact mortality is plummeting.” Will it continue much longer?

“It is an extremely controversial area, and the answer is, We don’t know,” said Dr. Richard J. Hodes, director of the National Institute on Aging.

Some worry, for example, that today’s fat children will grow up to be tomorrow’s heart disease and diabetes patients, destroying the nation’s gains in health and well-being.

“It is very legitimate to be concerned about levels of overweight and obesity in kids,” said David Williamson, a senior biomedical research scientist at the Centers for Disease Control and Prevention. “But at the same time, those levels of obesity are overlaid on improvements in health in children, which also affect long-term health and longevity.”

The mixed picture has led to disparate views about what is likely to occur.

S. Jay Olshansky, a professor of epidemiology and biostatistics at the University of Illinois at Chicago, predicted in The New England Journal of Medicine that obesity would lead to so much diabetes and heart disease that life expectancy would “level off or even decline within the first half of this century.”

Dr. Olshansky was countered by Samuel H. Preston, a professor of demography at the University of Pennsylvania. Dr. Preston cited the population’s overall better health, from childhood on, and said that obesity had already been factored into national projections of life spans and that the projections were that life spans would continue to increase. [...]

The problem for now, Dr. Williamson says, is that there is so much concern over obesity that other factors may be ignored.

He tells of a recent episode that illustrates his point, when he went with some Italian colleagues to see a photography exhibit. “We were looking at pictures of Pennsylvania coal miners in the late 1800’s and early 1900’s,” Dr. Williamson said. “A lot of these people were kids.”

“The Italians said to me, ‘Oh, look. These kids were so thin.’ ”

“I said, ‘Well, hell yes they were thin. But were they healthy?’ ” It is likely that they were poor, malnourished and sick from the coal dust, he added. No wonder they were thin.

“It really got me thinking about, gosh, have we gotten so out of touch?” Dr. Williamson said. Obesity, he said, is a very legitimate concern, but it is not the only health risk. And being thin does not necessarily equate with being healthy.

Friday, July 28, 2006

Good Luck

Scientists Hope to Unravel Neanderthal DNA

July 21, 2006

Researchers in Germany said Thursday that they planned to collaborate with an American company in an effort to reconstruct the genome of Neanderthals, the archaic human species that occupied Europe from 300,000 years ago to 30,000 years ago, until being displaced by modern humans.

Long a forlorn hope, the sequencing, or decoding, of Neanderthal DNA suddenly seems possible because of a combination of analytic work on ancient DNA by Svante Paabo, of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and a new method of DNA sequencing developed by a Connecticut company, 454 Life Sciences.

The initial genome to be decoded comes from 45,000-year-old Neanderthal bones found in Croatia, though bones from other sites may be analyzed later. Because the genome must be kept in constant repair and starts to break up immediately after the death of the cell, the material surviving in Neanderthal bones exists in tiny fragments 100 or so DNA units in length. As it happens, this is just the length that works best with the 454 machine, which is also able to decode vast amounts of DNA at low cost.

Recovery of the Neanderthal genome, in whole or in part, would be invaluable for reconstructing many events in human prehistory and evolution. It would help address such questions as whether Neanderthals and humans interbred, whether the archaic humans had an articulate form of language, how the Neanderthal brain was constructed, if they had light or dark skin, and the total size of the Neanderthal population. [...]

95 percent of [DNA found] in the Neanderthal bones belongs to ancient bacteria, said Michael Egholm, a vice president of 454 Life Sciences. But bacterial sequences can be recognized and discarded, he said.

Because Neanderthal DNA is so scarce, Dr. Paabo and the 454 Life Science researchers developed their methods on ancient DNA from cave bears and mammoths. [...]

The first goal of the project will be to sequence three billion units of Neanderthal DNA, corresponding to the full length of the Neanderthal genome. This will require decoding 20 times as much DNA, because so much of the DNA in the Neanderthal bones belongs to bacteria.
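The 20-fold figure follows directly from the 95 percent bacterial contamination mentioned earlier: if only 5 percent of each sequencing run is Neanderthal, the raw throughput needed is the target divided by that 5 percent. A quick back-of-the-envelope check:

```python
target_bases = 3_000_000_000   # Neanderthal bases wanted (one-fold coverage)
bacterial_fraction = 0.95      # share of extracted DNA that is bacterial, per the article

# Only the non-bacterial fraction of each run is usable, so total raw
# sequencing needed is target / (1 - bacterial_fraction).
raw_bases_needed = target_bases / (1 - bacterial_fraction)

print(round(raw_bases_needed / target_bases, 1))  # 20.0 -- the "20 times" quoted above
```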

Genomes usually must be decoded several times over to get a complete and accurate sequence, but the first three billion bases of Neanderthal should “hit all the essential differences,” Dr. Egholm said. [...]

One of the most important results that researchers are hoping for is to discover, from a three-way comparison of chimp, human and Neanderthal DNA, which genes have made humans human. The chimp and human genomes differ at just 1 percent of the sites on their DNA. At this 1 percent, Neanderthals resemble humans at 96 percent of the sites, to judge from the preliminary work, and chimps at 4 percent. Analysis of these DNA sites, at which humans differ from the two other species, will help understand the evolution of specifically human traits “and perhaps even aspects of cognitive function,” Dr. Paabo said.
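A back-of-the-envelope reading of those percentages, using the roughly three-billion-base genome length mentioned earlier in the article (integer arithmetic keeps the round numbers exact):

```python
GENOME_SITES = 3_000_000_000  # approximate genome length, from the article

# Chimp and human genomes differ at about 1 percent of sites.
divergent_sites = GENOME_SITES // 100

# At those sites, the preliminary work suggests Neanderthals carry the
# human base 96 percent of the time and the chimp base 4 percent.
neanderthal_matches_human = divergent_sites * 96 // 100
neanderthal_matches_chimp = divergent_sites * 4 // 100

print(f"{divergent_sites:,}")            # 30,000,000 human-chimp differences
print(f"{neanderthal_matches_chimp:,}")  # 1,200,000 sites where humans differ from both
```

That last number, on the order of a million sites where humans differ from both chimps and Neanderthals, is the pool of candidates for recent, specifically human changes that the article describes.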

The degree of resemblance between humans and Neanderthals is fiercely debated by archaeologists, and even issues like whether Neanderthals had language have not been resolved. Dr. Paabo believes that genetic analysis is the best hope of doing so. He has paid particular attention to a gene known as FOXP2, which from its mutated forms in people seems to be involved in several advanced aspects of language.

A longstanding dispute among archaeologists is whether the modern humans who first entered Europe 45,000 years ago, ultimately from Africa, interbred with the Neanderthals or forced them into extinction. [...]

Evidence from the human genome suggests some interbreeding with an archaic species, [says Bruce Lahn, a geneticist at the University of Chicago], which could have been Neanderthals or other early humans. [...]

Dr. Stephen O’Brien, a geneticist at the National Cancer Institute, said that having the Neanderthal genome would be “a very exciting prospect” because it would serve as a reference point for deciding which genes had been selected for in recent human evolution.

The chimpanzee, with which humans shared an ancestor who lived some five million years ago, is one such reference point but the Neanderthals, who split from the modern human lineage some 500,000 years ago, would provide a much more helpful signpost to recent evolutionary events, Dr. O’Brien said, like adaptations as modern humans dispersed from their African homeland and the genetic differences between the three major human ethnic groups of Africans, Asians and Europeans. [...]

If Dr. Paabo and 454 Life Sciences should succeed in reconstructing the entire Neanderthal genome, it might in theory be possible to bring the species back from extinction by inserting the Neanderthal genome into a human egg and having volunteers bear Neanderthal infants. This might be the best possible way of finding out what each Neanderthal gene does, but there would be daunting ethical problems in bringing a Neanderthal child into the world again.

Dr. Paabo said that he could not even imagine how such a project could be accomplished and that in any case ethical concerns “would totally preclude such an experiment.”


I'm quite curious about what happened to the Neanderthals. Hopefully it was a "make love, not war" scenario, but based on human history, I don't think that's likely.

Wednesday, July 26, 2006


We are defending our sovereignty
Ali Fayyad, Hizbullah leadership
Tuesday July 25, 2006
The Guardian

For nearly two weeks Israel has been waging a war of terror and aggression against Lebanon. Its stated justification is the capture by the Islamic Resistance (Hizbullah) of two Israeli soldiers with the aim of exchanging them for Lebanese prisoners. The war has already resulted in the killing of around 400 and wounding of more than 1,000 Lebanese. Most are civilians (a third children), crushed in their homes or ripped to pieces in their cars by Israeli bombs and missiles.

In reality, the Israeli escalation is less about the two soldiers and more about its determination to disarm the Lebanese resistance.

[…]

In the context of the continued occupation, detention of prisoners and repeated Israeli attacks and incursions into Lebanese territory, the capture of the Israeli soldiers was entirely legitimate. The operation was fully in line with the Lebanese ministerial declaration, supported in parliament, that stressed the right of the resistance to liberate occupied Lebanese territory, free prisoners of war and defend Lebanon against Israeli aggression. International law also allows peoples and states to take action to protect their citizens and territory. The Israeli onslaught is aimed not only at liquidating the resistance and destroying the country's infrastructure but at intervening in Lebanese politics and imposing conditions on what can be agreed.

There is now a clear national consensus on the need to maintain the military power necessary to prevent Lebanon from being subjugated by Israel's war machine. Popular resistance is a way of redressing the huge imbalance of power, defending Lebanon's sovereignty and preventing Israel from intervening in Lebanese internal affairs, as has happened repeatedly since 1948. It is also - as has been the case in the prisoner-capture operation - dictated by an entirely local agenda, rather than reflecting any Syrian or Iranian policy.

The aggression against Lebanon, which has primarily targeted civilians and failed to achieve any tangible military objectives, is part of a continuing attempt to impose Israeli hegemony on the area and prevent the emergence of a regional system that might guarantee stability, self-determination, freedom and democracy.

Hizbullah has tried from the start of this crisis to limit the escalation by adopting a policy of limited response while avoiding civilian targets; its aims were restricted to freeing the prisoners of war held in both camps. However, Israel's systematic destruction of entire civilian areas in Beirut and elsewhere and perpetration of scores of horrific massacres prompted Hizbullah to shift to an all-out confrontation to affirm Lebanon's right to deter aggression and defend its territorial integrity and its citizens, just as any sovereign state would do.

Thus far, Hizbullah has had surprising military successes, while maintaining its position in the face of Israel's superior fire power, and preserved its capacity to wage a long-term war. But Hizbullah is still ready to accept a ceasefire and negotiate indirectly an exchange of prisoners to bring the current crisis to an end.

This is what Israel has so far rejected, with the support of the US. For this is also a war of American hegemony over the Middle East, and the US - supported by the British government - is fully complicit in the Israeli war crimes carried out in the past two weeks. It would appear that the peaceful option will not be given a chance until Hizbullah and the forces of resistance have demonstrated their ability to confront Israel's aggression and thwart its objectives, as happened in 1993 and 1996. That is why resistance is not only a pillar of our sovereignty but also a prerequisite of stability.

· Ali Fayyad is a senior member of Hizbullah's executive committee

That The Grauniad can allow itself to be a mouthpiece for a fascist, anti-Semitic terrorist organisation almost destroys one’s faith in the mere existence of any kind of Decent Left at all. I don’t care if the paper does have a damn good film section (it does).

As one of the commenters on the article puts it: “Hey Guardian editors, why not cut out the middleman and simply title all of your comment pieces ‘The Jews Are Our Misfortune’?”

Thankfully, Harry’s Place does some sterling work in restoring faith in the Decent Left with an outright condemnation of the latest Galloway-led, anti-Semitic lunacy:

I had been thinking about writing a few words on why people on the left should be giving their support to Israel in its struggle with the fascists of Hezbollah and Hamas but when I started to sketch out the possible structure of such a post I realised how ludicrously pointless and offensive this debate on the left has become.

After all, for a discussion among leftists aimed at convincing those who disagree with a basic position of solidarity with Israel one should probably start with some sort of explanation for why the Jewish state has a right to exist.

But isn't there something odd, indeed sick, about even having to make that case? (continues)

Tuesday, July 25, 2006

But what effect will these record temperatures have on the spread of bird flu?

From the Thunderer column in The Times:

Not what I call a hot story
by Ross Clark

CAN THERE be any headline more disappointing than the words: “Britain sizzles as heatwave breaks records”? It is usually followed by a report about the hottest Tuesday in Bognor Regis for four years. Admittedly this week’s “record” temperature — 36.5C at Wisley, Surrey, on Wednesday — was a little more substantial. But even so, the news was received in my house with a large yawn.

This, supposedly, was the highest temperature recorded in Britain in July. But that is only if you discount the 38.1C recorded in Tonbridge on July 22, 1868. This record is faithfully listed in my dog-eared copy of The English Climate by H. H. Lamb, of the Meteorological Office (published 1954), but has now been struck from the records on the basis that the thermometer and its housing are not now considered to have conformed to modern standards. Of course they didn’t: they were built in the 1860s, that’s why.

The real reason the 1868 heatwave has been wiped from the historical record, of course, is that it is highly inconvenient for the global warming lobby. How can you scare people into thinking that every hot summer day constitutes yet more evidence of man-made meteorological doom when actually it was even more sizzling back on that balmy day in Tonbridge when gentlemen were briefly driven to remove their stove-pipe hats?
Logically, of course, if you strike out one Victorian record you should strike them all out.

Yet, whenever it suits them, members of the global warming brigade are more than happy to quote dubious anecdotal evidence to try to prove their point that we are all slowly being fried.

The reason weather records keep getting broken, both in Britain and the world at large, is not so much that the world is becoming warmer — or, as is alternatively asserted in the case of a record freeze, that the climate is becoming more extreme. It is because there are many more recording stations than there used to be, increasing the chances that an extreme climatic event will be recorded. Moreover, compared with old thermometers, modern recording equipment is capable of registering very brief increases in temperature of a few seconds’ duration.

I’ve got to go now and turn down my electric fan. It is almost getting chilly in my office. Hold the front page! It is the coldest day in Cambridgeshire since last Sunday.

It has been well-observed that a consequence of our 24-hour news culture is the wildfire viral spread of half-understood and context-free here-today gone-tomorrow memes, especially memes predicting global disaster. Journalists interviewing journalists about reports originating in newsfeeds written by journalists based on their interviews with other journalists.

The upshot is that nearly all ‘news’ is nonsense. The BBC have had no new updates in their Bird Flu section since the beginning of July. Now that summer’s here, we’ve got global warming again.

I have a radical solution: Olds™. All domestic news should be reported with a good 10 years’ worth of hindsight; all science, technology, international politics and military reporting should be given at least 20 years to mature, and nothing at all should be said about the climate until 500 years have passed. Bulletins need last no longer than two or three minutes.

So a sample of today’s Olds™ might be: “Good evening, here are the Olds. There was no Millennium Bug. Turns out we should have got rid of Saddam properly first time around. Diana, Princess of Wales was killed in a car crash that was a pure accident: sad, but don’t go mad over it. Good night.”

Thursday, July 20, 2006

Summer Hiatus for Duck

I will be on hiatus for the next three weeks or so. I will be moving my daughter down to the Phoenix, AZ area where she will be going to college, and will also be doing some home refurbishing to get my house ready for sale. The DD will be in the capable hands of Skipper, Brit and Oroborous in my absence.

May the Quack be with you!

Tuesday, July 18, 2006

Lick-Your-Plate-Clean Good

I discovered this combination this afternoon. While I expected that it would be good, I wasn't prepared for just how wonderful this gustatory delight turned out to be!

ground turkey
cooked kidney beans
lime juice
tomato sauce (the kind found next to the tomato paste on the grocery shelf, not spaghetti sauce - although actually, spaghetti sauce might work too, if it's a vegetarian style)
barbecue sauce (a sweet type)
garlic powder (not garlic salt, but ground dried garlic)
cheddar cheese (sharpness to taste - I used medium sharp, but extra-sharp would also have been tasty)
fresh spinach leaves, washed

One serving:
Grate or dice one ounce of cheddar cheese, set aside.
Brown about four ounces of ground turkey, in a patty or loose, liberally sprinkling with lime juice while cooking.
Heat two thirds of a cup of kidney beans to piping hot. In a medium bowl, combine the beans, a tablespoon of butter, a quarter cup of tomato sauce, two tablespoons of barbecue sauce, and a half teaspoon of garlic powder (or more for garlic lovers such as myself - I actually used a full teaspoon). Mix well.

While still warm, spread the bean and tomato sauce mixture on a dinner plate. Crumble or spread the hot turkey on top, and then sprinkle on another teaspoon or two of lime juice. Add the cheese.
On top of the cheese, spread a half cup of sauerkraut, and then a cup and a half of fresh spinach leaves.

Holy Hannah, that's good eating!
It also helps to prevent flu, scurvy, heart disease, and cancer, but in this case those are secondary benefits.

Yo Blair! What about this trade thingy sh**?

From the BBC:

Forget prime minister, Mr Blair, or even plain old Tony. The new way to address the prime minister, we learn, is "Yo Blair".

That at least is how George Bush greets the PM in private, according to unguarded remarks they both made in front of an open G8 microphone.

We also learn how Mr Blair refers to international commerce as "this trade thingy".

And there was some strong language used as well. The US president apparently believes the Middle East conflict could be ended if only pressure were put on Syria "to get Hezbollah to stop doing this shit".


A transcript of the off-the-cuff conversation between US President George W Bush and UK Prime Minister Tony Blair during a break at the G8 conference in Russia.

The president was caught on tape using an expletive as he described the actions of Hezbollah in attacking Israel.

The two men start by discussing an exchange of gifts:

Bush: And thanks for the sweaters - I know you picked em out yourself...

Blair: Oh yes absolutely - in fact I knitted it!!!

Bush: What about Kofi Annan - he seems all right. I don't like his ceasefire plan. His attitude is basically ceasefire and everything sorts out.... But I think...

Blair: Yeah the only thing I think is really difficult is that we can't stop this without getting international presence agreed. I think what you guys have talked about which is the criticism of the [inaudible word]. I am perfectly happy to try and see what the lie of the land is, but you need that done quickly because otherwise it will spiral.

Bush: Yeah I think Condi's [US Secretary of State Condoleezza Rice] gonna go soon.

Blair: Well that's all that matters but if you... You see at least it gets it going.

Bush: I agree it's a process...I told her your offer too.

Blair: Well it's only if she needs the ground prepared as it were. If she goes out she HAS to succeed whereas I can just go and...

Bush: You see the irony is what they need to do is get Syria to get Hezbollah to stop doing this shit and it's all over...

Blair: Dunno... Syria....

Bush: Why?

Blair: Because I think this is all part of the same thing...

Bush: (with mouth full of bread) Yeah

Blair: Look - what does he think? He thinks if Lebanon turns out fine. If you get a solution in Israel and Palestine. Iraq goes in the right way

Bush: Yeah - he's [indistinct]

Blair: Yeah.... He's had it. That's what all this is about - it's the same with Iran

Bush: I felt like telling Kofi to call, to get on the phone to Assad and make something happen.

Blair: Yeah

Bush: [indistinct] blaming Israel and [indistinct] blaming the Lebanese government....

You can also watch a little video clip on the BBC website (at least, you can in the UK), including an ‘analysis’ from Baria Alamuddin, Foreign Editor of the Arabic newspaper Al Hayat.

Unfortunately, I had to turn it off as soon as Ms Alamuddin began lamenting the ‘casual’ way the two Bs were discussing the Middle East when ‘there are people dying out there.’

Saturday, July 15, 2006

Peter's World -- Time and Money

For those new to TDD, or with limited retention for the breathtakingly trivial, here is the deep background for what follows:

Peter's World

Peter's World -- The Bust


The easiest thing to do with a traffic ticket is to pay it and move on. Nearly always, that also coincides with the smart thing to do. While there are odd exceptions (e.g., driving a British sports car -- an MGA -- with California plates through a west Texas town at 1:00 am is a sure way to get a ticket for doing 55 in 25, a previously unheard-of feat for that car while in second gear; said ticket got promptly torn up when the arresting officer discovered the driver was stationed at the Air Force base ten miles down the road. That brought home the truth of the saying "poor New Mexico: so far from heaven, so close to Texas."), the ticket is the payback for an actual violation, which no amount of quibbling can avoid.

But what's the fun in that? Besides the disconnect between what the officer said he saw, and what I saw, here was a perfect opportunity to take a trip to Peter's World, to view the law from where the sausage is made.

With a month and a half between my pavement performance art and the preliminary hearing, I had some time to tackle the central riddle: how is it that an apparently stand-up police officer saw me entering the intersection two seconds after the light changed, when I saw myself entering it while the light was still yellow?

This is the part of the show where the Law & Order lawyers, and, therefore, Peter, play the role of quasi-detectives, doing police work the police aren't doing.

Fortunately, the pivotal T-intersection is not only on my way home from work, it is equally convenient to approach from both relevant directions, south and east. After a couple weeks of trying to explain the two-second gap, something Rosemary Woods would have found a doddle, I had the good fortune to hit the red on the west arm of the T, right where the arresting officer was sitting. It took a good half minute of staring before the obvious grabbed me: despite sitting in an F-150, a much higher seating position than the police officer enjoyed, I couldn't see the limit line marking the entry point to the intersection.

Why? For the first time I noticed the right arm of the T had an additional lane on the south side, to facilitate right-turn traffic at a major intersection barely 200 yards away. Since I am talking arms here, picture the left as belonging to some pasty-faced, pigeon-chested dweeb like, well, I am. Now envision the right arm that would be attached to The Governator.

I'm sure you can see the problem. A subsequent session with a tape-measure, with some inevitable experimental error due to dodging traffic, showed this additional lane displacing the limit line from where it would otherwise have been by 45 feet.

Or, given the normal speed making the right turn into the rightmost travel lane, right at 2 seconds travel time.

Suddenly, I have a case.

Two weeks later, armed with PowerPoint slides with circles, arrows, and explanations, I went to the preliminary hearing, where I was to learn about ...


Being the archetypal naif in Peter's World, I assumed the preliminary hearing was my opportunity to present my side of the story which, if persuasive enough, would result in dismissing the case.

Wrongo, wonderwings.

For traffic violations, the preliminary hearing is nothing more than an assembly line intended to make the cash register go cha-ching just as fast as humanly possible.

At the appointed time, I took my place in a line with about 40 other people. I happened to sit next to a far from unattractive redhead in her early thirties. Being as I am married, neither rich nor attractive, and with my thirties in my deep six, it struck me as singularly ironic that, for the first time in my life, I had readily at hand an excellent pick up line.

"So, what are you in for?"

Whereupon I heard the soccer mom's story of being tagged for running a red, while she was certain she had entered the intersection in time.

Thereby doubling the irony, since, for the first time in my life, I could follow up a sure fire pick-up line with commiseration.

The opportunity was fleeting, because the line evaporated with unseemly haste.

After hearing my name called in the flat monotone of someone who has become thoroughly bored with her job, it was my turn in the barrel with the prosecuting attorney.

Before I could even sit down he said "If you agree to pay the fine, we will waive the points." And then promptly started writing to that effect on his copy of my citation.

"I don't agree."

The pen stopped in mid-scribe, and he looked up at me sharply, clearly annoyed with my impertinent interruption of the cha-ching rhythm.

Opening my folder of meticulous slides, I began: "And here's why, the inter ..."

"If you elect to take this to trial, you will pay not only the fine, but get the points on your record. This is my final offer. Do you accept?"

Not being a poker player, I have no idea whether I have a poker face. Which means I may not have successfully hidden the sudden flash of anger and contempt. My answer's tone might also have been less than flat and neutral.


Next: Peter's World -- A Fool for a Client

Friday, July 14, 2006

Intelligent Denial

John Derbyshire takes down George Gilder, The Discovery Institute and Creationism/Intelligent Design in grand style:
It’s a wearying business, arguing with Creationists. Basically, it is a game of Whack-a-Mole. They make an argument, you whack it down. They make a second, you whack it down. They make a third, you whack it down. So they make the first argument again. This is why most biologists just can’t be bothered with Creationism at all, even for the fun of it. It isn’t actually any fun. Creationists just chase you round in circles. It’s boring.

It would be less boring if they’d come up with a new argument once in a while, but they never do. I’ve been engaging with Creationists for a couple of years now, and I have yet to hear an argument younger than I am. (I am not young.) All Creationist arguments have been whacked down a thousand times, but they keep popping up again. Nowadays I just refer argumentative e-mailers to the TalkOrigins website, where any argument you are ever going to hear from a Creationist is whacked down several times over. Don’t think it’ll stop ’em, though.**


Well, here is George Gilder, taking up the challenge. He offers us a complete metaphysic. He then makes the very large claim that science cannot (or will soon be unable to — I am not clear on this point) progress any further unless it abandons its present materialist assumptions and takes up this new metaphysic of his. What can we make of this?

First let’s take a look at George’s metaphysic. It is pluralistic, which is to say, it argues that the basic substance of the universe is of several different kinds. In George’s schema there are three kinds of stuff: intelligence, information, and matter.

Information, says George, is by definition intelligently organized. If it were not, it would not be information, only random static. Further, information needs some material substrate on which to be inscribed, so that matter (understood in the modern sense of matter-energy) is the carrier of information.

Information is thus at the center of his schema, standing between matter, the substrate on which it is inscribed, and intelligence, which organizes it.


There is a hierarchy of being, with insensate matter at the bottom, carrying very little information, up through living creatures, which carry immensely more, having far more complex material substrates, to a supreme intelligence which (I think) has the entire universe as its substrate.

Information, designed by intelligence, makes everything happen. The information in a computer program makes your phone bill happen (and the programmer’s intelligence makes the program happen); the information in DNA makes proteins happen. This is a one-way process: Your phone bill can’t make the computer program happen (nor can the program make your intelligence happen), a protein can’t make a gene happen, etc. Nothing at the lower-information level — a phone bill, a protein — can make anything at the higher-information level — program, gene — happen. This refutes materialism’s assertion that higher information-bearing structures can arise from lower ones. It also refutes evolution, which has high-information-bearing substrates arising out of low-information-bearing ones.


We then proceed to George’s main point, which is, that science cannot (or will soon be unable to) progress any further unless it abandons its present materialist assumptions and takes up this new metaphysic. What can be said about that?

I think the main thing to be said about it is, that George’s metaphysics is going to be a tough sell to scientists. This is important, because science is a very important part of our culture — “the court from which there is no appeal” (Tom Wolfe). If you can’t sell your metaphysic to scientists, George, then it is just an intellectual curiosity, headed nowhere.

There are two reasons why George’s ideas, as presented in this essay, are a tough sell. First, he loses biologists right away with his Creationist patter. Second, George’s Discovery Institute and his Center for Science and Culture don’t discover things and don’t do any science.

First, the Creationist stuff.

Creationists seem not to be aware of how central evolution is to modern biology. Without it, nothing makes sense. I recently, here on NRO, reviewed Nicholas Wade’s book about human origins. We have known a good deal about human origins for a long time, from researches in archeology and zoology. Darwin himself wrote a book on the topic back in 1871. Now, with the tools of modern genomics at our disposal, we are finding out much, much more. None of this would be possible, none of it would make any sense, if speciation by evolution were not the case. A research program in paleoanthropology premised on the idea that speciation by evolution is not the case, would have nowhere to go, nothing to do, and nothing to tell us. It is hard to see how any such program would be possible; though if George will tell me, I’ll be glad to broadcast his idea.

It’s not just paleoanthropology. Speciation via evolution underpins all of modern biology, both pure and applied. Note that in the latter category fall such things as new cures for diseases and genetic defects, new crops, new understandings of the brain, with consequences for pedagogy and psychology, and so on. To say to biologists: “Look, I want you to drop all this nonsense about evolution and listen to me,” is like walking into a room full of pilots and aeronautical engineers and telling them that classical aerodynamics is all hogwash.

Biologists are of all scientists least in need of a new metaphysic. Neurophysiology aside, it is in the “hard” sciences that our epistemological underwear is showing. When physicists have to resort to explanations involving teeny strings vibrating in scrunched-up eleven-dimensional spaces a trillion trillion trillion trillionth of an inch across, or cosmologists try to tell us that entire universes are proliferating every nanosecond like bacteria in a petri dish, there is a case to be made for a metaphysical overhaul. Not that work in these fields has come to a baffled dead stop, as George seems to imply. Far from it; the problem in fundamental physics and cosmology is not so much that we have run out of theories, as that we have too many theories. I’ll grant that there are epistemological issues, though.

Biology, by contrast, really has no outstanding epistemological problems. With the tools of modern genomics at its disposal, it is in fact going through a phase of great energy and excitement, so that biologists are much too busy to be bothered with epistemological issues. To modify the simile I offered above: Creationists are walking into that room full of pilots and aeronautical engineers right at the peak of the Golden Age of flight, around 1930. “Hey, those machines of yours don’t really fly, you know…”

Another turn-off is the blithe way George makes pronouncements about the limits of our understanding. Doesn’t he know the track record here? I think the star of this particular show is Auguste Comte, who declared in 1835 that we could never possibly know the composition of the stars. The spectroscope was invented in 1859.

Not deterred by Comte’s example, George writes that: “This process of protein synthesis and ‘plectics’ cannot even in principle be modeled on a computer.” You sure about that, George? “Even in principle”? How do you know that? Computer modelers are awfully ingenious and creative people. Are you quite sure that you are ahead of all of them? Even that team of 19-year-old, 190-IQ whiz kids in that Microsoft-funded lab in Shanghai, whose heads are full of amazing new ideas? Oh, you’ve never met them? Perhaps you should. And that other team over here, and that one there, and the folk in Bangalore, and the guys in Stuttgart, and that great new institute in Budapest... Never met them either? Oh.

If, five years from now, one of these innumerable teams of researchers develops a really good computer simulation of protein synthesis, will George discard that metaphysic of his, that told him it couldn’t be done? I hope he will.

George’s attitude here is anyway at odds with his “social” arguments. He cannot imagine that anyone could come up with a computer model of protein synthesis because... well, no one ever has. Similarly, Michael Behe of Darwin’s Black Box fame, back in the 1990s, could not imagine that anyone could come up with an evolutionary pathway for the bacterial flagellum, because... no-one ever had. They since have. So George’s assertion that “Behe’s claim of ‘irreducible complexity’ is manifestly true” is manifestly false. Yet these are the people who lecture us on Establishment Science’s reluctance to countenance new ideas!

That brings us to the second problem that scientists have with George’s system: After being around for many years, it has not produced any science. George’s own Discovery Institute was established in 1990; the offshoot Center for Science and Culture (at first called the Center for the Renewal of Science and Culture) in 1992. That is an aggregate 30 years. Where is the science? In all those years, not a single paper of scientific standing has come out of (nor even, to the best of my knowledge, been submitted by) the DI or the CSC. I am certainly willing to be corrected here. If the DI or CSC have any papers of scientific standing — published or not — I shall post links to them to NRO for qualified readers to scrutinize.

Scientists discover things. That’s what they do. In fast-growing fields like genomics, they discover new things almost daily — look into any issue of Science or Nature. What has the Discovery Institute discovered this past 16 years? To stretch my simile further: Creationists are walking into that room full of pilots and aeronautical engineers right at the peak of the Golden Age of flight, never having flown or designed any planes themselves. Are they really surprised that they get a brusque reception?

(I should say here that the handful of Creationists who are themselves professional working scientists produce papers that are, I am told, scientifically valuable. None of those papers are premised on Creationist principles, however, and none have appeared under the aegis of the DI or CSC. The Creationism of Creationist scientists like Michael Behe is extramural — a sort of spare-time hobby. The same can be said of the militant atheism of Daniel Dennett and Richard Dawkins, incidentally.)

Creationists respond to this by telling us that they can’t get a hearing in the defensive, closed-minded, “invested” world of professional science. Creationist ideas are too revolutionary, they say. The impenetrably reactionary nature of established science is a staple of Creationist talk. They seem not to have noticed that twentieth-century science is a veritable catalog of revolutionary ideas that got accepted, from quantum theory to plate tectonics, from relativity to dark matter, from cosmic expansion to the pathogen theory of ulcers. Creationism has been around far, far longer than the “not yet accepted” phase of any of those theories. Why is the proportion of scientists willing to accept it still stuck below (well below, as best I can estimate) one percent? The only answer you can get from a Creationist involves a conspiracy theory that makes the Protocols of the Elders of Zion look positively rational.

Three or four paragraphs into George’s piece, seeing where we were headed, and having accumulated considerable experience with this kind of stuff, I did a “find” on the phrase “scientific establishment.” Sure enough, there it was: those obscurantist, defensive old stuffed shirts of “consensus science” — the Panel of Peers, George calls them — keeping original thought at bay.

In George’s example the original thinker was Max Planck, whose first publication on his revolutionary quantum theory of radiation was in 1900. Poor Max Planck was so thoroughly shunned and ostracized by that glowering, starched-collar Panel of Peers for daring to present ideas that violated their settled convictions, that five years later they made him president of the German Physical Society, and in 1918 gave him the Nobel Prize for Physics! Those mean, blinkered scientific establishmentarians!

Creationism has been around in one form or another for well over a century, which is to say, more than 20 times longer than the interval between Max Planck’s first broadcasting of his quantum theory and his election as president of the Deutsche Physikalische Gesellschaft. The fact that Creationism still has no scientific acceptance whatsoever — no presidencies of learned societies, no Nobel Prizes, not a bean, not a dust mote — does not show that the science establishment is hostile to new ideas, it only shows that scientists cannot see that Creationism has anything to offer them.

What gets the attention of scientists is science. Scientists do not shun Creationism because it is revolutionary; they shun it because Creationists don’t do any science. They started out by promising to. The original plan for the CSC (then CRSC) back in 1992 had phase I listed as: “Scientific Research, Writing & Publicity.” The CSC has certainly been energetic in writing and publicity, but if they have done any scientific research, I missed it.

* * * * *

Look at the last paragraph of George’s piece. It is a call to science to: “grasp the hierarchical reality [that the summit of the hierarchy, a.k.a. God] signifies,” to transcend “its [i.e. science’s] materialist trap,” to “look up from the ever dimmer reaches of its Darwinian pit and cast its imagination toward the word and its sources: idea and meaning, mind and mystery, the will and the way.” Science, says George, must — must! — “eschew reductionism — except as a methodological tool — and adopt an aspirational imagination.” (Isn’t that one humongous big “except” in that last sentence there, George?)

A scientist, reading those words, might reasonably ask: “Why? Why must I do those things you urge me to do? I’m getting along just fine as I am, discovering new things about the world, pushing the wheel of knowledge forward a few inches every year. Did you see that groundbreaking paper of mine in Developmental Biology last month? No? Well, everyone in the field is talking about it. So why should I buy into this metaphysics you’re selling? What’s in it for me? What’s in it for science at large?”

Replies George, from that same closing paragraph: because “this is the only way that science can ever hope to solve the grand challenge problems before it, such as gravity, entanglement, quantum computing, time, space, mass, and mind.”

The scientist will then say: “The only way, you say? But look, I’m not doing too badly generating scientific results — uncovering new facts about the world — by following my current way, from down here in my ‘Darwinian pit’. So right off, I can’t agree with you that this new way of yours is the only way. I have no feeling whatsoever that I am stuck, and looking for a way. I have a way — orthodox scientific method. It works. It generates reproducible results, and suggests testable theories. Possibly this essay of yours offers a better way, but yours sure isn’t the only way.

“And why should I think your way is even a better way to tackle the problems you listed? After all, you, with your ‘only way,’ and your institutes with high-sounding names and lavish funding, and all your decades of being in operation, have not generated any scientific results at all. If someone like you, with a radical new outlook, grounded in a radically new metaphysics, starts providing solutions to difficult problems like those in your list, of course I will be impressed. Of course I will take you seriously; I will adopt your methods; I will transcend materialism and eschew reductionism and all that good stuff you exhort me to do. Of course I will! I will come to you humbly to learn how to do the prescribed transcending and eschewing. I’ll be among the first to come knocking on the Discovery Institute’s door, I guarantee. I want to advance knowledge, along whichever path looks most promising. That’s why I’m a scientist.

“As it is, though, you have nothing to show me. Has your Institute, or your Center, actually come up with a new, testable theory of, say, gravity? Where can I read about it? Oh, you haven’t? Has your Discovery Institute, since its founding in 1990, actually, er, discovered anything?

“No? Well, look, no offense, George, but I’ll tell you what. Go back to your Institute, hire some bright new researchers, teach them your metaphysics and your new methodology, buy them some computers and lab equipment, and let them loose to do some science. When they’ve got testable theories and reproducible results, I’ll pay attention. Until then, if you’ll excuse me, I have to get back to my own lab.”

What would you say to this guy, George?

I'll have more to say on this later, but if I were George Gilder, I'd be sweeping up the shreds of my intellectual project with a broom right now.

Thursday, July 13, 2006

Populism, Authenticity and other Lies

Caleb Stegall is at it again, rousing a rabble to combat the dark forces of prosperity, namely, you and me. In this editorial for the Dallas Morning News, he calls for a new prairie populism to arise:
Yet most people in the Gilded Age seemed desperate. A growing disparity between the haves and have-nots brought on by "unbridled entrepreneurialism," a dramatic increase in both social and geographic mobility, the spread of centralized corporate control over consumer goods and globalizing markets vulnerable to forces far, far away all contributed to a sense of unease and insecurity. Populist fervor swept the middle and lower classes as they felt their livelihood and way of life threatened by collusion between their government and rapidly expanding commerce, industry and mass cultures of transportation and communication.

And so it is today. Midwestern towns are drying up and blowing away like so much tumbleweed. Our inner cities too often function as prisons without bars; suburbia is a blighted, soulless landscape of nowhere; and the yeoman freeholder who was once the backbone of rural America is virtually extinct. Pollsters wonder why George W. Bush isn't getting more credit for strong economic numbers. Perhaps it is because what are signs of health driven by rampant consumerism are experienced by most Americans as symptoms of economic and spiritual rot – their own and their country's.

Americans, many of them at least, are awakening to the truth articulated more than 50 years ago by writer Whittaker Chambers: that the modern world's "vision of comfort without effort, pleasure without the pain of creation, life sterilized against even the thought of death, rationalized so that every intrusion of mystery is felt as a betrayal of the mind, life mechanized and standardized" does not "make for happiness from day to day" – and further, that it may mean "catastrophe in the end."

My guess is that what all the commentators are sensing is something real. Could it be that unconstrained growth, hypermobility and global markets actually produce social and political instability?

In the mid-20th century, economist Joseph Schumpeter argued that capitalism – the acknowledged world-historical champion in terms of producing wealth and prosperity – would, by a process he called "creative destruction," eventually undermine the very social institutions that gave it birth and guarded its existence. He pointed out that market capitalism exposed more natural ordering structures – the "ties that bind" – to a brutal new calculus. Commitment to kin, community and place entail making heavy economic sacrifices and provide benefits not easily entered on a balance sheet. The more cost-efficient process of market economics fomented an ongoing progressive revolution that eventually rendered those social and family ties largely superfluous. Lord Acton observed that "every institution tends to perish by an excess of its own basic principle."

This tendency of our political and economic culture toward a state of permanent revolution is the hallmark of any modern progressive society. And if there is one deity today to which every politician, right and left, will pay obeisance, it is the god of progress.

Progressives of all political stripes learn early and often that to get on, they better get out, move on, follow every rainbow. "Oh, the places you'll go," crooned Dr. Seuss, and Americans went and went and went until we became a rootless itinerant people – which, it turns out, is exactly the kind of workers required by an economy built on creative destruction. Nanny-state leftists and corporate-state rightists have long been in bed together promoting the wage-entitlement economy with its instantly mobile and fetter-free worker and 100 percent out-of-the-home servitude.

There is a tremendous cost to the health of the republic, to the common good, that comes with the creative yet destructive power of unlimited economic and political progressivism. The vital role property-owning and self-sufficient families, small towns and regional governments play in a free republic has been recognized for centuries. The civic virtues associated with widespread ownership of land, decentralized systems of trade, commitment to the common good of one's tribe and the moral sturdiness of belonging to a tradition are necessary to the continued independence of a free people.

And the loss of these goods will always strike the middle classes first and hardest. When they are lost, they are felt as loss – loss of an entire way of life. And just as the masses of dispossessed and alienated fought back during the Gilded Age, they are likely to again.

At the 1896 Democratic Convention, the populist lion William Jennings Bryan roared against the elite and monied interests controlling America: "We are fighting in the defense of our homes, our families and posterity. We have petitioned, and our petitions have been scorned. We have entreated, and our entreaties have been disregarded. We have begged, and they have mocked ... We beg no longer; we entreat no more; we petition no more. We defy them!"

And now we are on the cusp of a new wave of populism in search of its own Bryan to rise up on behalf of the people and defy their progressive masters.

Progressive masters? Can someone please tell me who my progressive master is? I haven't groveled at his feet lately, and after reading this I'm fearful that I might be due for a major dose of my-place putting in.

Maybe I've been brainwashed by corporate culture, but I just don't see where he's coming from. Kansas just isn't that different from Minnesota, yet I feel like I'm hearing from someone in another nation, or continent, or century. What is Stegall's definition of self-sufficiency, and why don't I have it? Or more importantly, why did some yeoman farmer from Kansas in 1840 have more of it than I do? And how can you be simultaneously self-sufficient and deeply embedded in a local community? Aren't the two mutually exclusive? Isn't a community defined by its web of inter-dependencies? Stegall can't even romanticize consistently.

It is very hard to take someone like Stegall seriously. He speaks in platitudes about his romanticized dreamworld and invokes dire, vague rumors of social plagues and calamities, but it is difficult to pin down what exactly is sticking in his craw. Is he against technology? Is he against capitalism? Is he against democracy? How far back does he want to wind the clock? Are cameras ok in his world? Radio? The railroads?

Unless he wants to return us to the primeval forest, Stegall will have to pick a milestone of civilized development in which his idealized society can be realized. But undoubtedly Stegall would prefer that his world remain fixed in time at that milestone. The problem is that in order to reach that milestone, civilization progressed through the very same process of creative destruction that he finds so abhorrent in its current form. Mankind is not a species that will tolerate a status quo for long. Every civilizational milestone in our history was merely a snapshot in time of a constantly evolving reality. Radio, cameras and the railroads were the disruptive technologies that built the social milieu that people like Stegall think of as the halcyon past of traditional ways. His Golden Age was the Brave New World of his great-great-grandparents.

And lest we curse all civilization and pine for the authenticity of the noble savage, Spengler provides a timely antidote in this article from the Asia Times:
Two billion war deaths would have occurred in the 20th century if modern societies suffered the same casualty rate as primitive peoples, according to anthropologist Lawrence H Keeley, who calculates that two-thirds of them were at war continuously, typically losing half of a percent of its population to war each year.

This and other noteworthy prehistoric factoids can be found in Nicholas Wade's Before the Dawn, a survey of genetic, linguistic and archeological research on early man. Primitive peoples, it appears, were nasty, brutish, and short, not at all the cuddly children of nature depicted by popular culture and post-colonial academic studies. The author writes on science for the New York Times and too often wades in where angels fear to tread. A complete evaluation is beyond my capacity, but there is no gainsaying his representation of prehistoric violence.

That raises the question: Why, in the face of overwhelming evidence to the contrary, does popular culture portray primitives as peace-loving folk living in harmony with nature, as opposed to rapacious and brutal civilization? Jared Diamond's Guns, Germs and Steel, which attributes civilization to mere geographical accident, made a best-seller out of a mendacious apology for the failure of primitive society. Wade reports research that refutes Diamond on a dozen counts, but his book never will reach the vast audience that takes comfort in Diamond's pulp science.

Why is it that the modern public revels in a demonstrably false portrait of primitive life? Hollywood grinds out stories of wise and worthy native Americans, African tribesmen, Brazilian rainforest people and Australian Aborigines, not because Hollywood studio executives hired the wrong sort of anthropologist, but because the public pays for them, the same public whose middle-brow contingent reads Jared Diamond.

Nonetheless the overwhelming consensus in popular culture holds that primitive peoples enjoy a quality - call it authenticity - that moderns lack, and that by rolling in their muck, some of this authenticity will stick to us. Colonial guilt at the extermination of tribal societies does not go very far as an explanation, for the Westerners who were close enough to primitives to exterminate them rarely regretted having done so. The hunger for authenticity surges up from a different spring.

European civilization arose by stamping out the kind of authenticity that characterizes primitive peoples. It is a construct, not a "natural" development. One of the great puzzles of prehistory is the proliferation of languages. Linguists believe, for credible reasons too complex to review here, that present-day languages descend from a small number of early prototypes, and splintered into many thousands of variants. Wade says (p 204):

This variability is extremely puzzling given that a universal, unchanging language would seem to be the most useful form of communication. That language has evolved to be parochial, not universal, is surely no accident. Security would have been far more important to early human societies than ease of communication with outsiders. Given the incessant warfare between early human groups, a highly variable language would have served to exclude outsiders and to identify strangers the moment they opened their mouths.

What brought about civilization, that is, large-scale communication and political organization? Conquest is too simple an explanation. We have from Latin five national languages and dozens of dialects, but no comparable development out of the Greek of the earlier Alexandrian empire. Latin and its offshoots dominated Europe because Latin was the language of the Church. The invaders who replenished the depopulated territories of the ruined Roman Empire, Goths, Vandals and Celts, learned in large measure dialects of Latin because Christianity made them into Europeans.

Even in Christianity's darkest hours, when the Third Reich reduced the pope to a prisoner in the Vatican and the European peoples turned the full terror of Western technology upon one another, they managed to kill a small fraction of the numbers that routinely and normally fell in primitive warfare.

Native Americans, Eskimos, New Guinea Highlanders as well as African tribes slaughtered one another with skill and vigor, frequently winning their first encounters with modern armed forces. "Even in the harshest possible environments [such as northwestern Alaska] where it was struggle enough just to keep alive, primitive societies still pursued the more overriding goal of killing one another," Wade notes.

A quarter of the language groups in New Guinea, home to 1,200 of the world's 6,000 languages, were exterminated by warfare during every preceding century, according to one estimate Wade cites. In primitive warfare "casualty rates were enormous, not the least because they did not take prisoners. That policy was compatible with their usual strategic goal: to exterminate the opponent's society. Captured warriors were killed on the spot, except in the case of the Iroquois, who took captives home to torture them before death, and certain tribes in Colombia, who liked to fatten prisoners before eating them."

However badly civilized peoples may have behaved, the 100 million or so killed by communism and the 50 million or so killed by National Socialism seem modest compared with the 2 billion or so who would have died if the casualty rates of primitive peoples had applied to the West. The verdict is not yet in, to be sure. One is reminded of the exchange between Wednesday Addams (played by the young Christina Ricci in the 1993 film Addams Family Values) and a girl at summer camp, who asks, "Why are you dressed like someone died?" to which Wednesday replies, "Wait!"

Why all this nostalgia for places and times that none of us have experienced? Is prosperity too heavy a burden to bear?

Saturday, July 08, 2006

Meaning, suffering and Prozac

Will technology, progress and modern medicine do away with pain, suffering and existential anxiety? And if they do, will that be a good thing? Though the first question may be a puzzler to all but the most starry-eyed technophiles, answering the second would seem to be a slam dunk. Would anyone miss pain, suffering and existential anxiety?

One of the remarkable things about the internet is that if you surf regularly enough you will occasionally come upon people or ideas that are truly as rare as the fabled unicorn. Such a person and idea are represented by this review of Peter Augustine Lawler's book "Aliens in America: The Strange Truth About Our Souls," written by Steven Menashi for the Washington Times on September 1, 2002. It seems that the purpose of Lawler's book is to answer the second question above with a resounding "no".
Mr. Lawler recalls Allan Bloom's observation that human beings are defined by love and death. But, today, those human passions no longer animate the most "sophisticated" Americans. Instead, the desire for self-knowledge has been replaced by a feel-good therapy that dulls the unease that is part of being human.

One can see this attitude clearly in the march of political correctness through our educational institutions: At the end of history, the purpose of education is no longer to question or strive for justice, but to implement it. We already have final knowledge, you see, and we can spare everyone the discomfort of striving, the burden of knowledge, and the disorder of the human passions.

That, for Mr. Lawler, is what links Mr. Fukuyama's two very different accounts of social evolution: They both aim at human comfort, to make people completely at home in the world.

But to be at home in the world is to be inhuman. Mr. Lawler believes that human beings must always be aliens in this world, tormented by longings for immortality and understanding, ill at ease with a self-consciousness and a capacity for good and evil that is denied to the rest of creation. Indeed, the most distinctively human impulses — the artistic or philosophic impulse — begin in awe and apprehension at the vast incomprehensibility of the world.

Today, however, the anxious wonder that is the root of human excellence can be cured with a generous dose of Prozac. Francis Fukuyama, for one, worries "what the careers of tormented geniuses like Blaise Pascal or Nietzsche himself would have looked like had they been born to American parents and had Ritalin and Prozac been available to them at an early age."

Mr. Lawler's answer: "They would have been untormented! True, their torment was intertwined with their ability to know much of the truth about Being and human beings, and to be haunted and deepened spiritually by God's hiddenness or death. But according to evolutionary biology, human beings are not fitted by nature to know the truth, and the fanatical pursuit of truth by tormented geniuses has not been good for the species."

Thus, contends Mr. Lawler, despite Mr. Fukuyama's latest criticisms of biotechnology — his essay "Second Thoughts" and the recent "Our Posthuman Future" — his suggestion in "The Great Disruption" is that the world is better off without men like Pascal and Nietzsche, who disrupted our natural existence with their preference for truth over social comfort. Mr. Fukuyama, in Mr. Lawler's rendering, has no standpoint from which to criticize pharmacology or biotechnology, for these are just an extension of Mr. Fukuyama's project of making people at home in the world: "Those who prefer comfort to truth . . . will swallow pills and submit to operations for their obvious social and survival values."

"To be human," writes Mr. Lawler, "is to be alienated from and disconnected with one's natural existence." He fears that bioengineering might permanently extinguish human longings (which are now only suppressed by some medications), abolishing human distinctiveness. Mr. Lawler wants to make the case for the necessity of religious faith as a moral compass, so he derides the scientific standpoint of Mr. Fukuyama and others.

There are so many angles to attack this philosophy from that I'm afraid I will have to ramble a bit to make the circuit of them all. My first impression is that the religious impulse Lawler alludes to would have to be called a "worship of suffering". I'll leave it to the more religious of my readers to determine if this worship is a heresy or an idolatry of the true worship of God, but from a purely human-oriented standpoint I'd have to say that it is one of the most thoroughly inhuman philosophies one can imagine.

Suffering may very well be inevitable, but the only truly human response to that reality is to fight to avoid, escape and ameliorate that suffering wherever it exists in one's life and in the lives of others. There is a nobility in the way that humans face suffering, but that nobility comes only from our defiance of that suffering and our determination to find happiness and meaning in spite of it. Human nobility never finds itself embracing suffering. That is the way to absurdity and madness. Yet it seems to be the way that Lawler would have us travel. Like prisoners in a sick experiment, he would have us fall in love with our captors.

I have a personal reason to find Lawler's position offensive and absurd. I have been taking anti-depressants for about eight years, and I can tell you that his notion that I have somehow left my most human and meaningful self behind in my escape from chronic depression is hogwash beyond measuring. Lawler is putting forward a false dichotomy between soulful, introspective anguish and self-satisfied, soulless, unexamined comfort. Does he imagine that the human mind has some binary switch that toggles between these two states, and that anti-depressants somehow block all painful sensations, keeping the subject in a state of satisfied bliss?

A healthy mind will experience a range of emotional states between bliss and abject terror. But more importantly, the emotional states in a healthy mind will be in balance with its environmental stimuli. Healthy people feel joy when they should feel joy, fear when they should feel fear, anger when they should feel anger, etc. Chronic depression blankets all situations with an overlay of fear and anxiety. The link to the environment is broken. It is not an anguish that lends itself to meaningful philosophizing about the nature of man or the universe. It is an absurd state, an inhuman state. Yet Lawler would have me remain so in the service of producing the next Nietzsche. To grasp how absurd this is, just think about how the public would react to the idea of withholding all medical assistance from the blind and deaf in the service of producing the next Helen Keller.

Though, as I mentioned, I find Lawler's seemingly abject worship of suffering and rejection of the very human striving for happiness to be quite unique, I have to say that it reminds me of other philosophies on display in various religious and non-religious contexts. One that I have commented on much lately is Crunchy Conservatism, which seems to be standing athwart any manifestation of the common people's striving for material comfort and well-being, yelling "stop". Another manifestation, in the political realm, is the desire to keep "authentic" cultures in a kind of pristine state, untouched by Western ways, as if it were the duty of undeveloped societies to maintain themselves as theme-park exhibits for the aesthetic and moral satisfaction of Western elites.

What all of these philosophies have in common, I believe, is that they aestheticize human existence. Lawler's obsession with producing religious and philosophical geniuses seems to say that the purpose of a life lived is to provide an appropriate aesthetic object of worship for others. To him a happy Nietzsche would be a non-entity, a wasted life. Likewise with the Crunchy Cons. It is beside the point whether people are happy or fulfilled with their material goods purchased from WalMart. Living the consumer lifestyle makes people unappealing in an aesthetic sense. The Crunchy Con cannot romanticize the modern consumer, and so the consumer life is a life not worth living. Likewise for the aboriginal who wants to enjoy Western music, clothing and consumer goods. The cosmopolitan and aesthetically sophisticated UN bureaucrat cannot idealize the authentic noble savage in the New Guinea jungle if he is wearing Nikes and doing the Macarena. And so his life is diminished.

The pre-eminent speaker on matters of suffering and meaning, in my opinion, is the psychiatrist and Holocaust survivor Viktor Frankl. In his book "Man's Search for Meaning," Frankl wrote that when confronted with unavoidable suffering, one must construct a personal meaning for enduring that suffering in order to survive it. But he also wrote that to endure suffering that can be avoided is a meaningless exercise: it is masochism.

Thursday, July 06, 2006

Fascism, global warming and pet rocks

Every so often, a British politician in opposition to the Government on a particular topic will find himself unexpectedly backed by newspaper opinion polls and will loudly demand the introduction of ‘direct democracy’ or ‘referendum politics’.

That is, an emphasis on single-issue voting rather than just five-yearly electing. They never get their way because (1) most issues are too complex for a yes/no vote; (2) nobody can ever agree on how to phrase the referendum question; (3) public opinion is just too variable and dependent on the whims of the popular press; and (4) all politicians are well aware that just as the public can validate them on one issue, so it can bite them on the behind on another.

Many is the time I’ve watched the tides of public opinion rise and fall – a million marching en masse against Bush’s War On Terror one minute only to ‘get behind the troops’ the next, hunting alleged paedophiles (and once, in a case of tragic dyslexia, a paediatrician) one day then hounding foxhunters the day after, even crucifying a national footballer after one poor game only to hail him a hero after a win – and thought to myself: “Popular opinion really is, in the words of Verdi’s Rigoletto, ‘mobile, qual piuma al vento.’” Or alternatively: “Strewth, democracy just doesn’t work.”

How much happier the country, nay, the world would be, if only it were ruled by an all-knowing and infinitely wise benign dictator. Namely, me.

We Duckians have often talked up the value of the hive mind – the process by which a conglomerate of many individual responses to a question will almost always inch towards the best answer – the value of bottom-up rather than top-down decision-making, of natural selection producing better results than the Big Idea.

But how do we square the Wisdom of Crowds with the Madness of Crowds? Or, as our own Harry Eagar puts it with his usual spiky succinctness in his newspaper column: “Who can forget fascism, global warming or pet rocks?”

Elsewhere, I defined the wisdom of crowds as occurring when the individuals don’t know they’re in a crowd and make their decisions independently, while the madness happens when they do know they’re in a crowd and mindlessly follow the flock down the road to fascism, tulip-buying and pet rock ownership.

But still it seems that there’s a fundamental conflict between promoting the benefits of the hive mind, and managing the madness of the mob. In other words, if we move for a moment from the merely descriptive (that the hive mind generally comes up with the best solutions) to the prescriptive (that we ought to do more to harness this power in practical politics), how do we avoid fascism and pet rocks?

And what of the flipside issue: leadership. Winston Churchill was a great and much-loved leader in a crisis. Yet when the War was over, his hive booted him out so that it could solve its more prosaic peacetime problems. Do crowds give birth to leaders, or do leaders manipulate the crowd? Did the hive produce Hitler, or did he create the hive?

I don’t have the answers, just a swarm of possible ideas, so I leave it to the Duck’s hive mind. What’s the buzz?

Saturday, July 01, 2006

Marriage Schmarriage

"Don't worry about marriage", says Julian Sanchez. Marriage adapts organically to society as it changes, he says, using those magic words "organic" and "adapt" that are sure to calm the imagination of any post-modern, scientifically sophisticated reader.

Marital Mythology
Why the new crisis in marriage isn’t
Julian Sanchez

Marriage, a History: From Obedience to Intimacy, or How Love Conquered Marriage, by Stephanie Coontz, New York: Viking, 432 pages, $29.95

Promises I Can Keep: Why Poor Women Put Motherhood Before Marriage, by Kathryn Edin and Maria Kefalas, Berkeley: University of California Press, 293 pages, $24.95

The end, as usual, is nigh. “Barring a miracle,” Focus on the Family founder James Dobson writes in the April 2004 edition of his group’s newsletter, “the family as it has been known for more than five millennia will crumble, presaging the fall of Western civilization itself.” Dobson obviously has a knack for apocalyptic hyperbole, but some version of that sentiment haunts many a conservative mind.

It was the eschatological horror of wedding cakes adorned with pairs of little plastic men in tuxedos that prompted Dobson’s prophecy. But the fear of gay marriage is only the most headline-friendly manifestation of a broader concern that the institution of marriage is in a parlous state. As conservatives look at high rates of cohabitation and divorce, especially among poor mothers, many conclude that the institution you can’t disparage requires a helping hand from the federal government to stay afloat. Indeed, it’s not just conservatives: Political scientist William Galston, a former adviser to President Clinton, has argued that marriage is a key component of poverty alleviation, and that government must “strengthen [two-parent] families by promoting their formation, assisting their efforts to cope with contemporary economic and social stress, and retarding their breakdown whenever possible.” The most prominent recent effort in this vein is President Bush’s Healthy Marriage Initiative, run by the Department of Health and Human Services and funded to the tune of $100 million annually, most of which goes to fund educational or mentoring programs in which couples learn “relationship skills,” often by means of grants filtered through faith-based organizations.

If the link between gay matrimony and the “crumbling” of marriage remains something of a puzzle—for all the ink and pixels expended on the issue, no one has managed a compelling explanation of precisely how allowing more people to marry will induce fewer people to marry—concerns about the state of the family aren’t groundless.

To answer Sanchez's ironically worded puzzler, you only have to point out that the semantic leap necessary to classify two men in a committed romantic relationship as a marriage will simultaneously alter the meaning that heterosexual couples invest in the institution of marriage. You cannot redefine an institution without altering the expectations that you have for that institution. To point out how silly Sanchez's statement is, compare it to this analogy: "precisely how is it that allowing more people (women) to compete with men on professional football teams will induce fewer people (men) to play professional football?"

But to give Sanchez's argument some due, the fact that so many people can think about gay marriage in a non-oxymoronic fashion is an indicator of how much the marriage institution has already been redefined. Gay marriage is not so much a threat to traditional marriage as it is a symptom of its decline.

A spate of studies has led to a broad consensus among social scientists that children raised by their biological parents fare significantly better than children raised by single, cohabiting, or remarried parents on a wide variety of dimensions: They’re half as likely to drop out of high school or go to prison, more likely to attend college, and less likely to have behavioral problems or encounter material hardship—differences that may be reduced but do not disappear after controlling for factors such as parental income and education. These differences are apparent even in countries like Sweden, where both social norms and public policy are more hospitable toward single-parent families.

And there’s a class chasm in family structure: Some 3 percent of births to college-educated women take place outside of marriage, compared to almost 40 percent among high school dropouts. The proportion of women between the ages of 18 and 24 who attend college doubled between 1967 and 2000, to more than 38 percent, and fertility rates are significantly lower for women of childbearing age who hold a bachelor’s degree (an average of 1.05 offspring per mom) than for those with only a high school diploma (an average of 1.46). In short, the disadvantaged children for whom the stability marriage provides would be most helpful are also the least likely to enjoy it. “That is what government neutrality has gotten us,” Sen. Rick Santorum (R-Pa.), an ardent booster of using the state to promote traditional families, told an enthusiastic audience at the 2005 Conservative Political Action Conference.

Yet two quite different recent books on marriage (and its absence) suggest there’s something seriously wrong with the popular account of the American family’s ills, which attributes them to a recent breakdown in values, caused perhaps by latte-sipping elites who scorn traditional matrimony. In Marriage, a History, Evergreen State College historian Stephanie Coontz, author of the 1992 book The Way We Never Were: American Families and the Nostalgia Trap, reveals that marriage has served diverse purposes through the ages, and that the really radical change in the institution was the 18th-century innovation of marrying for love. In Promises I Can Keep, sociologists Kathryn Edin of the University of Pennsylvania and Maria Kefalas of Saint Joseph’s University take a close look at the lives of poor single mothers in Philadelphia, where they found a story much more interesting and convincing than the familiar “values” narrative.

Does marriage, as some conservatives seem to suggest, have an intrinsic nature and a deep purpose that remain constant across millennia, such that changes in its form or meaning should be considered inherently suspect, as unnatural as oceans boiling and lambs shacking up with lions? Not so much, according to Coontz, who finds that when it comes to marriage, the most reliable constant is flux.

While “one man, one woman” has become the clarion call of gay-marriage opponents, Coontz observes that the most “traditional” form of marriage adhered more closely to the rule “one man, as many women as he can afford.” Many Native American groups cared about diversity of gender in marriage rather than diversity of biological sex: A couple had to comprise one person doing “man’s work” and one person doing “woman’s work,” regardless of sex. In Tibet prior to the Chinese occupation, about a quarter of marriages involved brothers sharing one wife. To this day, the unique Na people in southwestern China live not in couples but in sibling clusters, with groups of brothers and sisters collaboratively raising children conceived by the women during evening rendezvous with visitors.

Even within the category of monogamous heterosexual unions, Coontz finds a dizzying variety of motives and meanings associated with marriage. Among early hunter-gatherer bands, trading members to other bands as spouses was, above all, a means of establishing networks of trade and economic cooperation between men. Once each group had members with loyalties and ties to both, barter became a safer bet.

That’s not to say the husbands were in full control either: In ancient Rome, married sons and daughters both lived under control of the patriarch until his death, and ancient civilizations more generally regarded marital decisions as far too important to be left to the whims of the marrying couple.

In the medieval period, too, marriage might be a handy means of cementing an alliance or sealing a truce among rulers. In other times and places, marriage was seen primarily as a means of regulating inheritance or succession. Often, especially where simple market sales of land were tightly restricted, it was the primary means of transferring landed property, and that was seen as the decisive factor in marriage decisions. Such considerations were not limited to the nobility: Peasant farmers who held land in separate strips might arrange a marriage that allowed adjoining parcels to be united. And while formal state approval is regarded in America today as a sine qua non of a valid marriage, the church considered a couple married as soon as they had exchanged “words of consent,” even alone and without formal trappings.

Among the working classes in later pre-industrial Europe, though a village was apt to intervene if a wedding brought a poor worker into the fold, marriage was seen as more centrally about the married couple. This view was encouraged by a church doctrine that recognized as valid any union entered by mutual consent and, later, by an emerging post-feudal economy in which young people were increasingly apt to leave extended families to seek their fortunes in cities or to work their own small plots. But husbands and wives saw each other more as business partners than as lovers. Marriage was a way of establishing an efficient division of labor, and a new widow or widower represented a job opening.

The love marriage, in which people more or less freely chose partners based on mutual affection, was really an 18th-century invention, Coontz argues. It was partly a spillover effect of new political ideologies that saw government as arising from contractual agreements designed to promote the happiness of society’s members and partly a result of further increases in economic autonomy, especially the autonomy of women. As late as the mid-19th century, French wags were still bemused by the new fashion of “marriage by fascination.” Opponents of gay marriage such as Maggie Gallagher sometimes identify this development as the central problem: the idea that marriage is mainly about uniting a loving couple, from which the notion that it ought to be equally available to gay couples follows.

Such critics sometimes talk as though marriage based on love is a recent innovation, rather than a transformation that’s been going on for centuries. As Coontz notes, during the 1950s—the conservative’s golden age for families—it was precisely the prospect of finding personal fulfillment through marriage to your soul mate that gave married life its central place in the social imagination. The vision of domestic bliss familiar from sitcoms like Ozzie and Harriet and The Donna Reed Show found its complement in a spate of self-help manuals and newspaper columns touting a successful marriage as the key to happiness, as couples’ average age at first marriage reached its lowest point in half a century. “In a remarkable reversal of the past,” Coontz writes, “it even became the stepping-off point for adulthood rather than a sign that adulthood had already been established. Advice columnists at the Ladies’ Home Journal encouraged parents to help finance early marriages, even for teens, if their children seemed mature enough.”

What emerges from Coontz’s account is the realization that marriage has no “essence.” There is no one function or purpose it serves in every time and place. This shouldn’t come as any surprise to readers of F.A. Hayek, who in The Mirage of Social Justice spoke of evolved rules and institutions that “serve because they have become adapted to the solution of recurring problem situations.…Like a knife or a hammer they have been shaped not with a particular purpose or view but because in this form rather than some other form they have proved serviceable in a great variety of situations.” Institutional evolution, like its biological counterpart, is opportunistic: A structure that serves one function at one stage may be co-opted for a very different function at another stage.

This is all well and good, but it assumes that all these different socially adapted expressions of marriage are equally desirable. Do we really want to return to some collectivized social paradigm? Is there something about the way that marriage is defined today that is desirable to retain? Are the social developments to which marriage is adapting itself today and in the future good developments, or is marriage simply devolving with society into some unpleasant postmodern miasma?

Coontz knows the benefits of marriage, but she’s wary of attempts to stand athwart history crying “Stop!” If marriage now seems especially fragile, she argues, that’s not a function of public policy mistakes subject to easy political correction. It reflects underlying economic, legal, and technological changes that are, in themselves, mostly desirable. While not opposed to attempts to help couples craft stable marriages, she warns that “just as we cannot organize modern political alliances through kinship ties…we can never reinstate marriage as the primary source of commitment and caregiving in the modern world. For better or worse, we must adjust our personal expectations and social support systems to this new reality.”

That conclusion may seem excessively fatalistic, especially given Coontz’s own chronicle of marriage’s ability to adapt to changing circumstances. But it does encapsulate a core piece of Hayekian wisdom. Organic social institutions grow and evolve from the bottom up, as individuals change their behavior in light of the circumstances they perceive on the ground. Attempts to freeze or correct them in accordance with a Grand Plan—a vision of how they ought to function that views change as a dangerous deviation from an ideal—are no more likely to succeed for marriages than for markets.

Where Coontz’s history gives a picture of marriage painted in broad strokes, Promises I Can Keep is a close-up, lapidary study of unmarried low-income mothers in eight of Philadelphia’s poorest neighborhoods, culled from interviews with 162 such women over the course of five years, several of which the authors spent living in those communities. Edin and Kefalas’ account makes it clear that the growth of single motherhood among poor urban women can’t be chalked up to anything as simple or straightforward as a “breakdown of family values.”

In a sense, the problem is an excess of family values. Women who dropped out of high school are more than five times as likely as their college-educated counterparts to say they think the childless lead empty lives, and also more likely to regard motherhood as one of the most fulfilling roles for women; motherhood is so highly regarded that it becomes difficult to see even a pregnancy that comes in the mid-teens as a catastrophe to be avoided. And far from having lost interest in marriage, the authors write, the women they spoke to “revere it”—so much so that some are hesitant to marry when they become pregnant because single motherhood seems less daunting than the opprobrium they fear they’d face were they to divorce.

The growing focus on marriage in public policy owes its resonance to two distinct themes that recur in conservative thought: anxiety about unregimented sexuality, and the belief that social problems are better solved by local groups and time-tested institutions. Those tendencies make it tempting to conclude that calls for marital reform and the genuinely distressed state of some families are part of one coherent and insidious phenomenon: the collapse of marriage. Yet as Edin and Kefalas show, the biggest problems with marriage are not first or foremost problems with marriage.

Communities grappling with dim economic prospects, violence, addiction, and high rates of incarceration are going to have trouble sustaining all sorts of valuable social institutions, marriage among them. Broader changes in marriage, meanwhile, need not herald its collapse: They’re an ordinary part of the way the institution has always adapted, organically, to societies that themselves are always changing.

Okay, so as long as change has always occurred, we need not fear change.