Sunday, January 29, 2006

Ideas and Consequences Part 2 - the "common" mistake of philosophy

To continue with Feser's essay:

The varieties of conservatism

If you are still with me after all that, you have no doubt already begun to see the relevance of metaphysics to conservatism, and in particular the relevance of the classical realist tradition to Weaver’s brand of conservatism. “Realist Conservatism,” as we might call it, affirms the existence of an objective order of forms or universals that define the natures of things, including human nature, and what it seeks to conserve are just those institutions reflecting a recognition and respect for this objective order. Since human nature is, on this view, objective and universal, long-standing moral and cultural traditions are bound to reflect it and thus have a presumption in their favor.

But this does not necessarily entail a deference to the status quo, for since human beings are by their nature free and fallible, it is possible for societies to deviate, even radically, from the natural law. When this happens, it is the duty of the conservative to “stand athwart history yelling ‘Stop!’” (as the editors of National Review so eloquently put it many years ago). Such yelling ought of course to be done with tact and wisdom, but if the cause of the Realist Conservative should end up a lost one, unlikely to win elections, that is irrelevant. What matters is fidelity to the True, the Beautiful, and the Good.


Here Feser unwittingly tips us off to the weaknesses of the realist argument. To begin with, the effort to inculcate morality in human society has long been recognized as an effort to encourage men to rise above their natures. If the objective order is embodied in human nature, then to base morality on this objective order is to confuse "is" with "ought". It prescribes morality no better than a materialist, evolutionary view does. But the materialist does not make this mistake.

The second weakness of Feser's argument is that once he concedes the fallibility of human reason and the possibility that societies will stray from the objective order, the conservative standing athwart history yelling stop is as likely to be stopping progress toward the objective order as he is to be preventing regress away from it. Assuming that there is an objective order to things is not the same as knowing what that objective order is. Conservatism of this kind, embedded in the notion that the current order is the correct order, is merely reactionary and no better a guide to moral decision-making than a coin toss or onesies-twosies.

What then are the other two varieties of conservatism I promised to identify? Here another, and much briefer, excursion into the history of philosophy is in order. I noted that realism had as its great rival nominalism, but there is also a third position on the nature of universals -- “conceptualism,” which might be thought of as a kind of middle ground between realism and nominalism. The conceptualist does not quite deny that universals exist (as the nominalist does) but he does insist, contrary to the realist, that they exist only subjectively, in the human mind. If they are real, then, they are something other than what the realist takes them to be; though their existence isn’t exactly denied, they are nevertheless “reduced” to something less grand, and certainly to something less than eternal and unchanging.


Here Feser makes an aesthetic judgment. Grandeur is in the eye of the beholder. I fail to see why aesthetics should have any say in the debate over the nature of reality, yet it often does, in theology as well as in science. I tend not to take any scientific argument seriously if the scientist making the claim says that it is an "elegant" solution. For one thing, seeing elegance or inelegance in a mathematical equation takes a very strange mind to begin with; for another, where is the scientific merit in the claim that truth must be beautiful? That's another bit of Platonist nonsense that I won't get into here.

The debate between realists, conceptualists, and nominalists manifests a pattern that one finds repeated frequently in other areas of philosophy: where X is some object of philosophical interest, some philosophers (the Realists) say that X is real and just what it appears to be, others (the Reductionists) say that X is real but not what it appears to be, and a third group (the Anti-Realists) says that X is not real at all, but at best merely a convenient fiction (and maybe not even that). In the debate over universals, the conceptualists are the Reductionists and the nominalists are the Anti-Realists (the realists, of course, being the Realists).

Another example of this pattern in the history of philosophy would be the debate over the relationship between mind (or soul) and body. The Realist view in this case would be “dualism,” which holds that mind and body (and mind and brain, for that matter) are completely distinct, and in particular that the mind is something non-physical or immaterial, just as it seems to be to common sense. A Reductionist view would be “identity theory,” which says that the mind is real but that it is really identical to the brain -- in other words, that the mind is, contrary to common sense, just one physical object among others. An Anti-Realist view would be “eliminative materialism,” which says that the mind does not really exist at all: strictly speaking, there are no such things as thoughts, experiences, beliefs, desires, and the like, but only neural firing patterns, hormonal secretions, behavioral dispositions, and so on and so forth.


As with aesthetics, so with common sense. I would disagree that common sense dictates that the brain and the mind are separate - I would argue just the opposite. But here is where I fault Feser's argument in particular, and metaphysical philosophy in general. Common sense is not a good way to determine truth. Common sense says that the world is flat. Common sense says that heavy objects fall faster than light objects. Common sense says that there must be something holding the world up.

Common sense is truer the closer it is to real, common experience. It is a guide from experience, basically. We don't all have the same experiences, but we have enough in common to support this notion that there are some things that we just all know. Common sense says don't poke a beehive with a stick. Common sense says don't fall for an offer that sounds too good to be true. Common sense says don't try to put something in your mouth that is bigger than your fist.

The problem comes when you try to extrapolate from common experiences to unknown phenomena using notions that seem right to the mind but aren't supported by experience. This is basically what is left to philosophy after the scientific method took over all investigation of physical phenomena amenable to empirical analysis. Aristotle's ideas about the natural world, derived from mental notions, proved how useless such unempirical noodling with common-sense notions really is at answering questions. Why do rocks fall and smoke rise? Because, thought Aristotle, each element wished to return to its origin: the rock to the Earth, and the smoke from fire to the heavens.

Philosophy is now about constructing propositions about unknowable phenomena based on mental notions extrapolated from experience. It is a shot in the dark based on Kentucky windage, where the target is not even visible. Common sense is useless here. All sense is useless here. The soul cannot be measured; it cannot be put under a microscope. It exists only as an idea in the mind, a notion.

Now it seems to me that the distinction between Realist, Reductionist, and Anti-Realist positions in philosophy might usefully be applied to a demarcation of various brands of conservatism.

I have already described Realist Conservatism as committed to the existence of timeless and unchanging essences from which derives a natural law that applies to all human beings in all circumstances. Reductionist Conservatism, then, might be defined as a variety of conservatism that agrees with Realist Conservatism in affirming that there is such a thing as human nature and that it is more or less fixed, but which would ground this affirmation, not in anything like an eternal realm of Forms, but rather in, say, certain contingent facts about human biology, or perhaps in the laws of economics or in a theory of cultural evolution. The Reductionist Conservative is, accordingly, more likely to look to empirical science for inspiration than to philosophy or theology. He is also bound to see grey in at least some areas where the Realist Conservative sees black and white, since facts about economics, human biology, and the like, while very stable, are not quite as fixed or implacable as the Forms. But he is less likely to see grey than is the Anti-Realist Conservative, who might be characterized as someone doubtful that any relatively fixed moral or political principles can be read off even from scientific or economic facts about the human condition. Whereas Realist and Reductionist Conservatives value tradition because there is at least a presumption that it reflects human nature, the Anti-Realist Conservative values it merely because it provides for stability and order. The closest thing we have to an objective moral order, in the view of the Anti-Realist Conservative, are whatever principles happen to be embodied in the history and practice of a particular society. Since those principles can change, though, the conservative ought, in the view of the Anti-Realist, to be willing to change with them.


So call me a reductionist conservative.

Realist Conservatives respect religion because it shores up obedience to the natural law, but especially because its teachings are either explicit or implicit affirmations of the very same metaphysical truths knowable through philosophical inquiry. The Realist Conservative also respects science, but sees it as less fundamental in the order of knowledge than is philosophy (and, for some Realists, theology), and insists that its results be interpreted in the light of more basic metaphysical truths. Reductionist and Anti-Realist Conservatives also respect religion, but only because it serves as a bulwark of social and moral order; and the Anti-Realist Conservative is just as likely to see it as a potential danger when its adherents threaten to upset social order in the name of purportedly timeless truths. Reductionist and Anti-Realist Conservatives also tend to regard science (including, for some of them, social sciences like economics) as the paradigm of knowledge, indeed perhaps as the only thing that even deserves the name of knowledge. But the Anti-Realist Conservative is less likely to see in it a source of moral and political insights that might replace the insights traditionally promised by philosophy and theology. For the Anti-Realist, it is ultimately the values that have (for whatever reason) come to prevail in a culture, rather than any objective philosophical or scientific truths, that determine what we should do. Pragmatism is his only unchanging principle.

Now this classification is an idealization, and many real world conservatives probably exhibit elements of more than one of these tendencies of thought. It seems pretty obvious, though, that religious conservatives, whether they are simple believers or intellectuals of the sort associated with journals like First Things, are paradigm Realist Conservatives. It is either realist metaphysical principles of the sort I’ve described, or the will of God, or some combination of these, that define their conservatism, and this gives it an unmistakably Realist character.

Reductionist Conservatism predominates among secular conservative intellectuals who find in social science or evolutionary psychology the ingredients for a reconstruction of the older conservative conception of human nature in more purportedly “scientific” and “up to date” terms.


I think I've done a thorough job of shredding this notion of metaphysical truths knowable through philosophy. The rest of the article is more discussion of the Anti-Realist conservative type, as embodied by Jeffrey Hart in his recent essay "The Burke Habit":

Hart trouble

This brings us to Prof. Hart’s recent essay. Hart is himself a prominent conservative. He tells us that “what the time calls for is a recovery of the great structure of metaphysics, with the Resurrection as its fulcrum, established as history, and interpreted through Greek philosophy.” This might seem to mark his conservatism as of the Realist variety, but as every good Platonist knows, appearances can be deceiving.

The part of his essay that has provoked the most controversy concerns abortion, for Hart calls on conservatives to abandon the pro-life cause. This position itself seems to have been less controversial, though, than the reasons he gives for it. For Hart appeals in his defense to the “powerful social forces” favoring abortion, such as “the women’s revolution.” He tells us that the pro-choice consensus ought not to be challenged, because it is “the result of many accumulating social facts, and its results already have been largely assimilated.” Roe v. Wade must be accepted, in his view, because it reflected “a relentlessly changing social actuality” and “the reality of the American social process.” Indeed, the conservative mind itself, Hart tells us, is “a work in progress,” and ought to be guided by “skepticism” and “the results of experience.”

Here Prof. Hart’s jargon is clearly not that of Plato or Aristotle, Augustine or Aquinas. If anything, it is that of the pragmatist (and decidedly anti-realist and unconservative) philosopher John Dewey. And lest this seem an odd thing to say about a conservative, let it be noted that Hart almost admits as much himself when he proposes Dewey’s fellow pragmatist William James, of all people, as a model for conservatives to emulate. For the Realist, abortion is either wicked or it is not, and finding out which is all that matters. But for Hart (as for Dewey) such questions are trumped by “social forces,” “social processes,” “social actuality,” “revolution,” and “progress.” God and the philosophers, tradition and reason can have their say, but it is only when The People Have Spoken that every conservative must bend the knee. If this is conservatism, it is certainly not Realist Conservatism, or even Reductionist Conservatism, but it might be Anti-Realist Conservatism.

Anti-Realist Conservatism also seems to be the working ideology of many conservative politicians and activists, for whom assimilating principle to political expediency is always a temptation. It also appears common among populists who oppose liberalism only where it happens to offend the current sensibilities of “the folks,” but who are more than happy to give up their opposition when said “folks” decide to bring formerly avant-garde attitudes and practices home with them to the suburbs.

I bolded Hart's quote above mainly for the irony factor: a prominent religious conservative succumbing to moral relativism. What is this world coming to, when people won't act according to form? Can someone send me a program? I can't tell the players apart anymore.

But this just reinforces my idea that assuming that an objective moral order exists is of no help in determining what that moral order is.

Tuesday, January 24, 2006

Ideas and Consequences Part 1 - Plato's long shadow

Edward Feser explores the philosophical underpinnings of his fellow conservatives in his TechCentralStation essay titled The Metaphysics of Conservatism. Feser's article is long by blog standards, and is followed up by a response from Max Borders and a counter-response by Feser. This multipart series of posts will explore this very important and divisive debate of the ages, without any real expectation, of course, of laying it to rest.

First off, Feser recounts the historical development of Plato's Theory of Ideas and its adoption by Christianity into a philosophy that formed the underpinnings of Natural Law Theory and the philosophical school known as Realism:

A brief history of Western thought

Obviously we cannot understand the metaphysics of conservatism without knowing something about metaphysics, and in particular something about the issue with which Plato, and Weaver following him, was most concerned. So let us begin by summarizing the relevant ideas. Is it possible to do so in a way that will make them comprehensible to those unfamiliar with philosophy, yet without oversimplification? The answer, of course, is no, but I will try anyway.

The first thing to say is that the label “Theory of Ideas” is misleading, because (given the way we now use the term “ideas”) it seems to imply that Plato was concerned with something that exists only in the human mind. In fact the opposite is true. Plato’s view is also sometimes called the “Theory of Forms,” and “form” rather than “idea” better conveys what he meant.

Take the example of a triangle, which has a form that distinguishes it from a square or a circle. In Plato’s usage, this “form” includes not only its shape, but all the properties that make it the thing it is: the length of its sides, its area, the fact that its angles add up to 180 degrees, and so forth. Now any particular material triangle (such as the ones drawn in geometry textbooks) is going to have certain properties that are not part of “triangularity” as such, and will also lack certain properties that are part of triangularity as such.

For example, it will have a specific color -- green, say -- and lack perfectly straight sides, even though greenness is not part of triangularity and having straight sides is part of it. So in Plato’s view, when the intellect grasps the form of triangularity, it is not grasping something material, since nothing material manifests triangularity in the strictest sense. But neither is it grasping something mental. For there are certain facts about triangles -- the Pythagorean theorem, for example -- that are entirely objective, and discovered by the human mind rather than invented by it. Moreover, these facts are necessary and unchanging rather than contingent and alterable: the Pythagorean theorem is true eternally, whether or not any human mind thinks otherwise or would like it to be otherwise. “Triangularity” is therefore something that exists apart from either mind or matter, in a third realm of its own: the realm of Forms. And the same thing is true, according to Plato, of the Forms of everything else -- squares and circles, plants and animals, human beings, beauty, truth, and goodness.

It is important to understand that talk about the Forms existing “in” a “realm,” and so forth, is purely metaphorical. Literally they don’t exist “in” anything, since “in” is a spatial term and the Forms, being immaterial, are outside time and space. For the same reason, the “realm” of Forms isn’t literally a place, since that too would imply spatial location. The concepts we apply to material things simply don’t apply to the Forms at all, and the way we learn about material things -- through the senses -- is not how we know the Forms. They are grasped through pure intellect.

Indeed, in Plato’s view the senses don’t strictly speaking give us true knowledge at all, but merely opinion, for what they reveal to us -- material objects -- are merely more or less imperfect copies of the Forms, and are continually coming to be and passing away, whereas the Forms are eternal. The Forms are what most fully and truly exist, and genuine knowledge is knowledge of them.

Aristotle took a view that was similar to Plato’s in many ways, though he thought that forms in some sense existed “in” the material world rather than in a “realm” of their own (even though they were not in his view any more than in Plato’s reducible to matter): triangularity, for example, exists “in” all particular triangles rather than as a thing in its own right. Aristotle also emphasized the idea that a substance -- a statue, a tree, a human being -- is a composite of matter and form.

On this view, known as “hylomorphism,” a tree, for example, is not merely a hunk of matter, but a hunk of matter with a particular form, the form of “treeness,” where this form is (again) not merely a physical property alongside other physical properties of the tree. And the soul, on Aristotle’s view, is simply the form of a living body. A human person, therefore, is on his view a composite of soul (or form) and body (or living matter). By virtue of their forms, things exhibit certain natural functions, ends, or purposes, and it is the fulfillment of these natural functions, ends, or purposes that defines what is good for a thing. This is as true of human beings and their various capacities as it is of anything else.

Now the philosophers of the Middle Ages inherited these concepts from the Greeks, some of them more or less following (and sometimes amending) Plato, others, particularly later in the medieval period, following (and amending) Aristotle. St. Augustine combined Plato’s view that the Forms are eternal and independent of the human mind with the intuition that it is hard to see how they could exist apart from just any mind at all, and concluded that they exist eternally in God’s mind. St. Thomas Aquinas extended Aristotle’s view in several ways, including an emphasis on the idea that the human soul, the form of the living human body, is “subsistent” in the sense that uniquely among the forms of material things, it operates in part independently of matter (in particular, its intellectual powers do) and can survive as a particular thing beyond the death of the body it is the form of.

In general, in debating the famous “problem of universals,” the medieval thinkers carried forward the debate between Plato and Aristotle over the nature of the forms. “Universals” are really just the things we’ve been calling “forms” -- triangularity, “treeness,” humanness, goodness, and so forth -- and are to be distinguished from “particulars,” i.e. specific things (like this or that individual triangle or tree) which might instantiate or exhibit the universal or form. The term “form,” though, tends to be used by those who take the view that, in their different ways, both Plato and Aristotle (and Augustine and Aquinas and many others) took: that universals are real and not reducible to either mind or matter. This is the view about universals that came to be known therefore as “classical realism,” or just “realism” full stop.


So far so good. Feser's lead-in does an admirably concise job of summarizing the key points of Plato's theory and its subsequent developments. It is a philosophy that I thoroughly disagree with, but more on that later. Feser then sets up the applicability of this philosophy and the consequences of its abandonment in the Modern age:

This was by no means a mere intellectual curiosity. A great deal rode on it -- and, as we will see, still rides on it today. To take one example, the traditional understanding of the idea that there is a “natural law” that determines what is objectively right and wrong is inextricably tied to classical realism. For “human nature,” as understood by the traditional natural law theorist, is defined in terms of the form that every human being participates in simply by virtue of being a human being. And that means it is something known ultimately and most fully only through the intellect and via philosophical reasoning, not (or at least not entirely or most deeply) through the senses and empirical biology. Moreover, this nature defines certain natural ends and purposes for human beings and their capacities, the realization of which constitutes what is good for them: good objectively, simply by virtue of their participation in the form, and regardless of whether this or that particular human being realizes or (because of intellectual error, habitual vice, psychological or genetic anomaly or whatever) fails to realize it.

To take another, and related, example, a person, being on the view in question a composite of soul (or form) and body (or matter), cannot be identified with either his psychological characteristics alone or his bodily characteristics alone. Moreover, since the soul is just the form of a living human body, for a living human body to exist at all is for it to have a soul, so that there can be no such thing as a living human body -- whether that of a fetus, an infant, a normal human adult or a severely brain damaged adult -- which does not have a soul, and which does not count as a person. For while even a human being who is damaged or not fully formed might not perfectly exhibit the form of the human body (any more than a hastily drawn triangle perfectly manifests the form of triangularity), he nevertheless does exhibit it, otherwise his body wouldn’t count as a living human body at all (just as a hastily drawn triangle is still a triangle, however imperfect). One corollary of this is that every single living human body, within the womb or without, severely damaged or not, counts as the body of a person and as a being having all the rights of a person, including the right to life.


I've never understood this need to appeal to forms or ideas to justify the reality of things that we can see or feel directly. To me this is a philosophy misnamed; it should rightly be called "Idealism", as it suggests that the ideas about things are more real than the things themselves. The human mind possesses a very powerful capacity, that of abstraction. By abstraction, the mind is capable of highlighting singular characteristics of objects, and of using these characteristics to classify separate objects according to types, or forms. But Platonic realism breathes life into these abstractions, to the point that they achieve object-hood all their own.

The second bolded quote above shows the semantic shenanigans by which the realist "proves" the existence of the soul through a simple tautology. The soul is a name given to the abstract quality that all living people possess. So by definition all living people possess a soul. By a simple substitution of term with definition, the absurdity of this tautology is revealed: "All living people possess the quality that all living people possess".

"Participating in a form" is merely fancy language for having the qualities that would put something in a category. It is sophistry. But I am getting ahead of the narrative:

Now as the Middle Ages wore on, a rival view to that of classical realism developed: the theory called “nominalism,” which held that universals do not, strictly speaking, exist at all. Only particular things -- this or that particular triangle, this or that particular tree, this or that particular human being, and so forth -- are real, and “triangularity,” “treeness,” “humanness” and the like are nothing more than names we happen to apply to groups of things. Moreover, that the things so named are grouped the way they are is merely a function of our interests and needs, and does not reflect any objective or natural order.

As the medieval world gave way to the modern one, and medieval to modern philosophy, nominalism won the day, and modern thinkers like Descartes and Locke abandoned the old conceptual apparatus of hylomorphism, with its appeal to forms and natural ends or purposes as fundamental to the understanding of things, and to the idea of the soul as the form of the living human body. “Mechanism” -- the view that physical things operate on purely mechanical principles, without natural ends or purposes and without instantiating anything like Plato’s or Aristotle’s Forms -- entailed a redefinition of the human body as nothing more than a complex machine, and “human nature” as nothing more than a specification of the principles by which the machine operates, like clockwork.

Now if a living human body does not have a form -- any more than anything else does on the modern view -- then it does not have a soul either, at least as classically defined. Descartes thus re-defined the soul as a kind of non-physical object which is only contingently or accidentally attached to its body, rather than as a form which the body necessarily has to have in order to be a living body at all. One result of this is that the soul came to seem to modern Western thinkers an ever more elusive and mysterious entity, and therefore a dispensable one. Another is that it became harder to see what made a living human body the body of a person, since there is nothing about its being alive that entails (on the modern view anyway) that it has a soul. This problem was only exacerbated by Locke’s own re-definition of a person as a stream of connected conscious experiences, rather than a union of soul (form) and body (matter).

Thus were sown the seeds -- inadvertently, to be sure -- that would eventually develop into the view that neither a fetus nor a Terri Schiavo counts as a person having a right to life. And in the other trends alluded to -- nominalism and mechanism -- we see the origins of the idea that “human nature” is either a purely human construct, or something that exists objectively only as a collection of behavioral tendencies, of no more inherent moral significance than the workings of a clock. We might, as a matter of prudence, want to keep them in mind as a possible barrier to the realization of our desires, but if we could find a way to alter them there would be no objective reason not to do so.


Now here is where the Platonic realists veer off into the hysterical fearmongering so familiar to those secular "non-realists" among us who must bear the brunt of our religiously-minded Platonist friends' tirades against the consequences of our soulless worldview. As I mentioned above, the "soul" is just the realist's term for defining human-ness. It adds nothing to our understanding, or our valuing, of humanness to assume that the qualities that make humans human reside in some mystical, free-floating, formless "form" rather than in our concrete existence as humans. The realist imagines that no one valued human life prior to Plato's invention of this idea of separating essence and existence into two separate but interacting objects, one material and one immaterial. Furthermore, they imagine that this one idea is capable of forcing the idea holder into placing a higher value on human life than he otherwise would.

Contrary to Feser's contention that this notion flows from common sense (I'm getting ahead of the narrative again), it is the ultimate victory of intellectual hubris over organic common sense. Our species would have perished long ago had it been dependent on the development of a cerebral cortex advanced enough for abstract thought in order to properly value its own existence.

One comment about this notion that without a Soul, a person is nothing but a mere machine. This argument drives me up a wall. Does a person seem to be nothing more than a machine? Can a clock do all the things that a person can do? Do clocks fall in love? Has anyone ever fallen in love with a clock? We don't need a philosophical justification for the level of value that we ascribe to our existence. We value ourselves because we are self-valuing beings. It is inbred, it is organic. No theoretical underpinnings required.

This reminds me of a common, cliched comedy routine that has been repeated in many a movie and TV sitcom. Generically it goes something like this:

A man visiting an exotic, faraway land is invited by a local chieftain to dinner. Before him are placed several dishes of unfamiliar content. The man hesitantly begins to nibble at the fare.

American: "Hmm,... this isn't bad. As a matter of fact, this stuff is great! What do you call it?"

Chief: "We call it 'Bskhblakkxxxa'."

American: (stuffing large spoonfuls into his mouth) "Bisblaka huh? What is it made of?"

Chief: "Elephant testicles".

American: "Bleeeech!!!!!" (regurgitates his meal into the chieftain's lap).


The food represents human life. We know by our experience that it is something good, something that we value highly, yet we don't know quite what it is made of. Just like the hot-dogs and sausages we consume daily. The realist whining that, if we find out we're just machines without souls, we won't have any reason to value our lives is like the diner worrying that, if he finds out what his hot-dogs really are, he will decide he doesn't like hot-dogs after all. We don't need to know the recipe to know whether or not we like the end result.

Thursday, January 19, 2006

Heart of darkness

Psst – is the coast clear? Any ladies present?

Who am I kidding? The Daily Duck is a more male-dominated environment than the Gents' toilets during the half-time break in the Super Bowl at the annual convention for America's Top 500 CEOs.

So given that, check this article out, lads.

The other woman is, simply and crudely, a door left ajar, through which he almost certainly has no intention of passing. She is somebody different to shag, where the need to do so is driven not by an uncontrollably rampant libido but by a deeply located fear that This Is All There Is, the end of the line, and that the next stop can be only death...

...Sex with other women, he comes to feel, is all that stands between him and the grave and the general and widely ignored futility of the human condition. Men see this futility clearer than women because their lives are more obviously futile. That’s why so many of them top themselves, for no apparent reason. For a man, an affair is, almost always, nothing to do with the woman involved. It’s not really anything to do with sex, either. It ’s about life and death. And that’s it, nothing more or less.

...

It’s regarded as a terribly empty and insulting platitude, but when a man utters the cliche “it meant nothing to me”, he means it, completely. Women refuse to accept this, perhaps because they can’t imagine being in that situation themselves without some form of emotional attachment, but a man is more than capable of having repeated, regular, illicit sex — risking losing the woman he loves and the family they have spawned — with someone he can, quite possibly, barely stand to be around.



Back in the 1840s, The Times used to be nicknamed The Thunderer. Jonathan Gornall (aka Microwave Man) seems intent on reviving the epithet single-handedly.

What do you reckon? Brutally honest, or just plain brutish?

Global Warming™, Global Cooling... All Part of a Natural and On-going Cycle

Rewriting Glacial History In Pacific North America

Edmonton AB, Jan 10, 2006 [All emph. add.]
Glacier fluctuations are sensitive indicators of past climate change, yet little is known about glacier activity in Pacific North America during the first millennium A.D.
Alberto Reyes, a PhD student in the Department of Earth and Atmospheric Sciences [at the University of Alberta], and his research team have found evidence for a regionally-extensive glacier expansion in the first millennium AD, suggesting that climate during the last several thousand years may have been even more variable than previously thought.
The research appears in the journal Geology. [...]

Most of the evidence they found was in the form of buried soils and logs covered by glacial sediments. "In some cases, entire forest stands were buried by sediments and their trunks sheared off by advancing ice," said Reyes. [...]

Samples were then sent off for radiocarbon dating and when the results came back, the researchers were able to tell a story about when each individual glacier was expanding. Reyes had earlier noted the first millennium AD glacier advance at the glacier he was studying for his master's thesis, which jumped out because it was not thought that glaciers in the region were expanding at that time.
After poring over old data and early results of new research, the team found that many other glaciers had advanced during that period. "If only one or two glaciers are advancing at any particular time it is not really significant," said Reyes. "But when many glaciers across a wide region are advancing with some degree of synchronicity, there is likely something going on with regional climate that causes the glaciers to advance."

Reyes was surprised that the regional nature of this first millennium AD glacier advance remained unrecognized for so long. He suspects some of the earlier reports that hinted at the existence of an advance stayed under the radar because they did not fit into the established chronology of past glacier activity. The glacier data reported by Reyes and colleagues, together with other clues of past climate, support an emerging idea that climate in the North Pacific region has cycled from warmer to colder intervals several times over the last 10,000 years.

Tuesday, January 17, 2006

Richard’s Awfully Big Religious Adventure (part 2)

Last night the UK’s Channel 4 broadcast the second and final part of Richard Dawkins’s series about religion, ‘The Root of All Evil?’

This was a much stronger and more challenging programme than the first episode. In that, Dawkins did no more than interview some lunatics on the fundamentalist fringes of Christianity, Islam and Judaism, point out the bleedin’ obvious (that they were lunatics), and express a wish that all religion would vanish from the earth, taking lunacy and fundamentalism with it. The moderate masses in each of the three major Abrahamic religions didn’t even appear on the radar.

The second episode had a bit more to get one’s teeth into: Dawkins addressed religion’s claims to be the foundation of morality, the issues surrounding the religious education of children, and, at last, he spoke to some moderates.

(There were still plenty of loonies, but the tone of the programme was quite different. Dawkins was far less argumentative and far more conciliatory with his interview subjects. This made me suspect that the editors deliberately used all the fire and brimstone to get the series to go off with a controversial, publicity-generating bang at the start.)

Regarding morality, Dawkins observes that the Old Testament – the foundation of most religious claims to moral authority – is internally inconsistent, and where it isn’t inconsistent is quite often wicked. He criticises the notion of doing good out of fear of hellfire rather than for its own sake, and makes the case for an evolutionary origin of morality based on the benefits of mutual cooperation. Nothing I could disagree with and nothing much earth-shattering for anyone who has engaged in this kind of debate.
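An aside for anyone who hasn't met that argument in formal dress: the standard model behind it is the iterated prisoner's dilemma, made famous by Robert Axelrod's tournaments (Dawkins covers similar ground in later editions of The Selfish Gene). The programme shows no such model, so the little Python toy below is only my sketch of the underlying logic; the payoffs and strategies are textbook conventions, not anything from the broadcast. The point it illustrates: reciprocators prosper when there are enough fellow cooperators around, while pure defectors fall behind.

    def tit_for_tat(my_moves, their_moves):
        # Cooperate first, then copy the other player's previous move.
        return their_moves[-1] if their_moves else "C"

    def always_defect(my_moves, their_moves):
        return "D"

    def always_cooperate(my_moves, their_moves):
        return "C"

    # Payoffs as (my score, their score), indexed by (my move, their move).
    PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

    def play(s1, s2, rounds=200):
        h1, h2 = [], []
        score1 = score2 = 0
        for _ in range(rounds):
            m1, m2 = s1(h1, h2), s2(h2, h1)
            p1, p2 = PAYOFF[(m1, m2)]
            h1.append(m1)
            h2.append(m2)
            score1 += p1
            score2 += p2
        return score1, score2

    # A population with a cluster of reciprocators, one defector, one doormat.
    population = [tit_for_tat, tit_for_tat, tit_for_tat,
                  always_defect, always_cooperate]
    totals, counts = {}, {}
    for s in population:
        counts[s.__name__] = counts.get(s.__name__, 0) + 1
    for i in range(len(population)):
        for j in range(i + 1, len(population)):
            a, b = play(population[i], population[j])
            totals[population[i].__name__] = totals.get(population[i].__name__, 0) + a
            totals[population[j].__name__] = totals.get(population[j].__name__, 0) + b
    for name in totals:
        print(name, totals[name] / counts[name])  # tit_for_tat averages highest

Flip the population to mostly defectors and tit_for_tat no longer comes out on top, which is the interesting part of the evolutionary story: cooperation has to get a foothold before it pays.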

Education
But his look at religious education (or as he called it, the 'religious indoctrination' of children) did give me pause for thought.

Dawkins put the question: For what reason, other than the fact that centuries of practice have meant we’ve got used to it, do we allow the religious labelling of children from birth? A child of Muslim parents is automatically Muslim; a child of Christian parents automatically Christian, etc. We would never label a child with its parents’ political beliefs, so why do otherwise liberal, choice-valuing societies tolerate this ‘indoctrination’ into a particular sectarian belief system during the most crucial formative years?

Tricky one, eh? Not tricky for the religious, but tricky for someone who, like me, claims to be happy to tolerate all faiths to the extent that they themselves are tolerant, and to place paramount importance on individual freedom to choose.

It’s an easy enough matter when it comes to publicly-funded education. Dawkins visits some crackpot private faith schools where the science lessons have God on every page and include details of Noah’s Ark. Most sane people agree that this sort of ‘education’ shouldn’t be funded by the public purse.

But it’s a more difficult question when it comes to whether communities should be allowed to isolate children and teach them articles of faith as if they were facts. Children brought up in small loony Waco-style cults are considered victims of brainwashing and mental abuse. They have to be reintegrated into normal society when they’re older, often with great difficulty.

Dawkins’s programme raises the question: other than scale and sheer number of believers, what is the difference between indoctrination by a loony cult, and indoctrination by a Muslim or Jewish or Christian community?

Moderates
Over on Think of England I’ve got a reference to a famous joke from the satirical puppet show Spitting Image (if you never saw it, think Muppet-versions of politicians, but much nastier): There was a knock on the door. A man answered. “Do you believe in God?” he was asked. “Of course not,” he replied. “I’m Church of England.”

Dawkins interviews a clergyman from the liberal end of the Church of England – and let me tell you, that is pretty damn liberal. He doesn’t believe in the virgin birth and thinks gay marriage is fine. His reason for these variations from orthodoxy is Reason. Dawkins asks him the obvious question: so why do you bother being a Christian at all? His answer, as is always the moderate answer, is that the essence of the faith is more important than the literal words of Scripture. (He does believe in the Resurrection, but all moderates have their lines in the sand). Dawkins, of course, doesn’t buy this (though disappointingly, neither does he tenaciously argue the point).

Wrapping up
Dawkins finishes the programme with a slightly cheesy and predictable – but genuine – mountaintop plea for us all to enjoy and appreciate the wonder of the one world we live in, rather than enduring it in the hope of a non-existent afterlife.

But more significant was his earlier claim that religion is ‘an insult to human dignity’. After talking at length to another loony – a friend and defender of abortionist-murderer Paul Hill - Dawkins claims to have essentially liked the man, but to have been appalled to see him so warped by a twisted version of the Christian faith.

He quotes US physicist Steven Weinberg:

With or without religion, you would have good people doing good things and evil people doing evil things. But for good people to do evil things, that takes religion.

The extent to which you agree with this quote – and whether you suspect it is missing an extra clause, “although sometimes religion can make evil people become good” – will to a large extent determine whether you think Dawkins is brilliant, or misguided, or a menace.

As will your answers to these questions:

1) Would the world be a better place if all religious faith disappeared overnight without a trace?
2) Is there a practical way to make the worst aspects of religion – that is, violent hatred and intolerance – disappear? If so, what is that way?
3) Should we allow children to be brought up in a religious faith from a young age, before they get the chance to decide on these matters for themselves?
4) How do the benefits of religious faith balance against the negative consequences?
5) Is it desirable, or does it serve any good purpose, to initiate a giant debate between Reason and Faith, with the ultimate aim of producing one overall winner?
6) Supposing religious faith had never existed, do you think the history of mankind would have been significantly less bloody and intolerant, or would men have found other reasons to butcher each other?


Dawkins’s answers would be:
1) Yes
2) Yes, by persuading people that all faith is fundamentally misguided
3) No
4) They don’t even register
5) Yes
6) Yes, it would have been much less bloody.

For the record, my answers would be:
1) Yes
2) Hopefully, but persuading people that all faith is fundamentally misguided is unlikely to work and in fact could be self-defeating
3) I’d prefer not, but that’s complicated
4) They do register, but probably not enough to tip the scales
5) No
6) Maybe slightly less bloody, but they would certainly have found other reasons.

Sunday, January 15, 2006

How America Came to be THE Hyperpower, Why She Will Continue to Pull Away From the Rest of the World - and Why I'm Such a Loser

Hat tip to Photoncourier :

Red Herring, the venture capital magazine, has a piece on a company called Theranos, founded and being run by Elizabeth Holmes. The company has been developing a device which detects adverse drug reactions. It works by analyzing a tiny amount of blood from a person's finger or arm, then transmitting the data to a Theranos server, which uses biostatistics algorithms to profile the information.
In 2004, there were more than 400,000 adverse drug reactions reported to the FDA, and Ms Holmes wants this device to bring the numbers down. She also believes that the device could fundamentally change healthcare by determining if a particular drug is working on an individual basis.

She just turned 21--and this is her second company...


Now, I have no doubt whatsoever that I'm a "late bloomer", and I fully expect to do some slightly great(er) things during my life, in the fullness of time - assuming that I have a fullness of time remaining, of course.

Ms Holmes demonstrates why I'm second-team (if I may be permitted to flatter myself).

In the first place, this is a really smart idea. It could save lives directly and, by essentially converting every wired patient into an ongoing drug-effectiveness test, it could lead to better drugs for all, fewer side effects from unnecessary overmedicating, and less time and money wasted on ineffective drugs.

So, wonderful. (Potentially).
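Before moving on: to make that "ongoing drug-effectiveness test" notion concrete, here is a purely hypothetical sketch in Python of the kind of check such a device's server might run. The article says nothing about Theranos's actual biostatistics algorithms, so every function name, number, and threshold below is invented for illustration only.

    # Purely illustrative: flag a possible adverse drug reaction when a
    # patient's biomarker reading drifts far from his own pre-drug baseline.
    # NOT Theranos's algorithm; all names and thresholds are invented.
    from statistics import mean, stdev

    def flag_adverse_reaction(baseline, latest, threshold=3.0):
        # True if the latest reading lies more than `threshold` standard
        # deviations from the mean of the patient's baseline readings.
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            return latest != mu
        return abs(latest - mu) / sigma > threshold

    # Example: five readings of some marker before starting a drug, one after.
    pre_drug = [31, 29, 33, 30, 32]   # units invented
    post_drug = 58
    print(flag_adverse_reaction(pre_drug, post_drug))  # True -> alert the doctor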

Secondly, and this deals with two-thirds of the headline, in how many nations of the world could A TEENAGE GIRL get a serious audience, and then MILLIONS OF DOLLARS in VC funding, to develop her idea?!?

There are many unpleasant consequences to American society being perpetually adolescent, a bit shallow and thrill-seeking, with an attention deficit and a naive optimism born of ignorance about the odds, but this type of thing is one of the UPSIDES of being that way.

In America, if you can do, the odds are pretty good that you'll be allowed to do, regardless of your shortcomings and quirks. We're flexible and goal-driven, not so much wedded to process.

Hey, this is the society that brought a castles-in-the-sky SciFi dream, a ballistic missile defense system, from concept to deployed and defending in only 23 years.
When "Star Wars", the Strategic Defense Initiative, was first proposed during the Reagan admin, it was flat out IMPOSSIBLE.
Not difficult, not challenging, it COULD NOT BE DONE.

Well, it's still impossible, in its original form, but the limited SDI system that's been deployed is a technological wonder, one that makes North America somewhat safer, and it clearly points the way towards the eventual success of the original "space shield" vision.

Who else but America would have the chutzpah (and the spare cash) to actually attempt to build such a Flash Gordon system?

Well, along those lines, I've got to hand it to the PRC: their space programme is visionary and bold, but until and unless they actually do more than put a few people into orbit, they're still only potential worldbeaters.

Saturday, January 14, 2006

Very Odd, Especially to Hypersexualized American Eyes...

Japan's Virgin Wives Turn to Sex Volunteers
Lustless matches put country on brink of demographic disaster.

By Guardian Newspapers, 4/4/2005

Like many Japanese women, Junko waited until her early 30s to get married. When she and her fiance, an employee of a well-known firm, decided to tie the knot, she set her sights on making a home, putting away some money and starting a family.
Fifteen years later, Junko and her husband are childless. It is not that they cannot have children; it is just that they have never had sex.

The sexless marriage is one of several reasons why experts fear Japan is on the verge of a demographic disaster.
In 2003 Japan's birthrate hit a record low of 1.29 - the average number of times a woman gives birth during her lifetime - one of the lowest rates in the world, according to the cabinet office. The population will peak [...] at about 128 million, then decline to just over 100 million by 2050. [...]

The number of married couples is in rapid decline. In 2000 almost 70% of men and 54% of women between 25 and 29 were unmarried. That bodes ill for the birthrate, [Ya think?! - Oro] as conservative Japanese society frowns upon having children outside marriage.

A survey of 600 women found that 26% had not had sex with their husbands in the past year.
"We are sort of room-mates rather than a married couple," one 31-year-old man, who had not had sex with his wife for two years, told the Asahi Shimbun newspaper.

The government has introduced several measures to lift the birthrate. Fathers will be encouraged to take more than the 47% of annual paid leave they currently use, and their employers will be told to provide more opportunities for them to stay at home with their children.

Local authorities, meanwhile, are devising novel ways to increase fertility. In the town of Yamatsuri women will receive 1m yen [$ 8,500; £ 5,000] if they have a third child, and in Ishikawa prefecture families with three children will get discounts at shops and restaurants. [...]

The divorce rate has nearly doubled in the past 10 years, with more women blaming their sexually inactive, as opposed to sexually errant, husbands for break-ups.


What's a girl to do?
If you click through the link, the full article describes how someone has set up a network of amateur sex surrogates, who, out of the goodness of their hearts, are willing to go out on a dinner date with these hard-up women, and then on to a hotel.

"The population will decline to just over 100 million by 2050." Maybe. Just a guess, really, but it's a good-enough guideline to use when making long-term decisions, as long as it's kept in mind that both the population figure and date are CERTAIN to be wrong.

Italian society frowns on out-of-wedlock births, as well, but in Italy that's led to more abortions, not less sex.
However, the Italian fertility rate is virtually identical to that of Japan, so it's apparent that both societies are broken in some way.

If that 600-woman survey is anywhere near correct...
I have no doubt that a similar survey of American women would find that 26% had sex with their husbands only once or twice a month (although I'm too lazy to actually look it up, since that survey is sure to exist), but something is clearly wrong when one out of four marriages is platonic.

Also, if 70% of the men but only 54% of the women aged 25 to 29 are unmarried, it suggests that Japanese women often marry much older men.
While that was exactly the marriage advice proffered in Little Women, there the women in question were a decade younger, and so were the "older men".

Wednesday, January 11, 2006

Richard’s Awfully Big Religious Adventure (part 1)

The inestimable Richard Dawkins – without whom, let’s face it, this world would be a duller place – has a new two-part documentary on the UK’s Channel 4, entitled ‘The Root of All Evil?’.

Predictably, it’s a merciless tirade against religion. I watched a tape of the first episode last night. Here’s my review:

First, the positive news: one thing is for sure, Dawkins is certainly not afraid to enter the lions’ den and start gleefully waving juicy chunks of meat.

He visits Ted Haggard, President of the National Association of Evangelicals, at his gargantuan (12,000 people!), multi-million-dollar, state-of-the-art New Life Church in Colorado Springs, and his opening conversational gambit is to tell the pastor that the flashy hi-tech service he has just witnessed reminded him of the Nuremberg rallies.

Haggard – who is weirdly charismatic and looks like he’s been modelled in plastic based on a Thunderbirds character - laughs this one off, since he’s still in ‘Welcome to America, my quaint little eccentric English friend!’ mode. Haggard also magnanimously declares that his evangelicals approve of the scientific method. But he stops laughing pretty sharpish when Dawkins informs him that the scientific method clearly points to the Earth being about 4.5 billion years old.

Haggard then suddenly and without provocation launches into a good scoff about the notion of the human eye developing entirely by accident. Dawkins informs him that no biologist he has ever met thinks the human eye developed entirely by accident. A few minutes later, our hero is being firmly escorted from the grounds of the New Life Church with Haggard yelling at him for ‘calling his children animals’ (which puzzles Dawkins as of course he did no such thing, but it turns out to be a reference to evolution).

Having thus warmed up, the intrepid Dawkins toddles along to the Holy Land, to see if there might be anybody there whose wrath he can incur. Oh dear, thinks the viewer.

Now we all know that Jerusalem is the world’s largest outdoor lunatic asylum (with typical bluntness, Dawkins introduces it as ‘the least enlightened place on the planet’) and it doesn’t take long for him to locate and duly upset a couple of bona fide lunatics.

The most frightening of these is a young American-born Jew who migrated to the city and somehow ended up converting to Islam. Dawkins – in his own admitted naivety – arranged an interview thinking that this fellow might be able to offer ‘both sides of the story’. But just two seconds after Dawkins professes to be an atheist, the converted Muslim launches into a vicious rant in a broad American accent, praising Osama bin Laden and damning Dawkins for ‘allowing his women to dress as whores and fornicate in the street.’ Our Richard initially argues back with equal vehemence that essentially, they ain’t his whores and they dress themselves, but after a while even he has to concede that arguing with this guy is like trying to nail jelly to a ceiling, and hastily departs.

All very entertaining, but hardly earth-shattering. Dawkins wants to conclude that all religious belief is A Bad Thing because faith breeds ignorance, which breeds intolerance, which leads ultimately to suicide bombers. But his method in this episode - showing that fundamentalist lunatics are lunatics - is no more useful than Michael Moore’s trick of picking mad or senile interview targets and mocking them.

He has yet to interview, for example, a tolerant, mild-mannered, churchgoing scientist. Or any of the old ladies in my local parish. Perhaps he’ll explain how these fit into the slippery-slope to suicide-bombing in the next episode. I’ll let you know...

Monday, January 09, 2006

Won’t somebody please think of the children!

The (London) Times today brings up an old chestnut:

VIOLENT computer games trigger a mechanism in the brain that makes people more likely to behave aggressively, research suggests.

A study of the effects of popular games such as Doom, Mortal Kombat and Grand Theft Auto, which involve brutal killings, high-powered weaponry and street crime, indicates that avid users become desensitised to shocking acts of aggression. Psychologists found that this brain alteration, in turn, appeared to prime regular users of such games to act more violently.

Many studies have concluded that people who play violent games are more aggressive, more likely to commit violent crimes, and less likely to help others. But critics argue that these correlations prove only that violent people gravitate towards violent games, not that games can change behaviour.


The debate about whether violent computer games lead to actual violence reached a peak after the Columbine massacre, but seems to have largely died away. Of course, the argument pre-dates computer games: remember the fuss over Kubrick’s A Clockwork Orange? That film now seems pretty tame by today’s standards. It’s clear enough that most people can play games and watch movies without feeling the urge to act them out.

However, there is one area of popular culture that I have to admit disturbs me. LA ‘Gangsta’ culture is ruthlessly marketed and is extremely popular among youths across the world, and Britain is certainly no exception. My sister teaches in a multi-racial comprehensive in London, and she reliably informs me that the rapper 50 Cent is worshipped by boys of all races, but especially blacks and Asians.

And now it seems these kids are playing a video game in which you live out 50 Cent’s (real or imagined?) gangsta lifestyle, including such tasteful activities as shooting rival gang members and looting bodies to buy 50 Cent records and videos.

Maybe it’s all as harmless as the (I suppose, extremely bloody) games of outlaws and pirates that we used to play as children. But those make-believe outlaws and pirates belonged to an obviously imaginary world distinct from the rather mundane real one. This game, called ’50 Cent: Bulletproof’, takes a real pop star and glamorises an apparently highly lucrative and exciting lifestyle of guns and crime.

I've found that it is very easy to underestimate the common sense of youths. They're usually much more sophisticated and morally aware than you think, and plenty of them turn out to see such old-fashioned family fun as Wallace and Gromit, Narnia or Harry Potter.

Nonetheless, I can't help thinking that the existence of '50 Cent: Bulletproof' means we have to chalk one up for the hell-in-a-handcart brigade.

Sunday, January 08, 2006

CVD spells The End of Western Civilization

CONTENT WARNING: Young children should leave the room. Those of you prone to the vapors should not watch the screen while reading.

Life is too often like dealing with a magician. You find yourself distracted by the hand-waving, when the real action is coming from an entirely different direction.

It is very easy to enumerate the threats to Western Civilization: desiccating consumerism, moral relativism, declining fertility. And, particularly since 9-11, the ugly, unshaven head of califascistic Islamism.

That's all hand-waving. The real threat to Western Civ is virtually on our doorstep. It will cause Team Estrogen to ask, collectively, "What's a Girl to do?"

When they fail to come up with an answer satisfactory to them, we will be so sunk.

CVD. Three letters all of us should view with eyes of fear and looks of hate.

CVD is an acronym for Chemical Vapor Deposition, the term given to a process that will bring diamond manufacturing to a basement near you.

In the back room of an unmarked brown building in a run-down strip mall, eight machines, each the size of a bass drum, are making diamonds.

That's right — making diamonds. Real ones, all but indistinguishable from the stones formed by a billion or so years' worth of intense pressure, later to be sold at Tiffany's.

Sold is the problem. These diamonds, virtually indistinguishable from the "real" thing, will initially hit the market for roughly 30% less than mined diamonds.

In most cases, cheap is good. But not with diamonds. Cheap is a bad, a very bad, thing.

Value and cost do not necessarily coincide. Air is very valuable, but is essentially free. Electricity is nearly as valuable, and almost as free. Pornography, in contrast, pretty much plumbs the Stygian depths of the value basement, yet costs a great deal more than air or electricity.

Diamonds are an oddity. They have little intrinsic value, and even less value from actual scarcity. Without De Beers aggressively maintaining a diamond cartel that engineers artificial scarcity, their cost would plummet.

And so would their value to Team Estrogen.

There is only one reason De Beers has been able to maintain its cartel. We, men, are willing participants in our impoverishment. Women need to have a portable relationship signifier (PRS)* whose value derives solely and directly from cost. In order to fulfill its function, therefore, the PRS must be both expensive, and without utility.

Diamonds fulfilled the PRS role perfectly. They are the image of durability, and even small gradations in size signify the economic status of the attendant relationship. When women say "size does not matter," that assertion is trustworthy only with respect to "of what?"

Through a brilliant marketing campaign starting in the late '30s, De Beers has provided the answer to "of what?"

[Their] slogan "A Diamond Is Forever" was born, launching one of the most brilliant, sophisticated and enduring marketing campaigns of all time. Without ever mentioning the name De Beers, the campaign set out to seduce every man, woman and child in America with the notion that no romance is complete without a rock -- and the bigger the rock, the better the romance. That men also now had a way to show the world how much money they made was an added bonus.

CVD diamonds will destroy this edifice of Western Civilization by introducing the Free Rider problem that bedevils honest libertarians.

For as these diamonds reach market, men will be able to faultlessly mimic a PRS of larger significance, both emotional and material, for a fraction of the cost, hence the value, of the "real" thing.

(Speaking of signifiers, the scare quotes here signify that the distinction between "real" mined diamonds and "fake" CVD diamonds is one without a difference.)

So long as there are few freeloaders, diamonds will still be able to fulfill their PRS function.

But because the mimicry is so effective, and the cost differential so large (even ignoring that CVD diamonds will eventually become so cheap that they will be given away free with laptop computers), De Beers will be faced with two choices: hope to defend market share by opening its vaults, thereby destroying "real" diamonds' engineered-scarcity cost, or follow the Dodo to ignominious extinction.

A Hobson's choice for De Beers, to be sure; a plight over which not even a crocodile could shed a tear.

The impact on Team Estrogen is an entirely different matter, though. CVD will devalue existing and future PRSs just as thoroughly.
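To put toy numbers on that devaluation: below is a minimal sketch in Python, entirely my own illustration rather than anything from the linked article, with every figure invented except the rough 30% discount quoted above. It shows how the value an observer can infer from an indistinguishable stone decays as cheap synthetics take market share.

    # Toy model, not real market data. The "signal value" of a stone is the
    # price an observer can rationally infer from seeing it; once mined and
    # CVD stones cannot be told apart, that inference must average over both.

    MINED_PRICE = 10_000         # hypothetical price of a mined stone
    SYNTHETIC_DISCOUNT = 0.30    # CVD stones start roughly 30% cheaper

    def inferred_value(synthetic_share, synthetic_price):
        """Expected price an observer infers from an indistinguishable stone."""
        return (1.0 - synthetic_share) * MINED_PRICE + synthetic_share * synthetic_price

    price = MINED_PRICE * (1 - SYNTHETIC_DISCOUNT)
    for share in (0.0, 0.1, 0.25, 0.5, 0.9):
        print(f"synthetic share {share:4.0%}: inferred value ${inferred_value(share, price):,.0f}")
        price *= 0.5             # assume CVD prices keep halving as capacity grows

The toy model's only point is the free-rider one: because nobody can tell the stones apart, the PRS signal degrades for honest buyers of mined stones just as fast as it does for the mimics.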

I'm sure most readers here have heard the saying that when the US economy catches a cold, the rest of the world gets pneumonia.

Well, when Team Estrogen gets grumpy over the loss of their PRS, men will wistfully reminisce about pneumonia's pleasures.


* The author could probably earn a lucrative second income generating TLAs** for the military.

** What does TLA stand for? Why, Three Letter Acronym, of course.

Saturday, January 07, 2006

Don't cry for me, Europa!

Cry the Beloved Continent. Victor Davis Hanson pleads with a senescent Europe to re-awaken to its once and future glory:

Despite the bitter recrimination and growing rift between you and us, most Americans have not forgotten that a strong, confident Europe is still critical to the material and spiritual well being of the United States.

It is not just that as Westerners you have withstood — often later at our side — all prior challenges to the shared liberal civilization you created, whether the specter of an Ottoman global suzerainty, Bonapartism, Prussian militarism, Nazism, fascism, Japanese militarism, or Soviet Communism.

Nor is our allegiance a mere matter of history. Europe is the repository of the Western tradition, most manifestly in shrines like the Acropolis, the Pantheon, the Uffizi, or the Vatican. We concede that the Great Books — we as yet have not produced a Homer, Virgil, Dante, Shakespeare, or Locke, much less a Da Vinci, Mozart, or Newton — and the Great Ideas of the West from democracy to capitalism to human rights originated on your continent alone. And if Americans believe our Constitution and the visions of our Founding Fathers were historic improvements on Europe of the 18th-century, then at least we acknowledge in our humility that they were also inconceivable without it.

No, there is a greater oneness between us, an unspoken familiarity even now in the age of global sameness, that makes an American feel at home in Amsterdam, Paris, Rome, or Athens in a way that is not true of Istanbul, Cairo, or Bangkok.

In the multiracial society of the United States, an American black, Asian, or Latino finds natural affinity in London and Brussels in a way not true in Lagos, Ho Chi Min City, or Lima. For millions of Americans "Eurocentric" is no slur — for it is an appellation of shared values and ideas not of race.

Even in this debased era of multiculturalism that misleads our youth into thinking no culture can be worse than the West, we all know in our hearts the truth that we live by and the lie that we profess — that the critic of the West would rather have his heart repaired in Berlin than in Guatemala or be a Muslim in Paris rather than a Christian in Riyadh, or a woman or homosexual in Amsterdam than in Iran, or run a newspaper in Stockholm rather than in Havana, or drink the water in Luxembourg rather than in Uganda, or object to his government in Italy rather than in China or North Korea. Radical Muslims damn Europe and praise Allah — but whenever possible from Europe rather than inside Libya, Syria, or Iran.


Hanson's critique of Europe's plight is earnest and heartfelt, and displays none of the schadenfreude evident in those conservative critics who see Europe's demise as a just punishment for its atrophied religious faith. Hanson is peerless in his ability to articulate the qualities which define the "West", especially in contrast to those critics who can see nothing of value in the Western heritage that does not directly derive from Christianity.

Alas, recently, Europeans have been taken hostage on the West Bank, Yemen, and Iraq. All have been released. There are two constants in the stories: Some sort of blackmail was no doubt involved (either cash payments or the release of terrorist killers in European jails?), and the captives often seem to praise the moderation of their captors. Is this an aberration or indicative of a deeper continental malady? Few, in either a private or public fashion, suggested that such bribery only perpetuates the kidnapping of innocents and provides cash infusions to terrorists to further their mayhem.

On the home front, a single, though bloody, attack in Madrid changed an entire Spanish election, and prompted the withdrawal of troops from Iraq — although the terrorists nevertheless continued, despite their promises to the contrary, to plant bombs and plan assassinations of Spanish judicial officials. Cry the beloved continent.

The entire legal system of the Netherlands is under review due to the gruesome murder of Theo van Gogh and politicians there who speak out about the fascistic tendencies of radical Islam often either face threats or go into hiding. Cry the beloved continent.

Unemployment, postcolonial prejudice, and de facto apartheid may have led to the fiery rioting in the French suburbs, but it was also energized by a radical Islamic culture of hate. In response followed de facto French martial law. All that remains certain is that the rioting will return either to grow or to warp liberal French society. Indeed, so far has global culture devolved in caving to Islamism that we fear that only two places in the world are now safe for a Jew to live in safety — and Europe, the graveyard of 20th-century Jewry, is tragically not among them. Cry the beloved continent.


One can only hope that this litany of shame would arouse a spark of righteous anger in the European breast. Yet I think that Europe's long history of invasions by barbarians, Saracens, Nazis and Soviets has bred within the European psyche a sense for realpolitik that we cannot fathom, one that can weigh options of resistance or appeasement with a cold calculus of cost and benefit alien to our historic idealism.

We wish you well in your faith that war has become obsolete and that outlaw nations will comply with international jurisprudence that was born and is nurtured in Europe. Yet your own intelligence suggests that the Iran theocracy is both acquiring nuclear weaponry and seeking to craft missile technology to put an Islamic bomb within reach of European cities — oblivious to the reasoned appeals of European Union diplomats, who themselves operate as Greek philosophers in the agora only on the condition that Americans will once more play the role of Roman legionaries in the shadows.

Russia may no longer be the mass-murdering Soviet Union, but it remains a proud nationalist and increasingly autocratic power of the 19th-century stripe, nuclear and angry at the loss of its empire, emboldened by the ease that it can starve energy supplies to Western Europe, and tired of humanitarian lectures from Westerners who have no real military to match their condescending sermons. Old Europe has neither the will nor the power to protect the ascending democracies of Eastern Europe, much less the republics of the former Soviet Union from present Russian bullying — and perhaps worse to come.

The European strategy of selling weapons to Arab autocracies, triangulating against the United States for oil and influence, and providing cash to dubious terrorists like Hamas has backfired. Polls in the West Bank suggest Palestinians hate you, the generous and accommodating, as much as they do us, the staunch ally of Israel.

So, terrorists of the Middle East seem to have even less respect for you than for the United States, given they harbor a certain contempt for your weakness as relish to the generic hatred of our shared Western traditions.

You will, of course, answer that in your postwar wisdom you have transcended the internecine killing of the earlier 20th century when nationalism and militarism ruined your continent — and that you have lent your insight to the world at large that should follow your therapeutic creed rather than the tragic vision of the United States.

But the choices are not so starkly bipolar between either chauvinistic saber rattling or studied pacifism. There is a third way, the promise of muscular democratic government that does not apologize for 2,500 years of civilization and is willing to defend it from the enemies of liberalism, who would undo all that we wrought.

A European Union that facilitates trade, finance, and commerce can enrich and ennoble your continent, but it need not suppress the unique language, character, and customs of European nationhood itself, much less abdicate a heritage that once not merely moralized about, but took action to end, evil.

The world is becoming a more dangerous place, despite your new protocols of childlessness, pacifism, socialism, and hedonism. Islamic radicalism, an ascendant Communist China, a growing new collectivism in Latin America, perhaps a neo-czarist Russia as well, in addition to the famine and savagery in Africa, all that and more threaten the promise of the West.

So criticize us for our sins; lend us your advice; impart to America the wealth of your greater experience — but as a partner and an equal in a war, not as an inferior or envious neutral on the sidelines. History is unforgiving. None of us receives exemption simply by reason of the fumes of past glory.

Either your economy will reform, your populace multiply, and your citizenry defend itself, or not. And if not, then Europe as we have known it will pass away — to the great joy of the Islamists but to the terrible sorrow of America.


My sense is that Europe will eventually come around, but it will require a level of provocation greater than any it has experienced to date. When a generation emerges in Europe that realizes its parents' generation has left it a bankrupt patrimony and rendered it morally defenseless against the barbarians, then a psychic shift will occur. Judging by past European history, the cure may be worse than the disease.

Thursday, January 05, 2006

Move over Nostradamus

James Lileks takes on 2006 with prognostications that are sure to provoke laughter. If only all the news were this funny:

The spy stories continued to add up, as it became increasingly obvious that the Administration was boosting employment statistics by hiring hundreds of thousands of people to read every text-message sent by cell phones. “It’s dull, useless, meaningless work,” said one official, “but as long as it detracts from the search for terror suspects and needlessly intrudes on the right of 14 year old to send inscrutable, abbreviated rants about their parents without fear of detection by indifferent authorities looking for terror warnings, we’re all for it.”

The Supreme Court banned no-knock searches in Mosul; Congress passed legislation requiring US Spec-Ops troops to give up night-vision gear, and wear squeaky shoes, and speak in stage whispers.

The New York Times, fresh from reporting the self-destruct codes for the American spy satellites that had inadvertently listened into fifteen pay-per-view porn orders from cable subscribers in Omaha, revealed that US subs have been violating Chinese territorial waters to monitor military communications. The Times named the boat, the captain, his home address, and posted his credit report online. The boat was never heard from again, and was presumed sunk. Outrage was swift – but only when the Justice Department demanded the names of the people who’d leaked the secret information. “Not content with destroying the Fourth Amendment, this administration seems intent on demolishing the First,” said one legal expert who appeared on CNN but declined to give his name, fearing reprisals. (His name was later leaked to the Times, which printed it, but declined to name its sources.)


Read the whole thing.

Wednesday, January 04, 2006

Momma, don't let your gay-boys grow up to be cowboys...

I have had no desire, up until now, to comment on that latest milestone in the evolution of American manhood known as "Brokeback Mountain", the Ang Lee film about two cowboys who discover their gayness on the wide open grasslands of Wyoming. Other than to say, as a non-gay male, that the chances of my seeing this movie, regardless of the cinematic, directorial or dramatic excellence of its achievements, are less than Larry David's chances of being named People Magazine's "Sexiest Man Alive". But this, of course, is a given, and I really have no desire to comment on a film that I have not seen, nor would see. Let 1000 flowers bloom, I say, and move on.

Yet this commentary about the film, posted by Ross (last name unknown) on Andrew Sullivan's Daily Dish, has engaged my contrarian response reflex.

To a certain extent, the drama of the movie necessitates this kind of contrast, but it's significant, I think, that the film doesn't offer any model of successful heterosexual masculinity, or of successful heterosexual relationships in general. The straight men are all either strutting oafs, bitter bigots like Jack Twist's father, or "nice-guy" weaklings like Alma's second husband, whose well-meaning effeminacy contrasts sharply with Ennis's rugged manliness. Jack and Ennis are the only "real men" in the story, and their love is associated with the high country and the vision of paradise it offers - a world of natural beauty and perfect freedom, of wrestling matches and campfires and naked plunges into crystal rivers - and a world with no girls allowed. Civilization is women and babies and debts and fathers-in-law and bosses; freedom is the natural world, and the erotic company of men. It's an old idea of the pre-Christian world come round again - not that gay men are real men too; but that real men are gay.


Perfect freedom, you say? No babies? No in-laws? Hmmmmm! When you put it that way, it does sound appealing. But is it a hallmark of manhood?

This is not an old idea from the pre-Christian world, but a not-so-old idea from the Modern world. The ancient Greeks, for whatever onanistic horseplay they engaged in with their camp-mates, also carried the responsibilities for their wives', children's and city's defense and welfare upon themselves. This modern Peter Pan of the prairie does not. Men have always been sexual rogues, and always will be. Yet civilized society, to its credit, has always put a higher price on that distinction it chose to name manhood, and that price was sacrifice to the larger community. Such sacrifice was, on occasion, heroic and glorious, but in the main it was onerous, dreary, and debilitating.

Today, of course, it is less onerous and debilitating. Yet men have learned to shy away from familial responsibilities all the more. I have nieces in their twenties and early thirties who despair of ever finding a man willing to commit to even a long-term relationship, let alone crying babies and debts and in-laws. I find it very difficult to see in this trend a movement toward a more real manhood.

Monday, January 02, 2006

Is Lit-Crit Legit?

If there is an academic discipline that can elicit the MEGO ("my eyes glaze over") response more than Literary Criticism, and that makes the average taxpayer angrier about the use of his hard-earned dollars at the hands of tenured university drones, I've yet to hear of it. Nick Gillespie reports on the state of "Lit-Crit" in the third of his series of reports on the annual meeting of the Modern Language Association:

One of the subtexts of this year's Modern Language Association conference -- and, truth be told, of most contemporary discussions of literary and cultural studies -- is the sense that lit-crit is in a prolonged lull. There's no question that a huge amount of interesting work is being done -- scholars of 17th-century British and Colonial American literature, for instance, are bringing to light all sorts of manuscripts and movements that are quietly revising our understanding of liberal political theory and gender roles -- and that certain fields -- postcolonial studies, say, and composition and rhetoric -- are hotter than others. But it's been years -- decades even -- since a major new way of thinking about literature has really taken the academic world by storm.

If lit-crit is always something of a roller-coaster ride, the car has been stuck at the top of the first big hill for a while now, waiting for some type of rollicking approach to kick in and get the blood pumping again. What's the next big thing going to be? The next first-order critical paradigm that -- like New Criticism in the 1940s and '50s; cultural studies in the '60s; French post-structural theory in the '70s, and New Historicism in the '80s -- really rocks faculty lounges? (Go here for summaries of these and other movements).



Amateur Analysis Alert: I post this alert to warn our more sensitive readers of the soon-to-follow amateur analysis by yours truly, who has no professional training in Literary Criticism or in Literary Criticism Criticism. Yet those of you who have been with the Daily Duck from its inception will know from its inaugural post that the DD is founded on the premise that "the interested amateur has an important role to play in making the professional 'experts' who operate the levers of power in our society accountable to the rules of common sense."

Now, down to business. By following the link provided by Gillespie, you will learn that Literary Criticism, counter to common sense, is often very weakly associated with the criticism of literature. By perusing the list of various schools of Lit-Crit through the last two centuries, from Pragmatism to Formalism to Marxism, Feminism, Post-Colonialism, Psychoanalysis, Queer Theory and the ever-popular Deconstructionism, one quickly gets the impression that literature is merely a convenient jumping-off point for a host of radical philosophies and ideologies that cut across politics, "science", social engineering, religion and the meaning of existence itself.

I ask the reader to indulge my naiveté for one moment. As I see it, Literary Criticism should be about the study of Literature, and should help the student answer questions like: What are the hallmarks of good literature? What literary works of the past and present exhibit these hallmarks? What can be gained by reading these works? Granted, it seems like a very limited, focused set of goals. Not the kind of stuff by which societies are transformed or hegemonies overthrown.

Yet what to make of these dizzying varieties of "disciplines" that have evolved and mutated from this one, seemingly simple undertaking? I use quotation marks around the word disciplines precisely because it seems to me that the field lacks the discipline to constrain itself within a focused set of goals. There are psychological, philosophical and historical aspects to literature, but literature is but a small aspect of how these phenomena are expressed, and not the aspect through which authoritative theories of these disciplines are to be discovered. Yet it would seem, judging by the ambitious, overarching goals set by the lit-crit practitioners, that Literature is the oracle through which all things can be known.

It was with this question in mind that I attended yesterday's panel on "Cognition, Emotion, and Sexuality," which was arranged by the discussion group on Cognitive Approaches to Literature and moderated by Nancy Easterlin of the University of New Orleans. Scholars working in this area use developments in cognitive psychology, neurophysiology, evolutionary psychology, and related fields to figure out not only how we process literature but, to borrow the title of a forthcoming book in the field, Why We Read Fiction.

Although there are important differences, cognitive approaches often overlap with evolutionary approaches, or what The New York Times earlier this year dubbed "The Literary Darwinists"; those latter critics, to quote the Times:


“...read books in search of innate patterns of human behavior: child bearing and rearing, efforts to acquire resources (money, property, influence) and competition and cooperation within families and communities. They say that it's impossible to fully appreciate and understand a literary text unless you keep in mind that humans behave in certain universal ways and do so because those behaviors are hard-wired into us. For them, the most effective and truest works of literature are those that reference or exemplify these basic facts.“


Both cognitive and evolutionary approaches to lit-crit have been gaining recognition and adherents over the past decade or so. Cognitive critics are less interested in recurring plots or specific themes in literature, but they share with the Darwinists an interest in using scientific advances to help explore the universally observed human tendency toward creative expression, or what the fascinating anthropologist Ellen Dissanayake called in Homo Aestheticus: Where Art Comes From and Why, “making special.”



Of all the questions that we would have cognitive psychology answer for us, I can't think of many with less import than why we read fiction. The bolded quote above is indicative of the mindset of lit-crit: that somehow literature is useless to us without the work of the lit-critician to tell us why we should appreciate it. It is even worth asking whether literature is important at all. Art and literature are things that we participate in because we want to. There is this sense among the learned that the non-artistic or non-literary life is not worth living, yet human history has progressed largely by the efforts of people with barely a literary bone in their bodies.

The first presenter was Alan Palmer, an independent scholar based in London and the author of the award-winning Fictional Minds. For Palmer, how we process fiction is effectively hardwired, though not without cultural emphases that depend on social and historical context; it also functions as a place where we can understand more clearly how we process the "real" world. After summarizing recent cognitive work that suggests "our ways of knowing the world are bound up in how we feel the world...that cognition and emotion are inseparable," he noted that the basic way we read stories is by attributing intentions, motives, and emotions to characters. "Narrative," he argued, "is in essence the description of fictional mental networks," in which characters impute and test meanings about the world.


Is there anything in the bolded quotes that isn't glaringly obvious to the rankest amateur reader? Do we need academicians to confirm those ideas which common sense would already tell us are true? What notion of any import was added to the store of human knowledge by his analysis?

He led the session through a close reading of a passage from Thomas Pynchon's The Crying of Lot 49. The section in question was filled with discrepant emotions popping up even in the same short phrases. For instance, the female protagonist Oedipa Maas at one point hears in the voice of her husband "something between annoyance and agony." Palmer -- whose argument was incredibly complex and is hard to reproduce -- mapped out the ways in which both the character and the reader made sense of those distinct emotional states of mind. The result was a reading that, beyond digging deep into Pynchon, also helped make explicit the "folk psychology" Palmer says readers bring to texts -- and how we settle on meanings in the wake of unfamiliar emotional juxtapositions. As the panel's respondent, University of Connecticut's Elizabeth Hart, helpfully summarized, Palmers' reading greatly "complexified the passage" and was "richly descriptive" of the dynamics at play.


Complexification! That is the value add! To complexify the obvious would be an apt description of the lit-crit practitioner's art. Note my bolding of that universal weasel word, "richly". It is a word that puts a positive spin on clutter, excess, meaningless ornamentation. No professional complexifier's arsenal is complete without this word.

Will cognitive approaches become the next big thing in lit-crit? Or bio-criticism of the Darwinian brand? That probably won't happen, even as these approaches will, I think, continue to gain in reputation and standing. More to the point, as I argued in a 1998 article, these scholars who are linking Darwin and Dickens have helped challenge an intellectual orthodoxy that, however exciting it once was, seems pretty well played out. In his tour de force Mimesis and the Human Animal: On the Biogenetic Foundations of Literary Representation (1996), Temple's Robert Storey -- one of Nancy Easterlin's doctoral advisors -- warns:

“If [literary theory] continues on its present course, its reputation as a laughingstock among the scientific disciplines will come to be all but irreversible. Given the current state of scientific knowledge, it is still possible for literary theory to recover both seriousness and integrity and to be restored to legitimacy in the world at large.”

Ten years out, Storey's warning seems less pressing. The lure of the most arch forms of anti-scientific postmodernism has subsided, partly because of their own excesses and partly because of challenges such as Storey's. As important, the work being done by the cognitive scholars and others suggest that literature and science can both gain from ongoing collaboration.


How or when did lit-crit ever gain standing as a scientific discipline? Why should it? If one wishes to study cognitive psychology, then study cognitive psychology.

The only way that lit-crit can avoid perpetuating its laughingstock status would be for it to give up all its pretensions to scientific status, and to trim its ambitions back within the focused goals that I enumerate above. Yet Gillespie's enthusiasm for exciting developments to come is indicative of an attitude driven by institutional identity over rational analysis. We have this field, with established foundations, university chairs and departments, and a cadre of practitioners whose reputations, ambitions and future career paths are entirely dependent on the continuance of the fiction that this discipline holds an import and a promise well beyond itself.

New Year's Resolution

I have never been good at keeping resolutions, so I'll try to keep this list short. Mainly it will consist of the books that I want to read. It seems that my resolution failures can be grouped in two categories: fitness/weight loss and leisure time activities. Work and family have always taken care of themselves, for better or worse, but those things that I wanted to do with my leisure have always slipped through the cracks.

So here are my top ten must read books for 2006 (some of which I already own):

1. What Evolution Is by Ernst Mayr.

2. Cryptonomicon by Neal Stephenson (finish reading).

3. Imperial Grunts : The American Military on the Ground by Robert Kaplan.

4. How the Mind Works by Steven Pinker.

5. The Blank Slate : The Denial of Human Nature and Modern Intellectual Life also by Steven Pinker.

6. The Civil War by Shelby Foote - part 1: from Ft. Sumter to Perryville.

7. Religion Explained by Pascal Boyer (finish reading).

8. The Reformation: A History by Diarmaid MacCulloch (finish reading).

9. Against the Gods: The Remarkable Story of Risk by Peter L. Bernstein.

10. The Roots of American Order by Russell Kirk.

As you can see, my list is highly skewed toward non-fiction. If I were to add any other fiction title, it would probably be something by Tom Wolfe; otherwise I see most modern fiction as a wasteland.

Comments? Suggestions?