Monday, May 25, 2009

A Return to the Constitution

By Larry P. Arnn
Imprimis, Hillsdale College

Larry P. Arnn is the twelfth president of Hillsdale College. He received his B.A. from Arkansas State University and his M.A. and Ph.D. in government from the Claremont Graduate School. He also studied at the London School of Economics and at Worcester College, Oxford University. From 1985 to 2000, he was president of the Claremont Institute for the Study of Statesmanship and Political Philosophy. He is on the boards of directors of the Heritage Foundation, the Henry Salvatori Center at Claremont McKenna College, Americans Against Discrimination and Preferences, the Center for Individual Rights, and the Claremont Institute. He is the author of Liberty and Learning: The Evolution of American Education (Hillsdale College Press, 2004).


It is a custom more honored in the breach than the observance.
Hamlet, Act I, Scene IV

Even in Hamlet, where ghosts help the action along, it is necessary to choose between the breach and the observance of a custom. The Bard can twist things around plenty in his plays, but the law of contradiction is stubborn. For all his art, Shakespeare cannot make his characters do a thing, and not do it, at the same time.

We live in a more liberated age, the age of bureaucratic government. Here rules abound in such profusion that they seem to overbear the laws of nature themselves. So it is with honoring the Constitution these days. We honor it more avidly than ever in the breach of its restraints, but at the same time we pay it the respect of mandatory, hectic, and empty observance. Except for our dishonoring of it, we have never honored it so much.

Take two examples, the first from Senator Robert C. Byrd of West Virginia. He is the longest serving senator, and by reputation a great historian of that body and of the nation. He is fond of the Constitution of the United States. He talks of it often, and he carries a copy with him, he says, at all times. He is the author of a law now three years old that requires Constitution Day celebrations at schools and colleges across the land, if they take the federal dollar, which with rare exceptions they do. Never mind that there is a constitutional question about that federal dollar. We make it the ground of a federal command to respect the Constitution nonetheless. The government’s breach is the authority for mandated observance.

In a fine quote, Senator Byrd calls upon us to make the Constitution an active part of our lives. He reminds us that we cannot defend and protect it if we are ignorant of its history and how it works. He recalls the “limits that the Constitution places on how political power is exercised,” which limits have ensured our freedom for more than two centuries. Then he votes for earmarks on a scale to shame a Vermont liberal (or, these days, a Nebraska Republican), for subsidies to investigate the fluke and the flounder in maritime centers in landlocked states (especially his own, where the centers are named for him), and for every federal gazebo and portico from West Virginia to Baja. Whatever the “limits that the Constitution places on how political power is exercised,” these days they leave the budget process ample latitude.

This is Senator Byrd, modern paragon of service to the Constitution.

The other example is from our most recent Constitution Day, September 17, 2007, the 220th anniversary of that greatest written instrument of government. I do not mean the official celebration, which was noble and good. The big Washington players of the day were not there, but Colin Powell read the Preamble aloud in the Capitol, and people around the world had a chance to read along by way of the Internet. Somehow Constitution Day manages to come off better than the other days that have grown up under the “national day of observance” laws that place Constitution Day on a par with National Maritime Day (May 22), America Recycles Day (November 15), and Pan American Day (April 14, which grants us a little gaiety before taxes come due).

Our second example is another ceremony held that same day in Washington. Over in another part of town, in one of those newer sorts of buildings that now obscure and offend the lovely architecture of L’Enfant, a gathering of a different sort took place: a ceremony of dedication to make a new hero out of an old name, Lyndon Baines Johnson. That name is now attached heroically to the building where the Department of Education resides. The family of President Johnson was present to receive honor for the mighty good he did for education.

The Chronicle of Higher Education did a good job covering this event. They put the main point first:

Washington — A decade ago, Republicans were vying to eliminate the Education Department, deriding it as a wasteful expansion of federal authority. Today, they led a ceremony outside its headquarters here not only to celebrate the department, but to name the building after a trademark big-government Democrat from Texas: Lyndon Baines Johnson.

The first sentence recalls Republican platforms as recent as the one of 1996, which declares:

Our formula is as simple as it is sweeping: the federal government has no constitutional authority to be involved in school curricula or to control jobs in the work place. That is why we will abolish the Department of Education, end federal meddling in our schools, and promote family choice at all levels of learning.

Reading this passage is like watching the first Die Hard movie, where Bruce Willis seems so young, and the movie for all its violence has a kind of innocence. That was before the real terror war. That was when we thought a man of action working under his own direction was the key to making great things happen. That was also when we thought that we could do something about the centralization of power that is the great tendency of the age. Having lost this innocence, now we are slightly embarrassed to read the naivete of the 1996 Republican platform, or to think how foolish Ronald Reagan might have been to try to get rid of the Department of Education.

At the ceremony, Secretary of Education Margaret Spellings seems unaware of these former embarrassments, although we will shortly recite a little evidence that she is not. At the ceremony, she is full of the glories of the Great Society:

Forty years ago, education was just beginning to transition from being a small office dedicated to gathering statistics. Today we’re 4,500 strong, armed with computers and Blackberrys, and we’re committed to a mission, ranging from financial aid to special education and to making sure that no child is left behind.

Note how the Secretary conflates the name of the office where she works with the thing that it regulates. The 4,500 people who work at the Department of Education are not teachers, at least not any more, and they do not directly accomplish education. But still, in some sense, they have become “education.” It is they, and not the hundreds of thousands of teachers who work in education, nor the millions who have worked in it during the past 40 years, who make the difference in education. It is they, believes the Secretary, who leave no child behind. But that, alas, is a much easier thing to say than it is to do.

Never mind also, at least on this day, the 40 years of political history that have intervened since the Great Society. Never mind the service of President Reagan, whose political achievements provide the ground upon which Secretary Spellings stands, even if she is inclined to jump off it. In fact neither she nor President Bush would likely be in office but for him. This is such an obvious fact, and President Bush himself has so often spoken of the achievements of Reagan and his wish to emulate him, that it is hard to believe that Reagan now seems forgotten.

There is evidence that he is in fact not forgotten, but rather ignored. Secretary Spellings recently gave an interesting interview to Human Events reporter Terence Jeffrey. She was candid and intelligent in the interview, for one thing disarmingly ready to admit the failures of her policies so far, even while defending them and predicting their long-term success. She favors school choice and works to get it implemented, if so far without much success. She has tough words for the education union that is such a dreaded political obstacle to reform. But toward the end of the interview she was asked a pair of questions that she found difficult.

Mr. Jeffrey asked her if she could “point to language in the Constitution that authorized the federal government to have a Department of Education.” Her reply shows that she knew the bearing of the inquiry: “I think we had come to an understanding, at least, of the reality of Washington and the flat world, if you will, that the Department of Education was not going to be abolished, and we were going to invest in our nation’s neediest students.”

Mr. Jeffrey persisted: “It is one thing to say that the political reality is we are not going to abolish the federal Department of Education, but can you seriously point to where the Framers actually intended the Constitution to authorize a Department of Education?”

The Secretary replied: “I can’t point to it one way or the other. I’m not a constitutional scholar, but I’ll look into it for you, Terry.” Mr. Jeffrey reports that he did not get his answer.

This is Secretary of Education Margaret Spellings, sworn to uphold the Constitution in the exercise of her office.

Secretary Spellings and her department provide an example to stand for the rest of the federal domestic establishment. It is the archetype of our current condition and the direction in which we travel. It is doing what the rest are doing, and it is doing it for the same reasons. In examining it, we can see both the problem and the solution.

The Department of Education grows now at a rate much faster than the Department of Defense, even in time of war. It grows much faster than the domestic economy, even now when the economy grows rapidly. It grows faster than the population it serves, even when that population is growing. The pace of its growth will quicken with the recent passage of the Higher Education Access Act of 2007, which reduces the size of student loan subsidies, but redeploys that money into outright grants, loan forgiveness, and new programs. If the past is prologue, these new programs will grow as fast as the old ones have done.

Why would this be happening in a Republican administration, the first in a generation (prior to the 2006 elections) to control both houses of Congress along with the White House? The people involved are not for the most part corrupt or ill-intended; surely Secretary Spellings is neither. Something strong is moving them and her. There are two kinds of things.

The first kind is found in the obstacles anyone in office must face. Education is desperately in need of reform; for example, our high school graduates have math and science scores at the bottom of the industrialized world. The longer they are in school, the lower they fall. When one attempts to repair this, one meets quickly the most powerful of public sector lobbies, the education union. Its members have a vested interest to protect and the prestige that comes rightly from serving, but not rightly when only seeming to serve, the young. Finally, the cost and complication of college are fearsome to parents, who are unaware that the subsidies and outside interests that control education make both of them worse. It is very difficult in the circumstances to do anything good.

The second kind is to be found inside the Secretary and other parts of the Administration. They are drawn to the principles of the Great Society. In her interview with Terence Jeffrey, Secretary Spellings refers to the “flat world.” Doubtless she means the pressure of globalized economic competition made possible by global communications. She likes to say in her speeches that, to face this competition, we have to emulate the achievement of the Great Society. She mentions in particular the response to the Sputnik crisis, which was a great national effort to subsidize higher education and thereby beat the Soviet Union to the moon. Now we can beat China and other competitors economically by the same device.

From a simple chronological point of view, the example of the Sputnik crisis does not quite work. There was not enough time for the federal programs to educate any appreciable number of scientists to participate in the NASA programs that got us to the moon. Twelve years elapsed between the launch of Sputnik and the landing on the moon; one does not produce astrophysicists very quickly. The mistake goes deeper than a point of chronology. It involves a mistake about the nature of constitutional government and the source of American power. And this is connected to a mistake about the purpose of education itself.

These two mistakes are closely related. The American government is explicitly, and to a unique degree, justified by an account of the nature of man and his relation to God above and the beasts below. This involves a perception, not of utility here on earth, but rather of the order of nature against which all utility must be judged. In the old understanding of America, the one propagated by those who made the nation, the preparation for leadership and for excellent living consisted in the contemplation of this order and the study of its application to our lives.

One cannot miss this if he studies old documents, both about the making of the Constitution and about the founding of colleges. Our own college, by no means unique in this respect, was built in service of the blessings of “civil and religious liberty and intelligent piety.” The first two are civic goods, achieved first in the American Republic. To secure these blessings, we promise an education that will “develop the minds and improve the hearts” of our students. In other words, the purpose of education has both an intellectual and a moral component, and these are connected essentially. One will find these sentiments in the founding of nearly any old college of quality.

One will find them also in the founding documents of the country. They are present most famously in the Northwest Ordinance, which in its third article proclaims that “Religion, morality, and knowledge, being necessary to good government and the happiness of mankind, schools and the means of education shall forever be encouraged.” In other words, good living both in the private and the public sense requires knowledge of the things above. The purpose of education, and especially of higher education, is to come to know and contemplate these higher things.

One will not find these sentiments in the plans for education made in the Department of Education today. Of course it would be difficult to put them there: religion, for one thing, has now been systematically excluded from the public schools as a matter, purportedly, of constitutional law. There is, however, no sign that the people in charge of the department have any wish to include them. The report of the National Commission on the Future of Higher Education reduces education to the purpose of preparing young people for a job and of making the nation powerful and successful in its economic competition with other nations. The idea—questionable upon its face—is that only a national coordinated effort can make us formidable to China, for example. China is indeed growing rapidly. This has become possible only because, under duress and against its every wish, the government of China has liberated its people to start their own businesses and make their own plans. They seek to emulate our successes to the extent that they are forced. We seek to emulate their failures because we find them attractive.

What then is to be done?

In this gloomy picture there is no major national force, at least no political force, united to support constitutional government in its old and proven sense. If we cannot find our solution in the present, then we must look to the successes of the past. One of those successes is the recently rejected example of Ronald Reagan. When Reagan began his career it seemed simply impossible to resist this type of bureaucratic government, just as it seems today. He proceeded nonetheless, in part because he had a clear understanding of the purpose both of education and of constitutional government. About the purpose of education, he said:

“Train up the child in the way he should go,” Solomon wrote, “and when he’s old he will not depart from it.” That is the God-given responsibility of each parent, the compact with each teacher, and the trust of every child.

In another place he revels in the love of the founders for education and their faith “that an educated populace would guarantee the success of this great experiment in democracy…”

As for the organization of education, Reagan understood it from the constitutional perspective of self-government. His First Inaugural, a worthy successor to the greatest inaugural speeches of the greatest presidents, is built around the theme of self-government and the association of every American with the great heroes of America, including Washington, Lincoln, and Jefferson, in the practice and defense of self-government. In another speech he said:

Our leaders must remember that education doesn’t begin with some isolated bureaucrat in Washington. It doesn’t even begin with State or local officials. Education begins in the home, where it’s a parental right and responsibility. Both our public and our private schools exist to aid our families in the instruction of our children, and it’s time some people back in Washington stopped acting as if family wishes were only getting in the way.

A government that forgets this sentiment is not competent to give instructions for higher education. Forgetting the purpose of education, such a government is likely to forget its own purpose, too. That is dangerous both to liberty and to justice.

The question what is to be done is simple to answer: it is not enough anymore to rehearse by rote the Constitution or to celebrate it in vacuous observances. Both our statesmen and our citizens must return first to its study, with depth and intensity, and then to its sustenance, with eloquence and resolve. Nothing else will do.

Debating the Separation of Religion and Politics / The Bishops' Conscience Clause

By Richard John Neuhaus
First Things

Last Saturday, the British magazine The Economist sponsored a debate on this resolution: "Religion and politics should always be kept separate." There was an audience of about a thousand, and at the beginning of the debate the vote was about five to one in favor of the resolution. This is Manhattan, after all. At the end of the debate the house was pretty evenly divided but still with a slight majority in favor. Barry Lynn of Americans United for the Separation of Church and State was the lead for the other side. He spent most of his time highlighting some of the less circumspect statements of Jerry Falwell and others on the "religious right" who, according to Lynn, want to establish a theocracy in America. Barry Lynn deeply mourns the passing of Jerry Falwell.

Herewith my opening statement at the debate. The connection between religion and politics is a huge subject and, in First Things and elsewhere, I have addressed other dimensions of the question. But this was tailored to an audience assumed, correctly, to be strongly hostile to the argument. And the house was turned around, almost. You might perhaps find the statement of some interest.



I speak in favor of the separation of church and state, and therefore against the resolution that religion and politics should always be kept separate. Permit me to explain. To enforce the exclusion of religion from politics, or from public life more generally, violates the First Amendment guarantee of the "free exercise of religion." The free exercise of religion is the reason for the separation of church and state, a principle that aims not at protecting the state from religion but at protecting religion from the state.

In the First Amendment, religious freedom is of a piece with, indeed is in the very same sentence with, free speech, free press, free assembly, and the right to challenge government policy. Hence the resolution put before this house flatly contradicts the guarantees of a free and democratic society enshrined in the Constitution of the United States.

Secondly, I urge you to oppose the resolution because it is foolish to attempt to do what by definition cannot be done. Such an attempt can only intensify confusions and conflicts, further polarizing our public life. To exclude religion is to exclude from politics the deepest moral convictions of millions of citizens (indeed, in this society, the great majority of citizens). Thus the resolution before this house is a formula for the death of democracy and should be resolutely defeated.

What do we mean by politics? I believe the best brief answer is proposed by Aristotle. Aristotle teaches that politics is free persons deliberating the question "How ought we to order our life together?" The ought in that definition indicates that politics is in its very nature, if not always in its practice, a moral enterprise. The very vocabulary of political debate is inescapably moral: What is just? What is unjust? What is fair? What is unfair? What serves the common good? On these questions we all have convictions, and they are moral convictions.

It is not true that our society is divided between a moral majority of the religious, on the one hand, and an immoral or amoral minority of the nonreligious, on the other. Atheists can have moral convictions that are every bit as strong as the moral convictions of the devout Christian or observant Jew. What we have in the political arena is not a division between the moral and the immoral but an ongoing contention between different moral visions addressing the political question: How ought we to order our life together?

This ongoing contention, this experience of being locked in civil argument, is nothing less than democracy in action. It is Lincoln and Douglas debating the morality of slavery; it is the argument about whether unborn children have rights we are obliged to respect; it is the argument over whether the war in Iraq is just or unjust. And on and on. These are all moral arguments to which people bring their best moral judgment. In short, our political system calls for open-ended argument about all the great issues that touch upon the question "How ought we to order our life together?"

The idea that some citizens should be excluded from addressing that question because their arguments are religious, or that others should be excluded because their arguments are nonreligious or antireligious, is an idea deeply alien to the representative democracy that this constitutional order is designed to protect. A foundational principle of that order is that all citizens have equal standing in the public square.

But what about the institutions of religion such as churches or synagogues? They may understand themselves to be divinely constituted, but, in the view of the Constitution, they are voluntary associations of citizens who join together for freely chosen purposes. They are in this respect on the same constitutional footing as labor unions, political action groups, professional associations, and a host of other organizations formed by common purpose. In the heat of the political fray, all these institutions are tempted to claim that, on the issues that matter most to them, they have a monopoly on morality. All of them are wrong about that.

Religious institutions are also (some might say especially) tempted to claim a monopoly on morality. Whether it is the religious right or the much less discussed religious left, their leaders sometimes make a political assertion and then claim, "Thus saith the Lord." Jim Wallis, a prominent leader of the religious left and of the Democratic party's effort to reach so-called values voters, has even written a book with the title God's Politics. In his book, he lays out, among many other things, how the prophet Isaiah would rewrite the federal budget of the United States. This is presumption and foolishness of a high order. But the constitutional guarantee of the free exercise of religion guarantees that foolish things will be done in the name of religion, just as the guarantee of free speech ensures that foolish things will be said in innumerable other causes. We all, left and right, liberal and conservative, have a constitutional right to be stupid.

As I have suggested, religion cannot be separated from politics. More precisely, religion cannot be separated from democratic politics. But I do believe that religious leaders should be more circumspect and restrained than they sometimes are in addressing political issues, and that for two reasons. The first and most important reason is that the dynamics of political battle tend to corrupt religion, blurring the distinctions between the temporal and the eternal, the sacred and the profane. So the first concern is for the integrity of religion.

The second concern is for the integrity of politics. Making distinctively religious arguments in political debates tends to be both ineffective and unnecessarily polarizing. Citizens who are religious, like all citizens, should as much as possible make arguments on the basis of public reasons that are accessible to everyone. That is my advice to both the religious left and the religious right, to both Jim Wallis and Pat Robertson. But they are under no constitutional obligation to accept my advice, and, based on past history, they probably won't. Remember the constitutional right to do dumb things.

There is a long and complicated history by which the West, and America in particular, has arrived at our commitment to freedom of religion, freedom of the press, freedom of speech, and freedom of political action. These freedoms, as they are enshrined in the First Amendment, are all of a piece. Our history and our commitment are not shared by everyone in the world. In most dramatic contrast today are Islamic societies in which, as many see it, the brutal choice is posed between monolithic religion and monolithic secularism. We have to hope that is not the case, but that is a problem for Muslims to resolve.

Thank God, and thank the American Founders, our circumstance is very different. Ours is a pluralistic society in which, by the means of representative democracy, all citizens (whether religious, nonreligious, antireligious, or undecided) are on an equal footing as they bring their diverse and sometimes conflicting moral visions to bear on the great question of politics: How ought we to order our life together?

The resolution before the house is "Religion and politics should always be kept separate." Because it violates the First Amendment guarantee of the free exercise of religion and associated guarantees such as free speech, because it is alien to the American experience, and because it could not be implemented without undermining the equality essential to a pluralistic and democratic society, I urge you to defeat this profoundly illiberal resolution.






The United States Conference of Catholic Bishops (USCCB) has just completed its fall meeting in Baltimore, and, with predictable alacrity, the usual suspects are jumping in to reap what benefits they can for their favored causes. Which I suppose means that I'm jumping in too. Fair enough.

The focus of attention is on the "Forming Consciences for Faithful Citizenship" document and a statement on the Iraq War. The latter is technically a statement by the outgoing president of the conference, Bishop William Skylstad, but it is approved by the conference. "Faithful Citizenship" is in a decades-long line of election-year documents issued by the conference and is sometimes referred to as a voting guide, which it is not. This year's document is different in that it is the product of a much wider process of consultation and was debated on the floor in open session.

"Faithful Citizenship" is a carefully considered reflection on political responsibility, the difference between "intrinsic evils" and prudential judgments, and the ways in which conscience is rightly formed. Several news stories have highlighted that the bishops left "loopholes" or created "leeway" for Catholics to vote for pro-abortion candidates. That is not true. The document repeatedly and emphatically gives top priority to the protection of innocent human life in instances such as abortion, euthanasia, and embryonic stem cell research.

But, of course and of necessity, it also says: "In making these decisions, it is essential for Catholics to be guided by a well-formed conscience that recognizes that all issues do not carry the same moral weight and that the moral obligation to oppose intrinsically evil acts has a special claim on our consciences and our actions. These decisions should take into account a candidate's commitments, character, integrity, and ability to influence a given issue. In the end, this is a decision to be made by each Catholic guided by a conscience formed by Catholic moral teaching."

The "conscience clause" is not a loophole but speaks to a solemn obligation. It is clear Catholic teaching that one must act in accord with conscience, even if one's conscience is misguided. At the same time, one is obliged to form one's conscience according to moral truth. It is also the Church's teaching, reiterated in this document, that acting according to a rightly formed conscience is a matter that impinges upon one's eternal salvation.

It is obvious that some Catholics are, if not pro-abortion, at least of the view that opposition to abortion is trumped by other matters they consider more important, such as an immediate withdrawal from Iraq, their preferred health plan, or a Democratic victory in the presidential election. If knowingly and with full intent they collaborate in the intrinsic evil of abortion, they may say that the conscience clause provides them with a loophole. In such an instance, as "Faithful Citizenship" rightly underscores, they are provided only with the choice to surrender their soul's salvation.

The first news reports on Bishop Skylstad's statement on Iraq, which was approved by the conference, highlighted the statement that the situation in Iraq is unacceptable and unsustainable. One can hardly imagine how anyone could say that it is acceptable and sustainable. The question is what is to be done about it. The statement says that the conference "encourages our national leaders to focus on the morally and politically demanding, but carefully limited goal, of fostering a 'responsible transition' and withdrawal at the earliest opportunity consistent with that goal."

The statement goes on to say: "We do not have specific competence in political, economic, and military strategies and do not assess particular tactics, but we can, as teachers, share a moral tradition to help inform policy choices. Our Catholic teaching on war and peace offers hard questions, not easy answers. Our nation must now focus more on the ethics of exit than on the ethics of intervention."

Exactly. People may argue until the cows come home about the rightness or wrongness of what was done in 2003, but the question now is what is required for a "responsible transition," recognizing that such a transition entails many considerations, including stability in the Middle East, the credibility of American power in world affairs, and the prospect of securing a government of law and basic decency for the Iraqi people. The statement is notable also for lifting up concerns that are largely neglected by others, especially the plight of refugees and of the Chaldean Christians in Iraq.

All in all, the statement on Iraq is a carefully considered moral reflection on a set of problems that do not lend themselves to easy resolution. In this statement, as in the "Faithful Citizenship" document, the bishops have provided an example of how teachers of the Church can inform public discourse by neither exceeding their competence nor shirking their responsibility.

Teaching on Purpose

By Phillip E. Johnson
Touchstone Magazine

On September 16, 2007, Yale Law Professor Anthony Kronman published an essay in the online edition of the Boston Globe, provocatively criticizing today’s elite colleges for neglecting to address the most fundamental question of all. This is, Kronman wrote, “the question of the meaning of life, of what one should care about and why, of what living is for. In a shift of historic importance, colleges and universities have largely abandoned the idea that life’s most important question is an appropriate subject for the classroom.”

“In doing so,” continued Kronman, “they have betrayed their students by depriving them of the chance to explore it in an organized way, before they are caught up in their careers and preoccupied with the urgent business of living itself.” This, of course, is something that critics of modern liberal education have been saying for years, and it’s refreshing to hear it from a Yale professor. This is not just an academic question, a matter of what students study, but a critical, blameworthy failure.

Kronman also is worried about the consequences for society as a whole. He warns, “This abandonment has also helped create a society in which deeper questions of values are left in the hands of those motivated by religious conviction—a disturbing and dangerous development.” I understand this to mean that the development is disturbing and dangerous because Kronman is worried that religiously motivated people will not accept the naturalistic worldview that is mostly taken for granted in the classrooms of secular universities like Yale.

Kronman thinks that colleges do not address life’s most fundamental questions because our top universities “have embraced a research-driven ideal that has squeezed the question of life’s meaning from the college curriculum, limiting the range of questions teachers feel they have the right and authority to teach.” Consequently, the humanities, “the disciplines with the oldest and deepest connections to this question,” have been “badly weakened,” and are now left “directionless and open to being hijacked for political ends.”

Hunger to Explore

Kronman is encouraged that today there seems to be “a growing hunger among students to explore the big questions. As questions of spiritual urgency—abortion, creationism, and the destruction of the environment—move to the center of debate in our society,” Kronman writes, we need “an alternative approach to a college education that takes these matters seriously without pretending to answer them in a doctrinaire way.”

He does not speculate on what might happen if teachers at Yale began to take disputes over abortion, creationism, and global warming seriously, without ensuring that the answers were both politically correct and satisfactory to the powerful scientific organizations that define scientific correctness in our institutions.

I think it is all to the good if the dominant metaphysical assumptions in our universities are being questioned, whether the questioners are religiously motivated or not. Nevertheless, I am sure that college teachers hesitate to address the meaning of life, not because they must do research, but because they are confined by the assumption that, in a world created by purposeless material forces, life has no inherent meaning, or at least no meaning of which we can have knowledge. Thus, on this subject there can be nothing more than subjective preferences, unsuitable for rigorous analysis.

Nonetheless, I highly recommend Kronman’s essay for pointing the way to a much-needed rejuvenation of undergraduate education, and also to a reconsideration of the assumptions of the research university. To raise the question of the purpose of life implies that life may well have a purpose, and this premise poses an implicit challenge to the naturalistic worldview, which has for too long been allowed to define rationality in the research universities that dominate our intellectual life.

With or Against Culture?

By Jean Bethke Elshtain
Christian Vision Project

What can Christians embrace in the here and now? The blessings are all around us.

Christ against culture; the Christ of culture; Christ above culture; Christ and culture in paradox; and Christ as transformer of culture—these are the possibilities enumerated by H. Richard Niebuhr in his classic work. I take these to be strong tendencies rather than airtight laws of Christian engagement, often with considerable overlap between categories. They can help us to take our bearings as we reflect on the question of a counterculture for the common good. Clearly the question presupposes not one but at least two of Niebuhr's models: both Christ against culture and Christ as transformer of culture.

In our time, these are not mutually exclusive. As a stand-alone posture, against too often turns into brittle condemnation, a stance of haughty (presumed) moral superiority, wagons circled. Transform on its own may degenerate into naïve idealism, even utopianism, a stance concerning which Dietrich Bonhoeffer reserved some of his most severe words. The radical begrudges God his creation, Bonhoeffer insists, for the radical seeks a self-sovereignty incompatible with recognition of our indebtedness to others in the past as well as the present. The radical is all ultimacy, prepared to sacrifice the penultimate, the here and now, for some eschatological goal.

Avoiding these extremes, we must see Christ against and for, agonistic and affirmative, arguing and embracing. This is complex but, then, Christianity is no stranger to complexity. One of the glories of the faith historically has been its wonderful intricacy, the way in which it engages the intellect, helping us to "serve God wittily, in the tangle of our minds," words uttered by the St. Thomas More character in Robert Bolt's A Man for All Seasons.

What can Christians embrace in the here and now? The blessings are all around us. In my book Who Are We? Critical Reflections, Hopeful Possibilities, I tell the story of one of our grandsons, who, when he was but two years old, exclaimed one beautiful sunny morning as he was swinging in the backyard: "Grandma, everything is everywhere!" I loved those words then and I love them now. They remind us of how much there is to be grateful for and how easy it is to take it all for granted.

I don't want to wax romantic and sound like Wordsworth extolling daffodils, but it is hard not to sound that way if one attempts to describe the beauties of creation. Even St. Augustine, wary as he was of worldly pleasure and beauty, couldn't help himself. Why is everything so beautiful? he asked. It is as if the very trees and flowers long to be known; indeed, the flowers lift their faces to us. Augustine's great biographer, Peter Brown, notes Augustine's "immoderate love of the world." That love includes friendship and family and the canny ways human beings craft and build and care.

So one can affirm the works of human beings and the creative spark we bring to the most utilitarian crafts and activities. Nor can one neglect the institutions human beings have built in order to sustain a way of life in common together. Amor mundi—love of the world—is surely a Christian way of being in the world, so long as it is tempered by a recognition of that beyond the world as we know it, that which claims us and names us "Christian."

Yet contra mundum is also part and parcel of Christ as transformer of culture, so long as this does not become a hardened ideological stance of opposition, an exaltation of grumpiness as a rarified norm, when it is merely the burnishing of contempt. There is in fact much to oppose. If we frame matters with an eye to the ultimate as well as the penultimate, the here and now, we will be able to assay matters critically, and ultimately be able to offer an alternative.

Let's take two items ripped from today's headlines, as we say, the first from an article in The New York Times of June 6, 2006. The article informs us of the startling and alarming fact that the use of antipsychotics by young people in the United States rose fivefold in one decade. Unless American children are suddenly being overtaken by psychoses, this datum calls for sober analysis and criticism. What does this medicalization of childhood portend? How does one explain it? What should we do about it? The cultural boosters will appear before us decked out in sunny hues and tell us that the feeding of anti-psychotic drugs to America's children arises from ever more vigilant care and attunement to the needs of the young. But we cannot take that at face value. Medicine does not exist in a cordon sanitaire free from the influences of economic and cultural forces. What are the problems being treated? "Aggression" and "mood swings," we are told, in addition to that old standby "attention deficit disorder."

Such a basic piece of cultural information matters to a Christian believer. To embrace the Christ who is both with and against culture, hence the affirmer and critic and transformer of culture, is to be particularly attuned to a culture's children, all of them children of God, beings of inestimable worth. It is also to become attuned to their complexity and diversity—not primarily the bean-counting diversity that currently prevails but rather the diverse gifts and qualities that distinguish one child from another, even in the same family. How can we nourish these qualities? How can we control those attributes that are self- or other-destructive? This leads us straight to questions of cultural and religious formation.

For our cultural milieu is one in which the norm is both parents working outside the home, exhausted and busy. It values success and drivenness, measuring success through monetary reward. It glamorizes celebrity and ignores the hard work people do every day to raise children and sustain neighborhoods, to make life less brutal and more decent and kind. It is a milieu of pervasive family fragmentation if not outright breakdown, to which many children respond with anger and "acting out." In this milieu every personal question, and many public questions, are medicalized and psychologized; new drugs are touted not only to the public but to the medical profession via lavish marketing stratagems and budgets.

Christians begin their reflections on this cultural setting with the gift and integrity of the bodies and beings of children. They go on to consider the gift of time and how precious it is. They consider the concreteness of the Christian message—do unto others here and now, not in the distant future, not in an abstract way. Do not ignore the person before you. This, in turn, invites critical reflection on whether we are rushing to diagnose children as "troubled" or "hyperactive" in part because parents no longer spend concentrated time with their children and prefer them to be pacified when they are with them. Such reflection suggests that radical and uncontrolled experimentation on America's children, by way of powerful drugs, many with known, deleterious side-effects, absent knowledge of long-range effects, may be undertaken at least as much for the convenience of adults as it is for the benefit of children.

Any assault on the integrity of the human body should be of heightened concern to the Christian because Christianity is an exquisitely embodied religion. We recall sobering moments from the past when children—and adults—were quickly labeled "antisocial" or "incorrigible," institutionalized and forgotten. Now we think we are humane in rushing to medicalize, often against the advice of cautious voices within the medical community as to the alleged benefits and the many known dangers of massive drug use. One doctor cited in the Times spoke of children put on "three or four different drugs," each of which created new symptoms and side effects, before going on to ask: "How do you even know who the kid is anymore?"

That is a frightening sentence: how do you even know who this child is? If we believe every child is claimed by his or her Creator, we should be alarmed by a social milieu where children are treated instrumentally, where pacification of children rather than care and attention to each child in his and her particularity becomes a social norm. We are against this. What are we for? Minimally, we are for taking a hard look at how children are faring in our society. That, in turn, can spur transformation, especially in what I have called "the politics of time." Good, old-fashioned time is what so many children need. How can a society that pretends to be child-centered justify culturally approved neglect? It goes without saying that neglect comes in many forms: tens of thousands of privileged children are neglected in the way I am noting here.

The second item comes from the June 8, 2006 issue of Time Out New York, under the heading "Get Naked," a regular feature in which the magazine's "sexpert" explores the "ins and outs of love and lust." In this issue, the "sexpert" congratulates a mother who has written him extolling the joys of masturbation discovered by her precious seven-year-old daughter, who now wants to know all there is to know about the penis. The mother has decided to enlighten her seven-year-old with photographs by Robert Mapplethorpe, whose explicit images of male genitalia in different postures of sex acts and bodily functions stirred up such controversy a few years ago.

Even the "sexpert" cavils at this. Perhaps that is going a bit too far, he suggests, for there are other ways to educate about the penis. He closes by extolling this mother's "shame-free attempts to give your daughter the information she needs to become a well-adjusted, self-empowered individual." Of course, the Mom must be alert to all those meanies out there who might try to fill her daughter's brain with "body-detesting nonsense," presumably along the lines of: "When you're a bit older we'll discuss those things," or, "You know, that just isn't appropriate. Let's think of something else to do," or, "Respecting your body means to take care of it and not just to use it for any purpose."

"Well-adjusted, self-empowered"—the mantra of our time. And well-adjusted means no worries, no shame. Everything is to be uncovered, everything displayed. "Self-empowered" means one can do anything and everything that gives one pleasure, though whether that will bring joy is quite another matter. Where does one start with this pack of nonsense and untruths? The Christian repairs to the story of creation and fall. We cannot escape the heritage of human sin and shortcoming. We inherit it. To pretend that stark nakedness, unveiling everything, is the ideal is to pretend we are back in the Garden and have no history. It is to pretend that the categories of good and evil no longer apply: one is truly in a world beyond good and evil. We have seen what such a world looks like, a world fabricated by those who believed they were supermen beyond normative constraint, and it is a violent, cruel, systematically horrific world.

We should not be fooled. Cultural mavens preaching the gospel of the "well-adjusted" purport to embrace what is "natural" when, in fact, their ministrations disrupt the natural, or so Bonhoeffer argues, by treating life in a vitalistic way that destroys limits and misuses freedom. The Christian tradition offers a number of ways to articulate those limits, but that is beyond the scope of a modest paper. Yet articulate them we must, for those who embrace a shame-free (not shameless, one notices) life assault the integrity of our bodilyness—in this case exploiting a seven-year-old child to promote a cultural agenda not the child's own, using her body to score a point in favor of destroying shame and abrogating limits.

A seven-year-old should be discovering the joys of friendship and learning, finding how to make her way in the world, and imagining and preparing for a future. Instead, and by the mother's own account, this child is self-obsessed, masturbating at will and demanding more information on "private parts," private no more in the shame-free society.

The Christian can remind our culture that crossing the barrier of shame is a very dangerous thing and must be considered carefully. Minimally, eradicating shame in the name of being "well-adjusted" must be questioned and challenged at every point. And this should be done in the light of Christ with, against, and transforming culture—not simply critique, but the embrace of an articulated alternative. The world is a vast laboratory for our consideration. As I indicated above, there is much to applaud. But there is also a good deal to condemn, yes, condemn, even as Christ condemned the money changers in the temple. Such condemnation must flow not from the Christian as a "despiser of men" (as Bonhoeffer wrote) but as one who loves and cares for the world.

The common good is clearly at stake in all of this. A society that does not tend to its young lovingly and with respect for each child's deepest needs and integrity is a culture that is trapped in a spiral of self-loathing at some very deep and largely unarticulated level. We are not called to be gods or supermen and women. We are called to be, simply, human and to love one another, to care enough to set limits and to evoke normative ideals to which we are all accountable. We must not abandon our culture to its own most witless tendencies, even if these come dressed up as adjustment and self-empowerment.

Indeed, a Christian counterculture cannot simply be "counter" if it is to enact projects for the common good. People quite reasonably resist the ministrations of those whose proclamations are all negation. Confronted with such jeremiads, we tend to focus on what's eating such people, why their lives are all bitterness and rue, rather than on what may be vital and true about the message. (And if the message is entirely caustic it is unlikely to be true.) If, instead, one criticizes in a manner that displays one's love and concern for the culture and country of which one is a part, then and only then will one be taken seriously. Only then can one hope to spark projects of transformation, achieving a good in common that we cannot know alone.

Why the West Is Best

By Ibn Warraq
City Journal

My response to Tariq Ramadan
Winter 2008

Last October, I participated in a debate in London, hosted by Intelligence Squared, to consider the motion, “We should not be reluctant to assert the superiority of Western values.” Muslim intellectual Tariq Ramadan, among others, spoke against the motion; I spoke in favor, focusing on the vast disparities in freedom, human rights, and tolerance between Western and Islamic societies. Here, condensed somewhat, is the case that I made.

The great ideas of the West—rationalism, self-criticism, the disinterested search for truth, the separation of church and state, the rule of law and equality under the law, freedom of thought and expression, human rights, and liberal democracy—are superior to any others devised by humankind. It was the West that took steps to abolish slavery; the calls for abolition did not resonate even in Africa, where rival tribes sold black prisoners into slavery. The West has secured freedoms for women and racial and other minorities to an extent unimaginable 60 years ago. The West recognizes and defends the rights of the individual: we are free to think what we want, to read what we want, to practice our religion, to live lives of our choosing.

In short, the glory of the West, as philosopher Roger Scruton puts it, is that life here is an open book. Under Islam, the book is closed. In many non-Western countries, especially Islamic ones, citizens are not free to read what they wish. In Saudi Arabia, Muslims are not free to convert to Christianity, and Christians are not free to practice their faith—clear violations of Article 18 of the United Nations’ Universal Declaration of Human Rights. In contrast with the mind-numbing enforced certainties and rules of Islam, Western civilization offers what Bertrand Russell once called “liberating doubt,” which encourages the methodological principle of scientific skepticism. Western politics, like science, proceeds through tentative steps of trial and error, open discussion, criticism, and self-correction.

One could characterize the difference between the West and the Rest as a difference in epistemological principles. The desire for knowledge, no matter where it leads, inherited from the Greeks, has led to an institution unequaled—or very rarely equaled—outside the West: the university. Along with research institutes and libraries, universities are, at least ideally, independent academies that enshrine these epistemological norms, where we can pursue truth in a spirit of disinterested inquiry, free from political pressures. In other words, behind the success of modern Western societies, with their science and technology and open institutions, lies a distinct way of looking at the world, interpreting it, and recognizing and rectifying problems.

The edifice of modern science and scientific method is one of Western man’s greatest gifts to the world. The West has given us not only nearly every scientific discovery of the last 500 years—from electricity to computers—but also, thanks to its humanitarian impulses, the Red Cross, Doctors Without Borders, Human Rights Watch, and Amnesty International. The West provides the bulk of aid to beleaguered Darfur; Islamic countries are conspicuous by their lack of assistance.

Moreover, other parts of the world recognize Western superiority. When other societies such as South Korea and Japan have adopted Western political principles, their citizens have flourished. It is to the West, not to Saudi Arabia or Iran, that millions of refugees from theocratic or other totalitarian regimes flee, seeking tolerance and political freedom. Nor would any Western politician be able to get away with the anti-Semitic remarks that former Malaysian prime minister Mahathir Mohamad made in 2003. Our excusing Mahathir’s diatribe indicates not only a double standard but also a tacit acknowledgment that we apply higher ethical standards to Western leaders.

A culture that gave the world the novel; the music of Mozart, Beethoven, and Schubert; and the paintings of Michelangelo, da Vinci, and Rembrandt does not need lessons from societies whose idea of heaven, peopled with female virgins, resembles a cosmic brothel. Nor does the West need lectures on the superior virtue of societies in which women are kept in subjection under sharia, endure genital mutilation, are stoned to death for alleged adultery, and are married off against their will at the age of nine; societies that deny the rights of supposedly lower castes; societies that execute homosexuals and apostates. The West has no use for sanctimonious homilies from societies that cannot provide clean drinking water or sewage systems, that make no provisions for the handicapped, and that leave 40 to 50 percent of their citizens illiterate.

As Ayatollah Khomeini once famously said, there are no jokes in Islam. The West is able to look at its foibles and laugh, to make fun of its fundamental principles: but there is no equivalent as yet to Monty Python’s Life of Brian in Islam. Can we look forward, someday, to a Life of Mo? Probably not—one more small sign that Western values remain the best, and perhaps the only, means for all people, no matter of what race or creed, to reach their full potential and live in freedom.

Micromotives and Macrobehavior: How Diversity Leads to Homogeneity in the Blogosphere

By Joe Carter
Evangelical Outpost

[Note: I’m still trying to acclimatize to the pace of working on a Presidential campaign (I love saying that), so for the next few days I’ll be recycling material.]

“There were never in the world two opinions alike, any more than two hairs or two grains. Their most universal quality is diversity.” – Montaigne

In his latest report, Technorati founder and CEO David Sifry claims that in the last 20 months, the blogosphere has increased in size by over 16 times. Technorati now tracks over 7.8 million blogs which, if Montaigne is correct, makes for a great deal of diversity. Yet if opinions are so dissimilar, why does their appear to be a general homogeneity of viewpoints within the world of blogging? Why can the vast majority of blogs be grouped according to binary political categories? A forty year old experiment on racial diversity might just hold the answer.
In the 1960s, the Harvard economics professor Thomas C. Schelling devised a simple model to test his intuitions about segregated neighborhoods. Schelling found that most neighborhoods in America were composed mostly or entirely of black or white families. Only a handful of neighborhoods were found in which neither race made up more than three fourths of the total. Racism seemed to be the obvious culprit for the lack of diversity, but Schelling thought something else might be involved.


His model showed how even tolerant people can behave in ways that can lead to segregated neighborhoods. It consisted of a checkerboard with 64 squares representing places where people can live. Two types of actors (representing, for example, whites and blacks) are placed at random among the squares, with no more than one per square. Schelling provided a “rule” that an actor will be content if more than one-third of its immediate neighbors (those in adjacent squares) are of the same type as itself. For example, if all eight adjacent squares were occupied, the actor is content if at least three of them are of the same type as itself. If an actor is content, it stays put. If it is not content, it moves. In Schelling’s original model, it would move to one of the nearest squares where it would be content.
Not surprisingly, Schelling found that the board quickly evolved into a strongly segregated pattern if the agents’ “happiness rules” were specified so that segregation was heavily favored. What was unexpected, though, was that initially integrated boards tipped into full segregation even if the agents’ happiness rules expressed only a mild preference for having neighbors of their own type.
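For readers who want to watch the tipping effect for themselves, here is a minimal Python sketch of the checkerboard model described above. It is not Schelling’s own implementation: the board matches his 64 squares, but the vacancy rate and the rule of moving a discontent actor to a random empty square where it would be content (rather than to the nearest such square) are simplifying assumptions of mine.

import random

SIZE = 8            # an 8 x 8 board: Schelling's 64 squares
THRESHOLD = 1 / 3   # content if MORE than one-third of occupied neighbors match
VACANCY = 0.25      # assumed fraction of empty squares so that actors can move

def make_board():
    empties = int(SIZE * SIZE * VACANCY)
    agents = SIZE * SIZE - empties
    cells = [None] * empties + ["X"] * (agents // 2) + ["O"] * (agents - agents // 2)
    random.shuffle(cells)
    return [cells[r * SIZE:(r + 1) * SIZE] for r in range(SIZE)]

def occupied_neighbors(board, r, c):
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr or dc) and 0 <= rr < SIZE and 0 <= cc < SIZE and board[rr][cc]:
                yield board[rr][cc]

def content(board, r, c):
    me = board[r][c]
    others = list(occupied_neighbors(board, r, c))
    if not others:
        return True  # an isolated actor has no one to be unhappy about
    return sum(1 for n in others if n == me) / len(others) > THRESHOLD

def step(board):
    """Move each discontent actor to a random empty square where it would be content."""
    moves = 0
    for r in range(SIZE):
        for c in range(SIZE):
            if board[r][c] and not content(board, r, c):
                empties = [(rr, cc) for rr in range(SIZE) for cc in range(SIZE)
                           if board[rr][cc] is None]
                random.shuffle(empties)
                for rr, cc in empties:
                    board[rr][cc], board[r][c] = board[r][c], None   # try the move
                    if content(board, rr, cc):
                        moves += 1
                        break
                    board[r][c], board[rr][cc] = board[rr][cc], None  # undo, try next
    return moves

if __name__ == "__main__":
    board = make_board()
    for _ in range(30):                 # iterate until nobody wants to move
        if step(board) == 0:
            break
    for row in board:
        print("".join(cell or "." for cell in row))

If Schelling’s result holds, the final board should settle into visibly segregated blocks of X and O, even though every actor is perfectly content living as a local minority.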

[Figure 1, not reproduced here, showed four stages in such a simulation run.]

Schelling’s model implied that even the simplest of societies could produce outcomes that were simultaneously orderly and unintended: outcomes that were in no sense accidental, but also in no sense deliberate. “The interplay of individual choices, where unorganized segregation is concerned, is a complex system with collective results that bear no close relation to the individual intent,” he wrote in 1969. In other words, even in this extremely crude little world, knowing individuals’ intent does not allow you to foresee the social outcome, and knowing the social outcome does not give you an accurate picture of individuals’ intent.

While the applicability to real-world housing situations may be a bit suspect, I believe that the model could provide some valuable insight into how blog clusters develop. Bloggers, for instance, would only need a mild preference for reading blogs that hold opinions similar to their own for clusters to develop spontaneously. If Schelling’s model is correct, the broad diversity of viewpoints among individuals would inevitably lead them to link and interact more often with those who hold similar opinions. Over time, the eight million blogs would naturally fall into distinct groupings that would cause them to appear rather homogeneous. “Micromotives,” as Schelling calls them, would lead to strikingly peculiar “macrobehavior.”
Montaigne was correct when he claimed that opinions are as different as grains of sand. But just as sand collectively combines into dunes and beaches, opinions in the blogosphere collect into distinct clusters, swarms, and alliances. Oddly enough, it may just be the extreme diversity that creates congruity.

Note: A downloadable version of the Schelling Segregation Model can be found on this page.

Blockhead

By Peter Wood
National Review Online

Higher education’s imbalancing act.

I have the original cardboard box, barely held together by Scotch™ tape that looks like it might date back to the days of the highland clans. And inside the box is every piece of my childhood Blockhead™ set. Blockhead is a game in which you add sundry-shaped wooden blocks one at a time to an increasingly wobbly tower. Players take turns, and whoever topples the structure loses the round. Lose three rounds, you are a blockhead. The game does not teach sound architecture, since the advantage goes to the player whose addition to the tower is so precariously placed that the next player is almost certain to knock the whole thing over.

Blockhead, I’m happy to report, remains on the market, though it seems temporarily out of stock. There is no doubt a Cultural Studies Ph.D. dissertation waiting to be written on the generational shifts in the packaging and the shapes of some of the blocks.

I was reminded of Blockhead today by reading a couple of stories about the scarcity of conservatives and the similar rarity of Republicans on college campuses. Robert Maranto, writing in The Washington Post, reviews the most recent statistical studies about the disparities. Upon hearing that conservatives and libertarians are outnumbered 20 to 1 by liberals and radicals in my discipline, anthropology, I was faintly cheered. Most days it feels more like 100 to 1. Maranto doesn’t believe that “leftist professors have set out to purge academia of Republican dissenters.” Rather, in his view, professors tend to hire people like themselves, and higher education has reached a kind of tipping point of “unsubtle effects on the ideological makeup of the professoriate.” The unfortunate results, he says, are that universities, wearing their ideological blinders, are less adept at addressing domestic and foreign challenges. The campus “monoculture” also makes universities “intellectually dull places.”

Brian Morelli, writing in the Iowa City Press-Citizen, offers a punchier view of the scarcity of Republicans at one institution, the University of Iowa. The 2,527-member faculty there is 46.4 percent registered Democrats and 8.1 percent registered Republicans. Morelli found 18 University of Iowa departments in which 80 percent or more of the faculty were registered Democrats and 21 departments with one or fewer Republicans. So what? Morelli, unlike Maranto, doesn’t offer an opinion on whether this disparity has any important consequences. He quotes a Republican political science professor who says it is a disservice to students and a Democratic history professor (the University of Iowa has no Republican historians) who has “great faith in the integrity of faculty members to not put political views on students.”

The Iowa City Press-Citizen presumably took up this topic because of the recent case of Mark Moyar, a chaired professor of history at the U.S. Marine Corps University in Quantico, Virginia. Professor Moyar’s academic credentials are far above average: summa cum laude from Harvard, Ph.D. from Cambridge, two books from Cambridge University Press, numerous articles in the popular press. If academe were a meritocracy, Professor Moyar would be a star. But when he applied for an open position in the history department of the University of Iowa, he was turned down in favor of a substantially less qualified candidate. Then Professor Moyar did something really unusual. He filed a formal complaint that the University discriminated against him because he is an acknowledged conservative and registered Republican. Of course, universities being universities, the University of Iowa has surrounded the case with a thick fog of obfuscation and denials. Proving discrimination in such matters is well-nigh impossible. Still, the public is intrigued by the history department stats: 27 registered Democrats, 0 registered Republicans.

Morelli’s article is posted online and has attracted some comments, including one from “HK” that is a pretty typical left-of-center response to the phenomenon: “What a bunch of crybabies! Boo hoo…It is so hard being a conservative in America! Such as persecuted minority!” This what-a-bunch-of-crybabies response is so common that I have abbreviated it, and write WABOC in the margins when it turns up, as it always does, when the academic left is momentarily faced with the evidence of its exclusionary hiring policies. WABOC is a bit rich coming from folks whose cri de coeur has been the unfairness of America’s majorities to America’s minorities. The ruinous thought that they themselves are a self-satisfied majority heedless of minority rights must be banished the instant it occurs. Hence WABOC becomes the rallying cry, and the smug sense of having done with that bit of impertinence.

There are, however, more sophisticated responses from the academic left. In September two Harvard researchers, Neil Gross and Solon Simmons, issued a report that was headlined in several places as showing that the nation’s faculty are more “moderate” than “liberal.” Gross and Simmons reached this conclusion by way of some elastic definitions of moderation, and they quickly came under criticism from, among others, the president of the National Association of Scholars, Steve Balch. Apart from the creative effort to re-label political categories by moving the Left to the Center, Gross and Simmons turned up statistical results that match the half-dozen other formal studies of political alignment in academe. That’s to say, in the usual parlance of “conservative” vs. “liberal” in the United States, about 80 percent of academics are liberal, and about 10 percent are conservative.

Gross and Simmons also attempted to expand the ranks of conservatives by including community-college faculty members in their statistics. This was especially interesting because they inadvertently added to the growing body of evidence that those conservatives who do stick it out in academic careers end up in disproportionate numbers in the lowest-paying and least prestigious institutions. Maranto catches another piece of this when he points to the recent study by Stanley Rothman of Smith College and S. Robert Lichter of George Mason University, which shows that conservatives have to publish substantially more to get the same benefits that liberals get by publishing substantially less: “Among professors who have published a book, 73 percent of Democrats are in high-prestige colleges and universities, compared with only 56 percent of Republicans.”

Let me save some readers the trouble: WABOC.

Is there anything really to be concerned about in the Left’s preponderance in faculty positions at prestigious public and private universities, ordinary colleges, and so on down the line? Is it useful to bear in mind Gross and Simmons’ point that the hard left, the real radicals, are in the minority, and the mainstream of higher education is just somewhere in the vicinity of Al Gore, Ralph Nader, and perhaps John Edwards? Does it matter that, although the Left predominates in virtually all areas of the university (including, contrary to the stereotype, most business schools), the super-concentration is limited to the social sciences and the humanities? (Gross and Simmons found that 25.5 percent of American sociologists who have academic appointments consider themselves Marxists.)

Maranto, I think, has a pretty compelling point in suggesting that the scarcity of conservative views on campus injures students. If one spends four years (now, more often, five or six years) in an environment where the pieties of the Left are treated as plain common sense; where disdain for America’s history, our key institutions, and our foreign policies is normative; and where the invocation of “diversity” trumps any argument on any subject, one isn’t much prepared to find one’s way in actual American life.

This isn’t a hypothetical circumstance. If you talk to recent college graduates, many of them seem in a state of resentful befuddlement. They have been taught that America is bad. They carry around their disdain for this unworthy nation along with their tattered editions of Howard Zinn’s A People’s History of the United States. Some seem to have arrived from another country — as though they had spent their years in Kazakhstan and knew America via poorly translated comic books. Mostly they do not see themselves as aligned with the political Left. That’s because the category of “Left” is meaningless without a contrasting term, and all they know about conservatives is that they are heartless, exploitative, money-grubbing, hate-mongering hoarders of class, race, and gender privilege.

They may have a cause or two they uphold. AIDS prevention. Abortion. But most often it is the fight against global warming — a fight for which they lack any hint of “critical thought,” despite their four-, five-, or six-year immersion in pedagogy that ostensibly values “critical thinking” above any other intellectual endeavor.

This portrait of our college grads is, as I intend it, exaggerated, but I have run into the genuine article often enough to let it stand. Sending students off to college to hear what Leftist professors have to say about the world is not the problem; the problem is that, if that’s all they hear, they end up with a shallow education and very little grip on the reality of American life. A diet of ideology is like a diet of candy bars, and we have filled up our universities with candy bar salesmen.

Maranto also suggests that the ideological one-sidedness of our university faculties is bad for the nation’s capacity to think its way through its domestic and international problems. Well, yes. Surely reasoned debate among advocates of various principled views would be more promising. The complication, however, is that a substantial chunk of the academic Left has now taken a “principled” stand against reason itself. We name this stand in different ways — post-modernism, identity politics, anti-foundationalism — but it amounts to the view that the yearning for power and authenticity counts more than old-fashioned “reason” and “argument.” This new view elevates “studies” over the traditional “disciplines”: women’s studies, African-American studies, post-colonial studies, area studies, cultural studies, queer studies, and so on. This isn’t just nomenclature. “Studies,” as opposed to “disciplines,” have vague and transient standards of what counts as “scholarship.” They have an affinity for ideological formulation and, characteristically, diminish or even discard the distinctions between fact and opinion, and between evidence and assertion.

Thus, I am not as confident as Maranto that restoring some semblance of ideological balance among faculty members would restore to our universities the opportunity for reasoned debate on important issues. Restoring ideological balance is necessary, but not sufficient. We need to restore as well some basic principles of intellectual inquiry and fair play — and it is not at all clear how that might be done. Do we wait out the decline of post-modernism? But all those “studies” have been busy institutionalizing themselves since the 1980s.

And this is what makes me think of the precarious towers of Blockhead. In a crucial way, the dynamic of higher education in the United States today consists not in building lasting foundations, but in improvising artful imbalance. The faculty at the University of Iowa and many other institutions isn’t conspiring to keep Republicans or conservatives out of academic appointments, but on the other hand it is not attracted to having people around who don’t play the same ideological game. In many fields in the humanities today, it is the highest praise to say that someone has “destabilized” the meaning of a text, or “subverted” a tradition. To “transgress” means to undermine the stale, old, and (in this view) inevitably oppressive order. Those departments caught up in this brave new construction of intellectual towers are not about to appoint, or even seriously consider appointing, colleagues who would speak for the value of the very traditions that the departments are attempting to sunder. Practically speaking, there is much less room for a political scientist who thinks the Cold War was an appropriate response to Soviet aggression; a historian who thinks that the U.S. was right to fight the Vietnam War and could have prevailed; or a professor of English who thinks that there are works of literature the greatness of which lies in the intrinsic power of the writing.

In the game of Blockhead, of course, the tower sooner or later comes crashing down, as its accumulation of instabilities tips over into chaos. The catastrophes, however, are incidental. Out of the rubble rises a new tower, just as wobbly as the last. The game, after all, is about courting catastrophe, and the players are all eventually Blockheads.

The future belongs to Islam

By Mark Steyn
Macleans.ca

The Muslim world has youth, numbers and global ambitions. The West is growing old and enfeebled, and lacks the will to rebuff those who would supplant it. It's the end of the world as we've known it. An excerpt from 'America Alone'.

Sept. 11, 2001, was not "the day everything changed," but the day that revealed how much had already changed. On Sept. 10, how many journalists had the Council on American-Islamic Relations or the Canadian Islamic Congress or the Muslim Council of Britain in their Rolodexes? If you'd said that whether something does or does not cause offence to Muslims would be the early 21st century's principal political dynamic in Denmark, Sweden, the Netherlands, Belgium, France and the United Kingdom, most folks would have thought you were crazy. Yet on that Tuesday morning the top of the iceberg bobbed up and toppled the Twin Towers.

This is about the seven-eighths below the surface -- the larger forces at play in the developed world that have left Europe too enfeebled to resist its remorseless transformation into Eurabia and that call into question the future of much of the rest of the world. The key factors are: demographic decline; the unsustainability of the social democratic state; and civilizational exhaustion.

Let's start with demography, because everything does:

If your school has 200 guys and you're playing a school with 2,000 pupils, it doesn't mean your baseball team is definitely going to lose but it certainly gives the other fellows a big starting advantage. Likewise, if you want to launch a revolution, it's not very likely if you've only got seven revolutionaries. And they're all over 80. But, if you've got two million and seven revolutionaries and they're all under 30 you're in business.

For example, I wonder how many pontificators on the "Middle East peace process" ever run this number:

The median age in the Gaza Strip is 15.8 years.

Once you know that, all the rest is details. If you were a "moderate Palestinian" leader, would you want to try to persuade a nation -- or pseudo-nation -- of unemployed poorly educated teenage boys raised in a UN-supervised European-funded death cult to see sense? Any analysis of the "Palestinian problem" that doesn't take into account the most important determinant on the ground is a waste of time.

Likewise, the salient feature of Europe, Canada, Japan and Russia is that they're running out of babies. What's happening in the developed world is one of the fastest demographic evolutions in history: most of us have seen a gazillion heartwarming ethnic comedies -- My Big Fat Greek Wedding and its ilk -- in which some uptight WASPy type starts dating a gal from a vast loving fecund Mediterranean family, so abundantly endowed with sisters and cousins and uncles that you can barely get in the room. It is, in fact, the inversion of the truth. Greece has a fertility rate hovering just below 1.3 births per couple, which is what demographers call the point of "lowest-low" fertility from which no human society has ever recovered. And Greece's fertility is the healthiest in Mediterranean Europe: Italy has a fertility rate of 1.2, Spain 1.1. Insofar as any citizens of the developed world have "big" families these days, it's the anglo democracies: America's fertility rate is 2.1, New Zealand a little below. Hollywood should be making My Big Fat Uptight Protestant Wedding in which some sad Greek only child marries into a big heartwarming New Zealand family where the spouse actually has a sibling.

As I say, this isn't a projection: it's happening now. There's no need to extrapolate, and if you do it gets a little freaky, but, just for fun, here goes: by 2050, 60 per cent of Italians will have no brothers, no sisters, no cousins, no aunts, no uncles. The big Italian family, with papa pouring the vino and mama spooning out the pasta down an endless table of grandparents and nieces and nephews, will be gone, no more, dead as the dinosaurs. As Noel Coward once remarked in another context, "Funiculi, funicula, funic yourself." By mid-century, Italians will have no choice in the matter.

Experts talk about root causes. But demography is the most basic root of all. A people that won't multiply can't go forth or go anywhere. Those who do will shape the age we live in.

Demographic decline and the unsustainability of the social democratic state are closely related. In America, politicians upset about the federal deficit like to complain that we're piling up debts our children and grandchildren will have to pay off. But in Europe the unaffordable entitlements are in even worse shape: there are no kids or grandkids to stick it to.

You might formulate it like this:

Age + Welfare = Disaster for you;

Youth + Will = Disaster for whoever gets in your way.

By "will," I mean the metaphorical spine of a culture. Africa, to take another example, also has plenty of young people, but it's riddled with AIDS and, for the most part, Africans don't think of themselves as Africans: as we saw in Rwanda, their primary identity is tribal, and most tribes have no global ambitions. Islam, however, has serious global ambitions, and it forms the primal, core identity of most of its adherents -- in the Middle East, South Asia and elsewhere.

Islam has youth and will, Europe has age and welfare.

We are witnessing the end of the late 20th- century progressive welfare democracy. Its fiscal bankruptcy is merely a symptom of a more fundamental bankruptcy: its insufficiency as an animating principle for society. The children and grandchildren of those fascists and republicans who waged a bitter civil war for the future of Spain now shrug when a bunch of foreigners blow up their capital. Too sedated even to sue for terms, they capitulate instantly. Over on the other side of the equation, the modern multicultural state is too watery a concept to bind huge numbers of immigrants to the land of their nominal citizenship. So they look elsewhere and find the jihad. The Western Muslim's pan-Islamic identity is merely the first great cause in a world where globalized pathologies are taking the place of old-school nationalism.

For states in demographic decline with ever more lavish social programs, the question is a simple one: can they get real? Can they grow up before they grow old? If not, then they'll end their days in societies dominated by people with a very different world view.

Which brings us to the third factor -- the enervated state of the Western world, the sense of civilizational ennui, of nations too mired in cultural relativism to understand what's at stake. As it happens, that third point is closely related to the first two. To Americans, it doesn't always seem obvious that there's any connection between the "war on terror" and the so-called "pocketbook issues" of domestic politics. But there is a correlation between the structural weaknesses of the social democratic state and the rise of a globalized Islam. The state has gradually annexed all the responsibilities of adulthood -- health care, child care, care of the elderly -- to the point where it's effectively severed its citizens from humanity's primal instincts, not least the survival instinct. In the American context, the federal "deficit" isn't the problem; it's the government programs that cause the deficit. These programs would still be wrong even if Bill Gates wrote a cheque to cover them each month. They corrode the citizen's sense of self-reliance to a potentially fatal degree. Big government is a national security threat: it increases your vulnerability to threats like Islamism, and makes it less likely you'll be able to summon the will to rebuff it. We should have learned that lesson on Sept. 11, 2001, when big government flopped big-time and the only good news of the day came from the ad hoc citizen militia of Flight 93.

There were two forces at play in the late 20th century: in the Eastern bloc, the collapse of Communism; in the West, the collapse of confidence. One of the most obvious refutations of Francis Fukuyama's famous thesis The End Of History -- written at the victory of liberal pluralist democracy over Soviet Communism -- is that the victors didn't see it as such. Americans -- or at least non-Democrat-voting Americans -- may talk about "winning" the Cold War but the French and the Belgians and Germans and Canadians don't. Very few British do. These are all formal NATO allies -- they were, technically, on the winning side against a horrible tyranny few would wish to live under themselves. In Europe, there was an initial moment of euphoria: it was hard not to be moved by the crowds sweeping through the Berlin Wall, especially as so many of them were hot-looking Red babes eager to enjoy a Carlsberg or Stella Artois with even the nerdiest running dog of imperialism. But, when the moment faded, pace Fukuyama, there was no sense on the Continent that our Big Idea had beaten their Big Idea. With the best will in the world, it's hard to credit the citizens of France or Italy as having made any serious contribution to the defeat of Communism. Au contraire, millions of them voted for it, year in, year out. And, with the end of the Soviet existential threat, the enervation of the West only accelerated.

In Thomas P. M. Barnett's book Blueprint For Action, Robert D. Kaplan, a very shrewd observer of global affairs, is quoted referring to the lawless fringes of the map as "Indian territory." It's a droll joke but a misleading one. The difference between the old Indian territory and the new is this: no one had to worry about the Sioux riding down Fifth Avenue. Today, with a few hundred bucks on his ATM card, the fellow from the badlands can be in the heart of the metropolis within hours.

Here's another difference: in the old days, the white man settled the Indian territory. Now the followers of the badlands' radical imams settle the metropolis.

And another difference: technology. In the old days, the Injuns had bows and arrows and the cavalry had rifles. In today's Indian territory, countries that can't feed their own people have nuclear weapons.

But beyond that the very phrase "Indian territory" presumes that inevitably these badlands will be brought within the bounds of the ordered world. In fact, a lot of today's "Indian territory" was relatively ordered a generation or two back -- West Africa, Pakistan, Bosnia. Though Eastern Europe and Latin America and parts of Asia are freer now than they were in the seventies, other swaths of the map have spiralled backwards. Which is more likely? That the parts of the world under pressure will turn into post-Communist Poland or post-Communist Yugoslavia? In Europe, the demographic pressures favour the latter.

The enemies we face in the future will look a lot like al-Qaeda: transnational, globalized, locally franchised, extensively outsourced -- but tied together through a powerful identity that leaps frontiers and continents. They won't be nation-states and they'll have no interest in becoming nation-states, though they might use the husks thereof, as they did in Afghanistan and then Somalia. The jihad may be the first, but other transnational deformities will embrace similar techniques. Sept. 10 institutions like the UN and the EU will be unlikely to provide effective responses.

We can argue about what consequences these demographic trends will have, but to say blithely they have none is ridiculous. The basic demography explains, for example, the critical difference between the "war on terror" for Americans and Europeans: in the U.S., the war is something to be fought in the treacherous sands of the Sunni Triangle and the caves of the Hindu Kush; you go to faraway places and kill foreigners. But, in Europe, it's a civil war. Neville Chamberlain dismissed Czechoslovakia as "a faraway country of which we know little." This time round, for much of western Europe it turned out the faraway country of which they knew little was their own.

Four years into the "war on terror," the Bush administration began promoting a new formulation: "the long war." Not a good sign. In a short war, put your money on tanks and bombs. In a long war, the better bet is will and manpower. The longer the long war gets, the harder it will be, because it's a race against time, against lengthening demographic, economic and geopolitical odds. By "demographic," I mean the Muslim world's high birth rate, which by mid-century will give tiny Yemen a higher population than vast empty Russia. By "economic," I mean the perfect storm the Europeans will face within this decade, because their lavish welfare states are unsustainable on their post-Christian birth rates. By "geopolitical," I mean that, if you think the United Nations and other international organizations are antipathetic to America now, wait a few years and see what kind of support you get from a semi-Islamified Europe.

Almost every geopolitical challenge in the years ahead has its roots in demography, but not every demographic crisis will play out the same way. That's what makes doing anything about it even more problematic -- because different countries' reactions to their own particular domestic circumstances are likely to play out in destabilizing ways on the international scene. In Japan, the demographic crisis exists virtually in laboratory conditions -- no complicating factors; in Russia, it will be determined by the country's relationship with a cramped neighbour -- China; and in Europe, the new owners are already in place -- like a tenant with a right-to-buy agreement.

Let's start in the most geriatric jurisdiction on the planet. In Japan, the rising sun has already passed into the next phase of its long sunset: net population loss. 2005 was the first year since records began in which the country had more deaths than births. Japan offers the chance to observe the demographic death spiral in its purest form. It's a country with no immigration, no significant minorities and no desire for any: just the Japanese, aging and dwindling.

At first it doesn't sound too bad: compared with the United States, most advanced societies are very crowded. If you're in a cramped apartment in a noisy congested city, losing a couple hundred thousand seems a fine trade-off. The difficulty, in a modern social democratic state, is managing which people to lose: already, according to the Japan Times, depopulation is "presenting the government with pressing challenges on the social and economic front, including ensuring provision of social security services and securing the labour force." For one thing, the shortage of children has led to a shortage of obstetricians. Why would any talented ambitious med school student want to go into a field in such precipitous decline? As a result, if you live in certain parts of Japan, childbirth is all in the timing. On Oki Island, try to time the contractions for Monday morning. That's when the maternity ward is open -- first day of the week, 10 a.m., when an obstetrician flies in to attend to any pregnant mothers who happen to be around. And at 5.30 p.m. she flies out. So, if you've been careless enough to time your childbirth for Tuesday through Sunday, you'll have to climb into a helicopter and zip off to give birth alone in a strange hospital unsurrounded by tiresome loved ones. Do Lamaze classes on Oki now teach you to time your breathing to the whirring of the chopper blades?

The last local obstetrician left the island in 2006 and the health service isn't expecting any more. Doubtless most of us can recall reading similar stories over the years from remote rural districts in America, Canada, Australia. After all, why would a village of a few hundred people have a great medical system? But Oki has a population of 17,000, and there are still no obstetricians: birthing is a dying business.

So what will happen? There are a couple of scenarios: whatever Japanese feelings on immigration, a country with great infrastructure won't empty out for long, any more than a state-of-the-art factory that goes belly up stays empty for long. At some point, someone else will move in to Japan's plant.

And the alternative? In The Children Of Men, P. D. James' dystopian fantasy about a barren world, there are special dolls for women whose maternal instinct has gone unfulfilled: pretend mothers take their artificial children for walks on the street or to the swings in the park. In Japan, that's no longer the stuff of dystopian fantasy. At the beginning of the century, the country's toy makers noticed they had a problem: toys are for children and Japan doesn't have many. What to do? In 2005, Tomy began marketing a new doll called Yumel -- a baby boy with a range of 1,200 phrases designed to serve as a companion for the elderly. He says not just the usual things -- "I wuv you" -- but also asks the questions your grandchildren would ask if you had any: "Why do elephants have long noses?" Yumel joins his friend, the Snuggling Ifbot, a toy designed to have the conversation of a five-year-old child, which its makers, with the usual Japanese efficiency, have determined is just enough chit-chat to prevent the old folks going senile. It seems an appropriate final comment on the social democratic state: in a childish infantilized self-absorbed society where adults have been stripped of all responsibility, you need never stop playing with toys. We are the children we never had.

And why leave it at that? Is it likely an ever smaller number of young people will want to spend their active years looking after an ever greater number of old people? Or will it be simpler to put all that cutting-edge Japanese technology to good use and take a flier on Mister Roboto and the post-human future? After all, what's easier for the governing class? Weaning a pampered population off the good life and re-teaching them the lost biological impulse or giving the Sony Corporation a licence to become the Cloney Corporation? If you need to justify it to yourself, you'd grab the graphs and say, well, demographic decline is universal. It's like industrialization a couple of centuries back; everyone will get to it eventually, but the first to do so will have huge advantages: the relevant comparison is not with England's early 19th century population surge but with England's Industrial Revolution. In the industrial age, manpower was critical. In the new technological age, manpower will be optional -- and indeed, if most of the available manpower's Muslim, it's actually a disadvantage. As the most advanced society with the most advanced demographic crisis, Japan seems likely to be the first jurisdiction to embrace robots and cloning and embark on the slippery slope to transhumanism.

Demographic origin need not be the final word. In 1775, Benjamin Franklin wrote a letter to Joseph Priestley suggesting a mutual English friend might like to apply his mind to the conundrum the Crown faced:

Britain, at the expense of three millions, has killed 150 Yankees this campaign, which is £20,000 a head... During the same time, 60,000 children have been born in America. From these data his mathematical head will easily calculate the time and the expense necessary to kill us all.

Obviously, Franklin was oversimplifying. Not every American colonist identified himself as a rebel. After the revolution, there were massive population displacements: as United Empire Loyalists well know, large numbers of New Yorkers left the colony to resettle in what's now Ontario. Some American Negroes were so anxious to remain subjects of King George III they resettled as far as Sierra Leone. For these people, their primary identity was not as American colonists but as British subjects. For others, their new identity as Americans had supplanted their formal allegiance to the Crown. The question for today's Europe is whether the primary identity of their fastest-growing demographic is Muslim or Belgian, Muslim or Dutch, Muslim or French.

That's where civilizational confidence comes in: if "Dutchness" or "Frenchness" seems a weak attenuated thing, then the stronger identity will prevail. One notes other similarities between revolutionary America and contemporary Europe: the United Empire Loyalists were older and wealthier; the rebels were younger and poorer. In the end, the former simply lacked the latter's strength of will.

Europe, like Japan, has catastrophic birth rates and a swollen pampered elderly class determined to live in defiance of economic reality. But the difference is that on the Continent the successor population is already in place and the only question is how bloody the transfer of real estate will be.

If America's "allies" failed to grasp the significance of 9/11, it's because Europe's home-grown terrorism problems had all taken place among notably static populations, such as Ulster and the Basque country. One could make generally safe extrapolations about the likelihood of holding Northern Ireland to what cynical strategists in Her Majesty's Government used to call an "acceptable level of violence." But in the same three decades as Ulster's "Troubles," the hitherto moderate Muslim populations of south Asia were radicalized by a politicized form of Islam; previously formally un-Islamic societies such as Nigeria became semi-Islamist; and large Muslim populations settled in parts of Europe that had little or no experience of mass immigration.

On the Continent and elsewhere in the West, native populations are aging and fading and being supplanted remorselessly by a young Muslim demographic. Time for the obligatory "of courses": of course, not all Muslims are terrorists -- though enough are hot for jihad to provide an impressive support network of mosques from Vienna to Stockholm to Toronto to Seattle. Of course, not all Muslims support terrorists -- though enough of them share their basic objectives (the wish to live under Islamic law in Europe and North America) to function wittingly or otherwise as the "good cop" end of an Islamic good cop/bad cop routine. But, at the very minimum, this fast-moving demographic transformation provides a huge comfort zone for the jihad to move around in. And in a more profound way it rationalizes what would otherwise be the nuttiness of the terrorists' demands. An IRA man blows up a pub in defiance of democratic reality -- because he knows that at the ballot box the Ulster Loyalists win the elections and the Irish Republicans lose. When a European jihadist blows something up, that's not in defiance of democratic reality but merely a portent of democratic reality to come. He's jumping the gun, but in every respect things are moving his way.

You may vaguely remember seeing some flaming cars on the evening news toward the end of 2005. Something going on in France, apparently. Something to do with -- what's the word? -- "youths." When I pointed out the media's strange reluctance to use the M-word vis-à-vis the rioting "youths," I received a ton of emails arguing there's no Islamist component, they're not the madrasa crowd, they may be Muslim but they're secular and Westernized and into drugs and rap and meaningless sex with no emotional commitment, and rioting and looting and torching and trashing, just like any normal healthy Western teenagers. These guys have economic concerns, it's the lack of jobs, it's conditions peculiar to France, etc. As one correspondent wrote, "You right-wing shit-for-brains think everything's about jihad."

Actually, I don't think everything's about jihad. But I do think, as I said, that a good 90 per cent of everything's about demography. Take that media characterization of those French rioters: "youths." What's the salient point about youths? They're youthful. Very few octogenarians want to go torching Renaults every night. It's not easy lobbing a Molotov cocktail into a police station and then hobbling back with your walker across the street before the searing heat of the explosion melts your hip replacement. Civil disobedience is a young man's game.

In June 2006, a 54-year-old Flemish train conductor called Guido Demoor got on the Number 23 bus in Antwerp to go to work. Six -- what's that word again? -- "youths" boarded the bus and commenced intimidating the other riders. There were some 40 passengers aboard. But the "youths" were youthful and the other passengers less so. Nonetheless, Mr. Demoor asked the lads to cut it out and so they turned on him, thumping and kicking him. Of those 40 other passengers, none intervened to help the man under attack. Instead, at the next stop, 30 of the 40 scrammed, leaving Mr. Demoor to be beaten to death. Three "youths" were arrested, and proved to be -- quelle surprise! -- of Moroccan origin. The ringleader escaped and, despite police assurances of complete confidentiality, of those 40 passengers only four came forward to speak to investigators. "You see what happens if you intervene," a fellow rail worker told the Belgian newspaper De Morgen. "If Guido had not opened his mouth he would still be alive."

No, he wouldn't. He would be as dead as those 40 passengers are, as the Belgian state is, keeping his head down, trying not to make eye contact, cowering behind his newspaper in the corner seat and hoping just to be left alone. What future in "their" country do Mr. Demoor's two children have? My mother and grandparents came from Sint-Niklaas, a town I remember well from many childhood visits. When we stayed with great-aunts and other relatives, the upstairs floors of the row houses had no bathrooms, just chamber pots. My sister and I were left to mooch around cobbled streets with our little cousin for hours on end, wandering aimlessly past smoke-wreathed bars and cafes, occasionally buying frites with mayonnaise. With hindsight it seemed as parochially Flemish as could be imagined. Not anymore. The week before Mr. Demoor was murdered in plain sight, bus drivers in Sint-Niklaas walked off the job to protest the thuggery of the -- here it comes again -- "youths." In little more than a generation, a town has been transformed.

Of the ethnic Belgian population, some 17 per cent are under 18 years old. Of the country's Turkish and Moroccan population, 35 per cent are under 18 years old. The "youths" get ever more numerous, the non-youths get older. To avoid the ruthless arithmetic posited by Benjamin Franklin, it is necessary for those "youths" to feel more Belgian. Is that likely? Colonel Gadhafi doesn't think so:

There are signs that Allah will grant Islam victory in Europe -- without swords, without guns, without conquests. The fifty million Muslims of Europe will turn it into a Muslim continent within a few decades.

On Sept. 11, 2001, the American mainland was attacked for the first time since the War of 1812. The perpetrators were foreign -- Saudis and Egyptians. Since 9/11, Europe has seen the London Tube bombings, the French riots, Dutch murders of nationalist politicians. The perpetrators are their own citizens -- British subjects, citoyens de la République française. In Linz, Austria, Muslims are demanding that all female teachers, believers or infidels, wear head scarves in class. The Muslim Council of Britain wants Holocaust Day abolished because it focuses "only" on the Nazis' (alleged) Holocaust of the Jews and not the Israelis' ongoing Holocaust of the Palestinians.

How does the state react? In Seville, King Ferdinand III is no longer patron saint of the annual fiesta because his splendid record in fighting for Spanish independence from the Moors was felt to be insensitive to Muslims. In London, a judge agreed to the removal of Jews and Hindus from a trial jury because the Muslim defendant's counsel argued he couldn't get a fair verdict from them. The Church of England is considering removing St. George as the country's patron saint on the grounds that, according to various Anglican clergy, he's too "militaristic" and "offensive to Muslims." They wish to replace him with St. Alban, whose cross -- a thin yellow streak -- would take the place of St. George's on a revamped Union Flag.

In a few years, as millions of Muslim teenagers are entering their voting booths, some European countries will not be living formally under sharia, but -- as much as parts of Nigeria -- they will have reached an accommodation with their radicalized Islamic compatriots, who like many intolerant types are expert at exploiting the "tolerance" of pluralist societies. In other Continental countries, things are likely to play out in more traditional fashion, though without a significantly different ending. Wherever one's sympathies lie on Islam's multiple battle fronts, the fact is the jihad has held out a long time against very tough enemies. If you're not shy about taking on the Israelis and Russians, why wouldn't you fancy your chances against the Belgians and Spaniards?

"We're the ones who will change you," the Norwegian imam Mullah Krekar told the Oslo newspaper Dagbladet in 2006. "Just look at the development within Europe, where the number of Muslims is expanding like mosquitoes. Every Western woman in the EU is producing an average of 1.4 children. Every Muslim woman in the same countries is producing 3.5 children." As he summed it up: "Our way of thinking will prove more powerful than yours."