Thursday, January 29, 2009

Fear Hath No Shelf-Life: Our Torture Dilemma

By Robert D. Kaplan
The Atlantic

The torture debate is critical not only because it gets us to the core of our values, but because the danger to American cities is not from tanks and armies, but from individuals and their intentions. Saving thousands of American lives may come down to the gifts of a talented interrogator and the tools at his or her disposal. Remember that usually only in the movies does a prisoner spill the beans on an upcoming plot. As interrogators will tell you, information about terrorist activities tends to come in fragments that are assembled from scores of interrogations, even as the truth is distilled—accidentally almost—from a spewing forth of lies and subtle evasions. The front line of our defense against al-Qaeda and its offshoots is painstaking, tedious work that rewards those best able to fill in the blank spaces from a shattered jigsaw puzzle.

Interrogators, because they deal with a single combatant face to face for hours at a time, often develop more sympathy for the enemy than anyone else in our security establishment. After all, the combatant, because he has a real face, becomes human to them. Such sympathy is necessary if they are to do their jobs well. “To defeat the enemy you first have to love them—that is, their culture,” an army special forces lieutenant colonel told me years ago in Afghanistan.

Good interrogators become masters at discerning body language and eye movements in the person they are questioning. Their worst enemy is not the inability to torture, but their own bureaucracy, which often doesn’t share vital information with its constituent parts. There is a lot we can do to improve the quality of our interrogations without torturing people.

And yet the problem is not that easily solved. It nags at one. Stating flatly that torture doesn’t work is a narrow version of the truth. Torture may not work, but the fear of it can work wonders. When a prisoner is captured, you may want him to be roughly handled, and put in grim and disorienting surroundings, for that heightens the fear of what might happen to him next. “Fear is often an interrogator’s best ally, but it doesn’t have a long shelf life,” write Chris Mackey and Greg Miller, authors of The Interrogators: Inside the Secret War Against al-Qaeda, a very wise book by a former interrogator and a journalist. Fear can be extended for a time through clever techniques such as spreading false rumors. Example: the authors report that they received added cooperation from prisoners in Afghanistan after deliberately spreading a rumor that they were going to be sent back to their home countries in the Middle East in return for a $100,000 payment per man. But ultimately, as the weeks pass and nothing bad happens to him, the prisoner’s fear fades and he becomes less useful. While torture is bad, the thoroughly humane approach, contrary to our desires, has its limits. And that is our dilemma.

Spreading rumors, rough treatment, and grim conditions are all the very beginnings of a slippery path toward much worse things. To say that we cannot take even one step along that path is impractical, for that would deprive the nation of vital intelligence and threaten to leave it defenseless. But to go far along that path is morally unbearable. So the debate is really about how far we do go.

The extreme example that’s frequently discussed—Should we resort to torture in the event of a ticking bomb?—is interesting but statistically not very useful, since most scenarios are much more ambiguous. And again, a single prisoner is more likely to reveal a shard of evidence than the where and when of a plot.

The authors Mackey and Miller describe a group of interrogators who solved their moral dilemma through the use of a technique called “monstering.” That is, if the interrogator had to put up with the same ill-treatment as the prisoner, then it wasn’t immoral. To wit, sleep deprivation combined with long interrogations was allowed if the interrogator himself went without sleep for the same amount of time. But to double-team the prisoner, with one interrogator sleeping while the other worked the prisoner over verbally, was considered immoral. Monstering was a matter of who broke first, the interrogator or the prisoner. And if the prisoner broke first, even part of the time, it was worth doing.

Of course, one could carry this logic as far as waterboarding, since many of our own special operations forces and combat pilots have been waterboarded as part of their training at Survival, Evasion, Resistance, and Escape (SERE) School. Indeed, far more Americans have been waterboarded than prisoners at Guantanamo Bay. But while the Americans who have been waterboarded have been physically and psychologically trained to perfection beforehand, and experience it only for short periods, unsuspecting detainees experience waterboarding under far more onerous circumstances. So what is hard, tough training in one instance, can be torture in another.

But if we can’t—or shouldn’t—waterboard, then how far do we push the envelope? Because if we don’t push it somewhat, we face what I call the 15 percent dilemma. Let me explain.

If we act like angels, and another, even more massive attack occurs in the United States, then the public might cry for blood, as it did to an extent in the immediate days and weeks after 9/11, when torture was occasionally spoken of in different terms than it is now. Remember that some Bush Administration policies were drawn up in the context of one public mood, and carried out in the context of another. Were that to happen, detainees’ rights might decline by, say, 50 percent. But if we push the envelope only 15 percent along that dangerous path, then we might avoid the 50 percent trap, and save many of our own lives.

But even 15 percent makes me queasy. To avoid the question, though, is itself irresponsible.

High Table: Luther

By Melissa Schubert
Scriptorium Daily

Our reading assignment: Martin Luther’s “Theses for the Heidelberg Disputation,” “Two Kinds of Righteousness,” and “The Freedom of the Christian” from Martin Luther’s Basic Theological Writings.

Faculty training at the Torrey Honors Institute puts our pedagogy to the test. We call our semesterly training meetings ‘High Tables.’ They are nothing as lofty as they sound: we read a curricular text and, led by a colleague, discuss it for three hours.

Led by Dr. Matt Jenson (whose The Gravity of Sin includes some great work on Luther’s doctrine of the same), last week my colleagues and I spent some good words (and a few shabby ones) discussing what Luther might teach us about the gospel, faith, Christ’s finished work, virtue, and good works.

It is the bane of familiarity to breed contempt. But may it never be that the Christian tires of considering the mighty gospel and the great gift of faith. For my own part, I confess that I so often forget and pervert the gospel that I can always use a good reminder. I include my favorite section from “The Freedom of the Christian” here in case your memory, too, could use refreshing.

“The third incomparable benefit of faith is that it unites the soul with Christ as a bride is united with her bridegroom… And if they are one flesh and there is between them a true marriage—indeed, the most perfect of all marriages, since human marriages are but poor examples of this one true marriage—it follows that everything they have they hold in common, the good as well as the evil… Let faith come between them and sins, death, and damnation will be Christ’s, while grace, life, and salvation will be the soul’s: for if Christ is a bridegroom he must take upon himself the things which are his bride’s and bestow upon her the things that are his.” And Christ submitting to suffering, death, and hell, breaks their power: “for his righteousness is stronger than all the sins of man, his life stronger than death, his salvation more invincible than hell… Thus the believing soul by means of the pledge of its faith is free in Christ, its bridegroom, free from all sins, secure against death and hell, and is endowed with the eternal righteousness, life, and salvation of Christ its bridegroom.”

Some highlights of our discussion follow.

Overarching enquiry: Jenson began our discussion with the following question: Does Luther think the Christian should be virtuous? The question cleverly distracts from the more Lutheran terms ‘works’ and ‘righteousness’ to recall classical concepts of the ethical life, often appropriated for Christian constructs of spirituality. Can the Christian become good? Not in her own power. Should the Christian—knowing her life to be hid in Christ with God, her righteousness to be alien, her only hope the finished work of Christ—presume to become “righteous, free, and a Christian by means of some good work”? Only at the cost of her faith. But Luther, too, teaches that “faith does good works.” So, shouldn’t the Christian be virtuous?

Most radical enquiry: Is it good for a rich man to give his money to the poor if he believes that he is good for doing so? Luther, it seems, says no. In the Heidelberg Disputation Luther claims, “The works of men may always be attractive and seemingly good. It appears nevertheless that they are mortal sins.” Any presumption to be good apart from Christ kills my soul (my body, already mortal). Better for my neighbor to lie, cheat and steal than to be a “good person” because his works are “all the more deadly” when done with “pure and evil assurance.”

Most lexical enquiry: Of course, we asked what ‘faith’ is in Luther. Luther rejects faith as a virtue, but still calls it a hard thing. Its object is Christ (rather than, say, the whole creed). Are ‘trust,’ ‘belief,’ or ‘knowledge’ helpful synonyms?

Most grammatically technical enquiry: Luther writes, “Faith alone is the saving and efficacious use of the Word of God.” It seemed to me that this sentence was syntactically awkward in order to suggest that the efficacious Word of God occasions faith. But is there an implied subject, the one who only uses the Word of God effectively when she believes what she hears and reads (or ministers the Word to the same end)? Who is the origin of faith: the one who hears and believes or the One who promises and speaks?

Most troubling enquiry: Does Luther make the gospel a legal fiction?

Most biographical enquiry: Johann von Staupitz was Luther’s confessor during the season of Luther’s life wherein the knowledge of his sin was making him inconsolable. How do his various counsels to the angst-ridden Luther contribute to Luther’s theology of the cross?

Most pedagogical enquiry: Which hymn might you sing with a class to prepare them for the discussion? Professor Henderson (our resident hymn-leader) prefers the more thematically apt “When I Survey the Wondrous Cross” to the more obvious choice, the Luther-penned “A Mighty Fortress.” Other options include “O Sacred Head Now Wounded” and “Jesus, Lover of my Soul.”

Most pastoral enquiry: What are good words for students who express anxiety about their faith: is it enough? Luther speaks hard words of peace to the introspective: stop looking at yourself. Look to Christ, the author and finisher of our faith. Do not pervert faith by considering it a meritorious work. Despair of yourself, even with respect to your faith, so long as you look to Christ.

Most tacit enquiry: Talking doctrine, especially such historically divisive doctrine, might seem combative and merely heady. But reading and talking Luther with my colleagues was much more hearty and lively than that. The core of our conversation suggested to me another Heidelberg reminder, this time from the catechism, a document that would have birthed a Luther had he not been one of its fathers:

Q: Christian, what is your only comfort in life and in death? A: That I am not my own, but belong—body and soul, in life and in death—to my faithful Savior Jesus Christ.

He has fully paid for all my sins with his precious blood, and has set me free from the tyranny of the devil. He also watches over me in such a way that not a hair can fall from my head without the will of my Father in heaven: in fact, all things must work together for my salvation.

Because I belong to him, Christ, by his Holy Spirit, assures me of eternal life and makes me wholeheartedly willing and ready from now on to live for him.

Conversation and Conversion

By John Mark Reynolds
Scriptorium Daily

Islam has succeeded in much of the world partly because it is intellectually interesting, culturally potent, and has spiritual power. One can acknowledge this while also believing Islam to be fundamentally wrong and knowing the darker side of Islamic history.

President Obama was right to reach out to the Islamic world at his inauguration. Given his background, Obama has a unique chance to promote what is good about the United States while distinguishing those virtues from our problems and failures.

Prejudice sometimes blocks dialogue and, unfortunately, serious thought about Islam has suffered from two sorts of it in the United States.

Extreme secularists cannot engage in real dialogue or a sympathetic study of religions like Islam, because they cannot admit the possibility that any religion could be true. Any American approach to the Islamic world that refuses to talk about religion or treat it seriously is doomed to failure. One can go to the Super Bowl and refuse to talk about the game; there will be quite a few people there just for the “event,” but the majority will think, and are entitled to think, that such people are missing the point. It would be even more foolish to go to many football games with ardent fans and never bother to learn a single thing about football.

In the same way, dialogue about politics will inevitably bring moral and religious ideas into the discussion, but much of our leadership is educationally ill prepared to start. Many of our elite educational programs ignore serious treatment of religion. Their graduates often don’t even know the names of the players!

Other narrow-minded Americans act as if open-minded dialogue and study of a religion not their own couldn’t possibly be of benefit. Of course, being right about one thing has never been a very good guarantee that you are right about everything else! Even if your religion is correct, a humble and open-minded approach to other perspectives is always valuable. You might, after all, be wrong; and if you are right, such an approach makes it more likely that your dialogue partner will listen to you.

Real dialogue begins with both sides acknowledging they might have something to learn and being willing to listen and change. Like former President George W. Bush, President Obama has the right approach to this topic, but unlike Bush he has the global popularity and perceived credibility to advance American interests.

It is obvious that non-Muslims will benefit from this exchange.

Islam has formed important, beautiful, and long lasting human cultures. These have produced orderly nations that have made important contributions to world culture. Order and the rule of law are basic requirements of higher civilizations that Americans are fortunate enough to be able to take for granted. We have sustained them for a very short period of time compared to many Islamic people groups.

Order in Islam comes with charitable activity and a commendable concern for the poor and disadvantaged. For the most part Islam is part of the modern monotheistic moral consensus that dominates most of the world. Most branches of Islam encourage private and public moral behavior that most non-Muslim Americans would also encourage.

Americans too often see the obvious failures of nations where most people are Muslims and simplistically attribute all those failures to Islam. We forget that much of the leadership class in those nations for the last one hundred years was educated in Western socialist economic ideas and was encouraged to adopt totalitarian forms of government from places such as Moscow and Berlin. It is unjust to blame Islam for the relative poverty of nations like Egypt and Indonesia when socialist and nationalist economic policies, put in place by men who were not particularly religious, are a more likely explanation.

Ironically, so-called “Islamic” terrorist groups adopt economic and political policies that have historical roots in European secularism rather than the Koran. Marx is often a better guide than Muhammad in predicting their political and economic behavior!

Dialogue will benefit both sides.

Philosophers and theologians already benefit from the exchange. For example, non-Muslim philosophers of religion currently use arguments such as the kalaam argument for the existence of God that have roots in Islamic philosophy. Any philosopher hoping to understand Platonic and Aristotelian thought will benefit from reading Islamic thinkers. Scholars in many other fields, ranging from law to the arts, gain from careful study of Islamic ideas and achievements.

Of course, this dialogue goes both ways and must take ideas seriously enough to admit the possibility of both sides fundamentally changing their minds. If a scholar becomes convinced that Islam, Christianity, secularism or any other idea is wrong, they must have the right to pursue their ideas to their logical conclusion in the public square without fear.

Many dominantly Christian nations and many dominantly secular nations allow this freedom, but it cannot be taken for granted in much of the Islamic world.

Clergy and laity of my church who argue that Christianity is right and Islam is wrong have been murdered or face credible threats of death. Their governments do not protect them. This is not ancient history, but is happening now. Conversation must carry the possibility of change and conversion to be meaningful. Conversion from one faith to another must not carry a death sentence!

Terrorism also stops all discussion. We cannot listen to the ideas of terrorists over the screams of their victims. There can be no dialogue with a person, religious or secular, who will not denounce terrorism. There should be no dialogue with those who will not admit the right of Israel to exist or who wish to exterminate the Jewish people.

Our President is right to show no tolerance for any person who despises American virtues. American society allows for a great deal of individual liberty and this is remarkable and good given world history. Surely, however, the hundreds of millions of people in the Islamic world who see our excesses and vice also have a point. When thoughtful external critics say that the American lifestyle has become too associated with instant gratification, a lack of community values, and hedonism, we should listen.

When terrorists use our vices as an excuse to attack us for our virtues, we should fight and win.

President Obama has said these things more eloquently than I can, but now must implement them. President George W. Bush was successful in keeping America free from further terrorist attacks after 9/11. This was an enormous accomplishment for which he has received too little credit, but Bush failed to successfully communicate our ideals to the rest of the world. President Obama must build on the success of the Bush administration while correcting this failure. All patriots hope he does so.

Wednesday, January 28, 2009

When Shall We Be Made Like Our Christ?

By R. A. Torrey
Scriptorium Daily

Q: When will the full change (begun in us by the change in heart) be completed? That is, when shall we be made like our Christ?

A: The moment that one is born again by the power of the Holy Ghost, he is made in a measure like Christ. In his standing before God, he is just like Christ, perfectly accepted, “justified from all things” (Acts 13:38, 39; Rom. 8:1).

Furthermore, at that time he is made in character like Christ, but not perfectly so; as he feeds upon the Word and has personal communion with Christ he becomes more and more like his Lord day by day (II Cor. 3:18). This is a progressive process. He is changed into the image of the Lord “from glory to glory.” That is, in each new time of communion with Him, he catches something more of the Lord’s glory. The transformation is completed and we become perfectly like our Lord at His return. When He comes again, “We shall be like Him, for we shall see Him as He is.” (I John 3:2)

Q: Where does it say in the Bible that God prepared hell for man?

A: It does not say so anywhere in the Bible. On the other hand, we are distinctly told in the Bible that the everlasting fire was prepared for the devil and his angels (Matt. 25:41). It was not God’s purpose or desire that any man should go there; it was not prepared for him. But if any man chooses to cast in his lot with the Devil and his angels by persistently rejecting Jesus Christ and continuing in sin, then he shall have to share the destiny of the Devil and his angels, and be “tormented day and night forever and ever” (Rev. 20:15). But he goes there of his own choice and in spite of all God’s gracious works to save him.

Originally published in The King’s Business, January 1913 (Volume 4, Issue 1, p. 38).

Wednesday, January 21, 2009

The Keys of the Kingdom

By R. A. Torrey
Scriptorium Daily


Q: Please explain Matt. 16:19, “And I will give unto thee the keys of the kingdom of heaven; and whatsoever thou shalt bind on earth shall be bound in heaven; and whatsoever thou shalt loose on earth shall be loosed in heaven.”

A: It was the custom when our Lord was here upon earth to give to the one who had gone through the Rabbinical schools a key as a symbol of his ability to unlock the truth to men. Our Lord by saying unto Peter that He would give to him the keys of the kingdom of heaven simply meant what the figure indicates, that He would give to him the ability to unlock the truth of the kingdom of heaven to men. We see Peter using the keys with the Jews and unlocking the truth and the door into the kingdom to them in Acts 2:38 and the following verses. We see Peter using the keys with the Gentiles and unlocking the truth and the door into the kingdom to them in Acts 10:34-43. Our Lord made the promise to Peter as a Spirit taught man (cf. V. 16), and every one who is taught of the Spirit has this same power of the keys.

In the usage of our Lord’s day, to “bind and to loose” meant respectively to forbid or to permit. It was a proverb in that day that “What Shammai (a very strict rabbi) binds (that is, forbids), Hillel (a more moderate teacher) looses (that is, permits).” So the words used in this verse simply meant that what Peter as a Spirit-taught man forbade was what was forbidden in heaven, and what he permitted was what was permitted in heaven; that is, the teaching of Peter in regard to what a man ought to do or not to do was the true heavenly doctrine. This promise was not made to Peter as a priest or pope, but simply as a Spirit-filled man.

In a similar way our Lord said to His disciples in John 20:22, 23, when He had breathed on them and bestowed upon them the gift of the Holy Ghost, “Whosoever sins ye (that is, ye as Spirit-filled men) remit, they are remitted unto them; and whosoever sins ye (that is, ye as Spirit-filled men) retain, they are retained.” In other words, the Spirit-filled teacher has discernment to know where there is true repentance and faith, and in a case where he, as a Spirit-filled teacher, discerns true repentance and faith and declares sins remitted, there is really true repentance and faith and those sins “are remitted.” We see the exercise of this power on Peter’s part in Acts 8:20-23. There is no indication that this power was bestowed upon Peter as a priest or as a pope, but simply as a Spirit-filled man.

First published in The King’s Business, January 1913, p. 38.

Tuesday, January 20, 2009

The Humanities Move Off Campus

By Victor Davis Hanson
City Journal

As the classical university unravels, students seek knowledge and know-how elsewhere.
Autumn 2008

Until recently, classical education served as the foundation of the wider liberal arts curriculum, which in turn defined the mission of the traditional university. Classical learning dedicated itself to turning out literate citizens who could read and write well, express themselves, and make sense of the confusion of the present by drawing on the wisdom of the past. Students grounded in the classics appreciated the history of their civilization and understood the rights and responsibilities of their unique citizenship. Universities, then, acted as cultural custodians, helping students understand our present values in the context of a 2,500-year tradition that began with the ancient Greeks.

But in recent decades, classical and traditional liberal arts education has begun to erode, and a variety of unexpected consequences have followed. The academic battle has now gone beyond the in-house “culture wars” of the 1980s. Though the argument over politically correct curricula, controversial faculty appointments, and the traditional mission of the university is ongoing, the university now finds itself being bypassed technologically, conceptually, and culturally, in ways both welcome and disturbing.

At its most basic, the classical education that used to underpin the university often meant some acquaintance with Greek and Latin, which offered students three rich dividends. First, classical-language instruction meant acquiring generic methods of inquiry. Knowledge was no longer hazy and amorphous, but categorized and finite. Classical languages, like their Western successors, were learned through the systematic study of vocabulary, grammar, and syntax. Such philological study then widened to reading poetry, philosophy, history, and oratory. Again, the student learned that there was a blueprint—a structure—to approaching education. Nothing could ever be truly new in itself but was instead a new wrinkle on the age-old face of wisdom. Novel theories of education and entirely new disciplines of learning—to the extent that they were legitimate disciplines—could take their place within existing classical divisions of finite learning, such as philosophy, political science, or literature.

More than just an educational buzzword, then, “interdisciplinary” represented a real unity among fields as diverse as numismatics, epigraphy, architecture, archaeology, philology, art, and literature. Reading Homer or Virgil evoked history, culture, geography, style, language, and philosophy. Poetry was not just the modern habit of breaking up prose into bits and pieces but a discipline of poetic language, meter, and subject matter. Oratory was not just speaking publicly but the art of metaphor, allusion, exaggeration, invective, and hyperbole. The formation of university departments, the concept of a core general-education curriculum, and the expectation that graduates would leave the university with certain skills and shared wisdom were all outgrowths of the study of classics and evolved over two millennia. Classics was not some esoteric discipline but a holistic way of thinking about the world that elevated reason over cant, fad, and superstition.

Second, classical education—reading Homer, Sophocles, and Aristotle, or studying the Delphic Charioteer and red-figure vase painting—conveyed an older, tragic view of man’s physical and mental limitations at odds with the modern notion of life without limits. Love, war, government, and religion involved choices not between utopian perfection and terrible misery but between bad and worse alternatives, or somewhat good and somewhat better options—given the limitations of human nature and the precarious, brief span of human life. Humility permeated traditional liberal arts education: the acceptance that we know very little; that as frail human beings, we live in an unforgiving natural world; and that culture can and should improve on nature without destroying it.

In this regard, the university living experience—on-campus residence, close association with professors at dinners, and attendance at university lectures—helped reinforce the abstract lessons of the classroom and promote a certain civic behavior. Students had a precious four years in such a landscape to prepare their intellectual and moral skills for a grueling life ahead. The university was a unique place; it thrived because liberal arts in the holistic sense simply could not be emulated by, or outsourced to, private enterprise or ad hoc self-improvement training.

Third, classical education was a window on the West. Study of Athenian democracy, Homeric epic, or Roman basilicas framed all exploration of subsequent eras, from the Middle Ages to modernity. An Aquinas, Dante, Michelangelo, or Montesquieu could be seen as reaffirming, adopting, modifying, or rejecting something that the Greeks or Romans had done first. One could no more build a liberal education without some grounding in the classics than one could construct a multistory house without a foundation.

Over the last four decades, various philosophical and ideological strands united to contribute to the decline of classical education. A creeping vocationalism, for one, displaced much of the liberal arts curriculum in the crowded credit-hours of indebted students. Forfeiting classical learning in order to teach undergraduates a narrow skill (what the Greeks called a technê) was predicated on the shaky notion that undergraduate instruction in business or law would produce superior CEOs or lawyers—and would more successfully inculcate the arts of logic, reasoning, fact-based knowledge, and communication so necessary for professional success.

A therapeutic curriculum, which promised that counseling and proper social attitudes could mitigate such eternal obstacles to human happiness as racism, sexism, war, and poverty, likewise displaced more difficult classes in literature, language, philosophy, and political science. The therapeutic sensibility burdened the university with the task of ensuring that students felt adjusted and happy. And upon graduation, those students began to expect an equality of result rather than of opportunity from their society. Gone from university life was the larger tragic sense. Few students learned (or were reminded) that we come into this world with limitations that we must endure with dignity and courage rather than deal with easily through greater sensitivity, more laws, better technology, and sufficient capital.

Political correctness, meanwhile, turned upside-down the old standard of inductive reasoning, the linchpin of the liberal arts. Students now were to accept preordained general principles—such as the pernicious legacy of European colonialism and imperialism and the pathologies of capitalism, homophobia, and sexism—and then deductively to demonstrate how such crimes manifested themselves in history, literature, and science. The university viewed itself as nearly alone in its responsibility for formulating progressive remedies for society’s ills. Society at large, government, the family, and religion were hopelessly reactionary.

As classical education declined and new approaches arose to replace it, the university core curriculum turned into a restaurant menu that gave 18-year-olds dozens of classes to choose from, the easiest and most therapeutic usually garnering the heaviest attendance. The result, as many critics have noted, is that most of today’s students have no shared notion of education, whether fact-based, requisite knowledge or universal theoretical methodologies. They either do not know what the Parthenon is or, if they do, they do not understand how its role as the democratic civic treasury of the Athenians was any different from—much less any “better” than—what went on atop the monumental Great Temple of Tenochtitlán. Most likewise could not distinguish Corinthian from Doric columns on their venerable campuses, or a frieze from a pediment on their administration buildings. For a brief four-year period, students inherit a now-foreign vocabulary of archaic terms, such as “provost,” “summa cum laude,” and “honorarium,” which they employ but usually do not understand. While the public may not fully appreciate the role that classical education once played, it nonetheless understands that university graduates know ever less, even as the cost of their education rises ever more. Any common, shared notion of what it means to be either a Westerner or an American is increasingly rare.

The universities apparently believed that their traditional prestige, the financial resources of their alumni, and the fossilized cultural desideratum of “going to college” would allow them to postpone a reckoning. But by failing in their central mission to educate our youth, they have provoked the beginnings of an educational counterrevolution. Just as the arrogance and ideological biases of the mainstream media have made them slow to appreciate technological trends and the growing dissatisfaction of their audience, so, too, are universities beginning to fragment, their new multifaceted roles farmed out to others that can do them more cheaply and with less political sermonizing.

The most obvious challenge to university predominance is technological—in particular, Internet-based education offered by private-sector virtual campuses masquerading as traditional universities. As the American workforce increasingly needs retraining and as higher-paying jobs demand ever more specialized skills, students are beginning to pay for their education on a class-by-class basis through distance learning. Online classes, which do not require campus residence or commuting, also eliminate the overhead of highly paid, tenured faculty, campus infrastructure, and such costly elements of undergraduate education as on-campus lectures and extracurricular activities.

Unfortunately, private online schools also do away with the old notion of offering liberal arts classes to enrich citizenship and enhance technological specialization. Perhaps their unspoken premise is that if universities do not believe in the value of teaching Western civilization as part of a mandated general-education curriculum, then why not simply go to the heart of the matter and offer computer-programming skills or aeronautical-engineering know-how without the pretense of a broad education? And who is to say that paid-by-the-hour instructors at the online University of Phoenix are less responsible teachers than their traditional counterparts? After all, their market-driven employers must serve a paying constituency that, unlike traditional university students, often demands near-instant results for its fees.

At American Military University, it’s worth noting in this light, online instructors receive compensation based on the number of students they teach, rather than the number of courses they offer. Cost-cutting measures are radical in the online education world. Bookstores and libraries become almost superfluous; instead, students simply pay fees for the use of Internet resources. The University of Phoenix actually negotiates deals with textbook publishers to make all of their books available online for a flat fee. The logic is to redefine education as an affordable product that finds its value in the marketplace among competing buyers and sellers.

It’s hard to fault these companies; they are serving a need. It would be reassuring, certainly, to think that a psychology student at Smith or Occidental would receive a broader understanding of the discipline, its history, and its place within the liberal arts than would a counterpart graduating from the far cheaper online Argosy University. But it would be far from certain.

Traditional colleges and universities, seeking to compete, have started to enter the online education market. The present university system is partly subsidized by low-paid, part-time faculty without tenure who teach large classes and thereby support a smaller mandarin cohort of tenured professors with full benefits, fewer students, and little worry about the consequences of poor peer reviews or student evaluations. Indeed, since the 1970s, the percentage of tenured and tenure-track professors in the academy has declined dramatically, as the university seeks to exploit the many to pay for the chosen, though dwindling, few. Schools are now starting to complement these two tiers with a third—a new sort of distance-learning adjunct, paid even less, who offers classes via the Internet and may never venture onto campus at all, but whose courses carry the prestige of a well-known university brand. An informal survey suggests that distance learning now makes up as much as 20 percent of total offered classes at some schools.

One can also see a growing cultural reaction to the modern university in the spread of conservative Christian colleges. According to the Council for Christian Colleges and Universities, enrollment in such schools increased 70.6 percent between 1990 and 2004, versus 12.8 percent for public universities and 28 percent for all private universities. The national news media have split into genres predicated on political partisanship: network news, public radio, and large newspapers for liberals; and talk radio, cable news, and Internet sites for conservatives. So, too, have our mainstream universities, promising free thought but in reality indoctrinating their students, become increasingly distinct from religious colleges and universities that take pride in a more classical curriculum.

The religious schools are recognizing their market advantage. What was once the old Bible school has now often become the popular conservative antidote to the liberal university. Liberty University and Oral Roberts University have seen endowments and enrollments soar as they have broadened their mandates to encompass general cultural conservatism rather than solely religious orthodoxy. Liberty University is no longer Jerry Falwell’s weird and tiny Liberty Baptist College of the 1970s but has swelled to more than 20,000 undergraduate and graduate students, with another 4,500 enrolled in online graduate programs alone. Thirty years ago, Fresno Pacific College was a small evangelical Mennonite campus; today, its successor, Fresno Pacific University, is a generic traditional campus that offers an alternative to the cumbersome bureaucracy and politically charged culture of nearby California State University, Fresno. The teacher-credential program at Fresno Pacific’s education school, for example, has earned regional acknowledgment for being more rigorous, better organized, and freer from therapeutic and political biases than its much larger counterpart at CSU, Fresno.

The growth of classically minded religious colleges is not limited to the Protestant evangelical movement. Against-the-grain Catholic schools have flourished, too, offering an alternative not just to Berkeley, Wisconsin, and Amherst but also to increasingly liberal Notre Dame and Santa Clara, which have abandoned traditional Catholic themes and classical values. Thomas Aquinas College, founded in 1969, to take one example, has won recognition for its traditional curriculum. A few nonreligious schools, too, like Hillsdale College and St. John’s College, concentrate solely on the classical curriculum, offering Great Books–based courses whose very success serves as an effective critique of higher education elsewhere.

It’s no accident that millions of laypeople don’t find endowed professors at elite schools interesting or useful. Many public universities have rejected merit pay for faculty on the grounds that academic or teaching excellence is impossible to quantify. More elite private universities have embraced a star system of compensation, but in the liberal arts, the criteria of evaluation usually hinge on esoteric and jargon-laden scholarly publications, not teaching excellence. So those who wish to discover history or literature—to learn about the Founding Fathers or military history, say—often look outside the university, to public intellectuals on television and noted best-selling authors like David McCullough or John Keegan.

Private companies have made considerable profits by responding to the public hunger for inspired teaching of traditional liberal arts. The Teaching Company markets prerecorded lectures with rich content in history, literature, and other subjects from proven classroom stars, many of whom have found far less success under normal academic evaluation. Rosetta Stone’s software offers foreign-language instruction in dozens of languages, without the embedded cultural sermonizing that often characterizes foreign-language departments’ curricula. In a series of CDs from a company called Knowledge Products, marketed as “Giants of Philosophy,” the late Charlton Heston narrates excerpts from the seminal philosophers of the Western tradition. Consumers understand that they are buying the words of the philosophers themselves, read and explained by a skilled orator and actor, and skipping the postmodern jargon and leftist bias.

In the future, to learn professions, many students will enroll in specific classes to master accounting, programming, or spreadsheets, and not feel the need to study inductive reasoning or be equipped with the analogies and similes supplied by great literature and the study of history. If, later in life, graduates feel robbed of such a classical foundation, they can buy CDs and recorded lectures or take self-administered correspondence courses. Since universities are no longer places for disinterested investigation in the manner of Socratic inquiry, one can envision a future in which there will be liberal schools and conservative schools, and religious schools and antireligious schools. But the old, classical, unifying university will then have completed its transformation into a multiversity: knowledge, imbued with politics and ideology, will be fragmented, balkanized, and increasingly appropriated by for-profit companies.

Traditional colleges and universities aren’t about to die, of course. But their attractions—and especially the enticements of the Ivy League schools, Stanford, Berkeley, and such private four-year colleges as Amherst and Oberlin—will largely derive from the status that they convey, the career advantages that accrue from their brand-name diplomas, and the unspoken allure of networking and associating with others of a similarly affluent and privileged class. They are becoming social entities, private clubs for young people, certification and proof of career seriousness, but hardly centers for excellence in undergraduate education in the classical sense. For all the tens of thousands of dollars invested in yearly tuition, there will be no guarantee, or indeed, even a general expectation, that students will encounter singular faculty or receive a superior liberal arts education—let alone that they will know much more about their exceptional civilization than what they could find on the Internet, at religious schools, or on CDs and DVDs.

Once academia lost the agreed-upon, universally held notion of what classical learning was and why it was important, a steady unraveling process removed not just the mission but the mystery—and indeed, the beauty—from the American university. How ironic that the struggling university, in its efforts to meet changing political, technological, and cultural tastes and fads, willingly forfeited the only commodity that made it irreplaceable and that it alone could do well. And how sad, since once the university broke apart the liberal arts, all the religious schools, self-help courses, and CDs couldn’t quite put them together again.

Thursday, January 15, 2009

On Reading Old Books, Especially the Bible

By John Mark Reynolds
Scriptorium Daily

It would help if more people with opinions about the Bible, good or bad, had read it. The few who have read it often haven’t a clue about how to read a book older than a J.K. Rowling best seller.

Secularists reading the Bible are too often like ethnocentric tourists visiting a foreign country. The American tourist who misses great feasts by sticking to McDonald’s, because the food of the nation he is visiting is different, is foolish. In the same way, the person who avoids or misunderstands the Bible is also missing out.

The Bible isn’t what they are used to reading and they read it badly. They don’t begin with sympathy to see what caused the Bible to become such a great book in the first place, but instead assume that if they don’t get it nobody of intelligence would either.

First, let us state the obvious: the Bible isn’t a simple book; it is at least sixty-six different books collected into a whole. Each discrete book is different, some very different. The skills needed to read Proverbs are very different from those needed to read Revelation. Reading the whole of the Bible well begins with reading each of the books well.

Of course, as in any intentional collection of books, there are many good questions to ask about what these books have in common. What are the themes in the collection itself?

Good hard work, the best kind of intellectual pleasure, is to be found in understanding Sacred Scriptures, but the kind of education most modern people get, with its tendency to specialize in a field, is not likely to be good preparation for reading any old books. Most educators have had the experience of anti-intellectualism from otherwise bright people who cannot learn from Hamlet because Shakespeare did not know about the space shuttle.

The first key to reading an old book is figuring out what it is trying to give us, not imposing on it some other agenda or expectation.

If we demand of a book something it is not giving, then we will miss what is in it as well as create a stupid misunderstanding in our own minds.

The Bible has to be read for what it is (or rather what each discrete book in it is), not what it is not. The Bible is not a modern novel, it is not modern history, and it is not a science book. That doesn’t mean it does not contain stories, history, and scientific truths, just that it is not any of those kinds of books.

Second, the Bible can be the most important book in human history without being to my taste. Liking a particular work and recognizing greatness in it are two different things. Titus Andronicus is not my favorite Shakespeare play. I came close to walking out of a particularly bloody rendition at the New Globe in London, but it is easy enough to see that for a “revenge play” it is first rate. If you have a taste for revenge plays, then Titus must be a revelation, but this is a taste that I lack.

Unfortunately for me, my line of work requires knowing a bit about Titus so my likes and dislikes are not relevant to whether I will watch the play. I have learned to appreciate what is good about it and even have caught hints of its greatness as a result.

My guess is that most students who turn to most great literature, but especially the Bible, don’t like it. It is not what they know best, which is film and the storytelling that goes with it. The characters, with a few exceptions like King David, are not well developed, and such readers have a hard time “getting into it.” Like the ship lists in Homer’s Iliad, the Bible contains long passages whose interest is nearly incomprehensible to most moderns. Literary genealogy is not good television, and those parts of Scripture, which often lead off a book like Luke, can kill any desire to move forward.

It is easy to despise what we don’t like, since it gives an excuse not to bother with hard work. Safe to say, then, that any rant about the Bible, such as a Richard Dawkins extended sneer, however much it may fail to do justice to the glories and importance of the Bible, will find an audience.

Sadly for such a person, the Bible isn’t going away. A billion people base some portion of their life on it, and their numbers are growing, not shrinking. Islam is connected to it, and that adds hundreds of millions more, but it isn’t just the religious who need to understand Sacred Scripture. Even if the Bible stopped selling today, and it is the best seller year after year, any well-educated person should read it carefully.

The Bible has motivated much of the great art, philosophy, and science of the past. If you don’t know the story of the Prodigal Son, then you are going to miss the point of a great many paintings. If you don’t read the words of Jesus in the New Testament, then you will miss a great deal of what Abraham Lincoln had to say.

If you don’t like Star Trek or Joss Whedon, then, like most educated people, you can safely ignore it and him. That is just not true of the Bible.

You may never learn to like the Bible as a reader, and perhaps few turn to it for pure amusement, but most readers I know learn to respect it.

Once I had a student frustrated with his failure to learn Spanish who complained that everyone should just learn to speak “real.” When pressed about the nature of this mysterious, though evidently ubiquitous tongue, the student admitted that “real” was English. He imagined Spanish as a kind of perverse alternative English that some people insisted on speaking despite the obvious fact that everybody he knew spoke English. Sadly, this kind of attitude is not rare. If our social group does something, then it must be “normal” or “reasonable.”

In the same way, it is possible to insulate oneself in a secular media or academic culture and become incapable of understanding how any reasonable person could cherish the Bible. Isn’t that positively medieval? Isn’t the Bible full of errors, wickedness, and stupidity?

Not surprisingly the Bible is not as foolish as some popular critics pretend. It has endured generations of such critics for very good reasons. Even if it is not the very Word of God, as I believe it to be, the Bible is a stunning work of art and literature properly understood. Academics, religious and non-religious, find it worth their entire careers to study a single short book in it. Most of the charges against it are old and there are reasonable responses to them.

If you are a Christian, of course, the same rules apply. The Bible is a book, and there are rules about how to read books well. Cheering for the Bible when one has not carefully read it is as parochial as the American “patriot” who is all for the Constitution, “whatever is in it.”

Christians think the Bible is sacred and holy. That does not mean that we read it in a weird way. It simply means that having read it carefully, men and women became persuaded by its message. Many people on reading the Bible have decided God had a hand in writing it. Things God does are different, set apart from the commonplace. That is why believers call the Bible holy.

If God exists, and religious believers know He does based on best reason and experience, then He is radically different and greater than humankind. Anything God does would partake of that quality. Any place God especially touched would be sacred, a God-thing. Christians believe that the Bible is a sacred book.

If God wrote the Bible, then He did so in cooperation with many different human authors. The Bible is not just holy and sacred, but is also human. If it is what it claims to be, then the collection of books called the Bible discloses God to humans as best can be done. It is a Babel-fish from God in the ear of mankind.

How do I know this is true? The Christian believes God will touch the mind of anyone who reads His book. The Spirit of God will try to communicate with our human spirit. The believer thinks there is a unique experience possible for the reader of the Bible.

One does not have to accept this idea, however, to grasp the essential message and importance of the Bible. The believer reads the Bible like any other book. Since each book in it had a human author, the text can be understood in a fully human way. The message can be examined and subjected to critical scrutiny using human reason.

This human experience of the Bible is open to anybody from atheists to Christians. If there is more to it than that, as I believe, then it begins simply and humanly with something open to all. The Bible’s message can, of course, be rejected by a thoughtful reader (I have known a few such souls), but at least it will have been rejected knowledgeably.

As someone who has benefited from ancient literature of all kinds, pagan, Christian, and secular, this much is plain to me:

Charity to texts and to different times broadens thinking and enriches the reader.

What It Means When Christians Say, “Jesus Is Lord!”

By John Mark Reynolds
Scriptorium Daily

Understanding people from a different point of view can be difficult.

The first time I read the Koran or an atheist book, there was a strong temptation simply to scoff, attack, and stick with my comfortable point of view. That would have been too bad since looking at the world from their perspective, which is what a good reader should do, helped me appreciate the strengths of their position, helped me find weaknesses in my ideas, and gave me the pleasure of considering perspectives alien to my own.

What is more fun than that?

A few things are more fun, such as Disneyland with my family at Christmas, but not many. A good thing isn’t just entertaining, of course; it is good for the soul. Looking deeply at another point of view can help avoid misunderstandings, make new friends, and increase tolerance. While tolerance isn’t always good—getting along with Mr. Stalin is not necessary to live the examined life—tolerance in daily American life is usually good.

Lately I have read a few people disturbed to discover that I confess that “Jesus is Lord.” What do Christians mean when they say “Jesus is Lord”? Are they forming militias to get ready for a theocracy governed by Biblical law?

This kind of worry is an example of the rule that knowing just a little bit about a group can lead to big misunderstandings, but learning a bit more can be helpful. Of course, you may still not like the results, but at least you will know what you are opposing.

Why do I say Jesus is Lord? Why not just think for myself? In fact, thinking for myself is what made me bow the knee to King Jesus.

Taking the Delphic demand to know myself seriously made me realize how small and unimportant I was! At the same time, it taught me that my desires were for something greater than self. My longings were bigger than I was! My heart was filled with love and that love (to borrow from the Symposium) was for something or someone.

What could best fill my heart’s longing? What or who was this known (because I loved it) unknown beloved?

Most people never solve this problem. They live hapless lives lurching from one thing to another. This is not the way to happiness! Others find an ideology that seems to work, but too often this ideology is impersonal and destructive. Christian ideas without a personal God could be that way!

My intellectual journey was long (and continues!) and this is not the place to describe it. Instead, this is my testimony to a few things I discovered and experienced on the way that caused me to say in wonder (and surprise!): Jesus is Lord!

First, my search led me to a person. Why? It is better to submit to a person than to an ideology. An ideology is static and can hurt people. Jesus is a person and so has compassion on our weakness. His rule can cope with changing times in a way that a mere “world view” cannot.

My search led me, as it has every traditional Christian, to God. God is so great and “other” that He would not be knowable for humans, but He revealed Himself to us in the person of Jesus Christ. God became one of us.

When I met Jesus Christ, in a deeply personal experience, certain things followed. The Jesus I met was smarter than I was, He was more powerful, and He knew the best way to live because He knows the past and the future. It was senseless not to listen to Him.

It goes deeper. Jesus made everything. He has the creator’s rights over His creatures. When we say Jesus is Lord, we are recognizing reality. We discovered this reality using our best experience and our best reason. We are trying to find out what He wishes of His creations. Following His design plan for our lives simply makes sense!

Of course, if God were an arbitrary or cruel creator, then rebellion against Him might be justified. Instead, like an ideal father (so often dreamed of but so rarely found!), His rule is just and elevates us. We are slaves to our passions, but in becoming His children, we are set free from this bondage.

Jesus’ goal for us is freedom in His service. He calls us His brothers and sisters and elevates us (as much as humans can be elevated) to become like He is. On earth, this power is not given to any man, because apparently no man could handle it. Absolute power appears to corrupt absolutely. There is, however, one great exception in all of history.

The Son of God emptied Himself of power and became fully human. This God-Man had both divine and human natures. He had great power, but He came to serve us and to fully grasp our humanity. He came to serve. When He took on Himself every bit of human suffering, then He was exalted.

Jesus is Lord by nature, but also by suffering. Through His service to humanity He earned the power that was already His by nature.

God became human so that humans could become like God. Of course, we can never become God; the divine essence is not for us. But we can become fit for His companionship. That is a great thing!

No living Christian is there yet. Death is the final passage on a long journey before we can see as we ought to see, hear as we ought to hear, or do as we ought to do. When Christians cry out “Jesus is Lord” it is an act of love . . . crying out for something better than they have yet experienced.

I have experienced enough of the liberty and joy of His rule to want more. The ignorance, folly, and prejudice that hold me back from the intellectual, emotional, and physical wholeness of spiritual union with Jesus are frustrating.

That is why we often seem so emphatic in our declaration! Nobody needs to fear this declaration, since if Jesus is Lord, then I am not. No consistent Christian could ever confuse his own will for God’s will. We know we are sinners. Only on a few issues where experience, reason, revelation, and history unite do we have the courage to speak boldly. The protection of human life is one of those areas.

We have creeds and ethical systems that sum up our knowledge, but they are short and modest compared to the long codes and rules of any modern state. Caesar is much more sure of himself in our age than any follower of the Christ!

Meanwhile, this side of Paradise, how do Christians know what Jesus wants them to do?

First, Christians pray. Prayer is our heart reaching out to God because His heart first reached out to ours and made such communication possible. It is communion with God. We try to hear what Jesus is saying to us.

If you don’t pray, there is no real way to describe what communion with God is like. As this site often demonstrates, I am no saint and often mistake my own ideas for His! Over the years, experience has taught me (some) modesty about knowing God’s will. I want it. I want to know it, but I know that it is a great and mighty thing not easily grasped.

Before heaven it is enough for me to know His will for my own small affairs and to do the best I can to work out my relationship with God humbly.

Usually, for me, prayer consists of being open to God’s voice and will. On an average day, this is all I experience. I continue to work out what He has shown me to that point in my life, though even in that I sense His help and power. For me, this is a deep sense of His presence and power.

Sometimes during prayer, there is an impression in my mind of an idea or an emotion that will rise up inside of me. Less often, I hear His voice in my mind encouraging or rebuking me or urging me to do something I should do. This voice is like no other in my experience. I cannot prove to a skeptic that it is God’s voice, but over time this voice has proven itself to me.

When I meditate daily, I first acknowledge reality. There is a physical and spiritual dimension to reality and God is in charge of it. I ask God to make those two things work together perfectly in the world. I ask for help with my physical and spiritual needs and then I ask for the needs of those around me. When I ask for God’s rule on the earth, I am not asking for power for self, but for justice for every human being.

Second, Christians read the Bible. My personal relationship with Jesus is the most important thing, but it could be that I am deceiving myself. The Bible shows me the real Jesus so that I don’t mistake the Jesus of my imagination for Him. There is nothing unique about that danger; it is a problem in all relationships between people.

People sometimes cannot get through to us because of the clutter of our own mental and emotional life. Nobody who has ever tried to be in love will fail to recognize that danger! We think we love the beloved, but we are really in love with our image of him or her!

The Bible is God’s written exposition of His nature.

The Bible doesn’t change, though over the years as I have studied it, I have learned to know it better. It might seem funny to some people to find wisdom about a relationship in an ancient book, but if the Bible is God’s revelation of Himself to humankind, then it makes sense.

Words allow me to step back from my experience of Him, which can be quite overwhelming, and think about it. Thinking about a relationship is a great way to avoid misunderstandings and to deepen the experience. Also, it is easy to confuse what I wish Jesus Christ was saying with what He is saying. The Bible is a static expression of what He is like so that I don’t fool myself and mess up our conversations.

I keep my wife Hope’s cards and letters to me over the years and reread them for just this reason. She is a wise woman and I want to love her and not just the woman I imagine she is. Reading her words carefully is one way of making sure I am hearing her, and not just what I wish to hear when she talks to me.

Finally, Christians try to follow Jesus in the context of history and what other (better!) Christians have said about Him. Those of us who love Jesus are a family . . . a big family, fractious and sometimes unattractive. Over time, however, we have learned a thing or two about God and about the Bible. We have made most mistakes that can be made and fooled ourselves most ways that people can be fooled. Reading the fathers and mothers in the faith is a great way to avoid making the same mistakes!

Does this always work?

No.

Christians still make mistakes and misunderstand what they are told by Jesus. After all, my wife and I are much more alike than I am like God and I still misunderstand her!

Christians beg your indulgence for our failures and pray (or should pray!) for humility.

Sometimes Christians are hypocrites and say “Jesus is Lord” but really believe “I am Lord.” We use religion instead of letting the loving Christ use us. We want love, but we sometimes settle for personal indulgence.

What do I do when I fail?

There is a real person who keeps calling me back to my basic relationship with Him. I did not want to be a Christian at one point in my life, but reality (my best reason and experience) would not budge. Jesus is alive and He loves me. He is there and He is not silent.

If you don’t know Him, then it is hard to describe the internal sense of His presence. It is constant and not so different from the sense of my wife’s presence in the room with me now.

When I say, “Jesus is Lord,” this is a bit of what I mean. Jesus Christ is real, present, speaking, and worthy of my obedience.

This Christmas when you stand in wonder listening to the Biblical language used by Handel in his glorious Hallelujah Chorus, you will briefly understand what Christians feel daily. King of Kings! Lord of Lords!

Catholicity, Race and Sunday Morning

By Matt Jenson
Scriptorium Daily

For the last 1600 years, Christians have confessed belief in the ‘one holy catholic and apostolic church’. The ‘catholic’ bit of that confession makes many Protestants fidgety, but it need not. Its etymology renders it simply ‘according to the whole’. Catholicity gets at the universal character of the church, and it does so in two ways.

First, it reminds us that the church is more than our local Sunday gatherings. It encompasses the communion of saints across time and space. That means that the eighty of us who worshipped at Fountain of Life Covenant Church in North Long Beach yesterday are not the church by ourselves. We are the church with those worshipping yesterday in Auckland and those worshipping last year, a hundred, a thousand years ago throughout the world. We are the church, only and ever with our brothers and sisters in the non-Western world, in the non-modern world. Catholicity, then, means diversity.

Second, confessing the church’s catholicity commands our allegiance to the church as well as our commitment to her unity. We have ‘one Lord, one faith, one baptism’ (Eph. 4.5), and the very acknowledgment of this implies a submission to the Lord and his church. Confessing the church’s catholicity and refusing to live a life of being reconciled together is what Karl Barth would call an ‘impossible possibility’ – possible in that it happens, but impossible in that its very happening is utterly foreign to the logic of its own existence. In other words, the blocking of reconciliation in our relationships violates the catholicity we (claim to) confess. Catholicity means unity.

***

Today America celebrates Martin Luther King, Jr.’s life and martyrdom, a life and death spent fighting segregation in the name of Jesus. (Incidentally, if you’re wondering about the application of ‘martyr’ to one whose death is not directly the result of a profession of Christ, see Craig Slane’s stirring discussion of Bonhoeffer as a modern martyr.) Here is how King describes segregation in his prison epistle:

Segregation, to use the terminology of the Jewish philosopher Martin Buber, substitutes an “I-it” relationship for an “I-thou” relationship and ends up relegating persons to the status of things. Hence segregation is not only politically, economically and sociologically unsound, it is morally wrong and sinful. Paul Tillich has said that sin is separation. Is not segregation an existential expression of man’s tragic separation, his awful estrangement, his terrible sinfulness? (‘Letter from a Birmingham Jail’)

It was King who called 11 a.m. on Sunday morning ‘the most segregated hour’ in America. Despite marvelous strides in church and society, our churches remain largely segregated.

It used to be that churches worked on a parish model. If you lived in a particular town, you went to its church. The model reflects the organization of medieval society as well as regionally and nationally established churches. One of its blessings is its inherent catholicity. The local church in the parish model reflects the make-up of the community. Even though not everyone from the community shows up, everyone is represented. The poor are there, as are the rich. Men and women, married and single, young and old, black and white are there. In this, the local church mirrors the universal church.

What makes this model attractive is its inherent desegregating orientation. The person sitting in the chair next to me (or even worse – the pew! I might have to touch them!) could be anyone – young people and poor people and married people, oh my! The church promises to be one of the few places where people who don’t belong together, belong together. It is a place where people whom we wouldn’t pick for friends become our brothers and sisters.

It’s for this reason that I’m not a big fan of catering to our felt needs to be with people just like us in church. There’s a place for singles groups, young parents groups, hip tattooed people groups, motorcycle enthusiasts groups. But only a peripheral and penultimate place, one that fills a gap and looks to groups of people who are nothing like one another. The heart of the church is difference. Our small groups should be places of difference, where we live the reconciled life with people to whom we aren’t necessarily reconciled.

Clubs are for people who want to hang out with people like them. Churches aren’t.

The thing is, as attractive as the parish model is, it doesn’t quite work here and now. American society is stratified. We have built whole communities around the idea that lookalikeness is next to godliness. Our churches set up shop, reach a community and, often enough, reflect the homogeneity of that community. But in so doing, they fail to reflect the ‘come one, come all’ character of the church universal. Structures of sin set society up in such a way that we have to be unnatural (moving to or going to church in a different neighborhood, say) to be natural (that is, truly kata holos, truly catholic).

None of this is to sideline King’s vision. Nor is it to prematurely pronounce its realization. It is, though, to remind us that the church’s catholic nature and its call to catholicity include a number of ‘natural’ divisions. The church’s catholicity is the pudding that proves the words of Paul: ‘In Christ there is no Jew or Greek, slave or free, male or female.’ And, in her catholicity, the church points to the future, to a day when every tongue and tribe and nation will be worshiping before the throne of God and of the Lamb.

Thanks be to God for his servant, our brother, Martin.

Socrates and Education: Part I

By John Mark Reynolds
Scriptorium Daily

Socrates wrote no books. Like Jesus, the only record he left was in the lives of those impressed by his life. Aristophanes, the great comic poet, made fun of him in The Clouds. The prize-winning playwright made his Athenian audience laugh at learning, but The Clouds derives its chief modern interest from its assault on Socrates. Xenophon was a conservative student who left humanity a collection of fond memories of his master. In it, Socrates resembles a glorified version of Xenophon: the wise and conventional popular sage. It is impossible from Xenophon’s account to understand why Athens would have bothered to kill Socrates.

Socrates survived his execution because Plato made him immortal. No teacher has ever had a better pupil. Plato, especially in his earliest dialogues, captured the essential Socrates. To read the Apology, the defense of Socrates at his trial, is to understand the man. Were these the very words used by Socrates? It does not matter. Plato uses his memories of the trial and makes Socrates live again, but even this worshipful student began to outgrow his pain. “Socrates” in the dialogues gains his own voice, and that voice matures. The voice of Plato in the Laws, where Socrates does not appear as a character, is not that of Socrates.

Though the Platonic dialogues are not historical documents, they are valuable for knowing the real Socrates. In fact, if the goal of reading them is to know Socrates, they are better than modern histories. Plato uses every tool in his unsurpassed intellectual arsenal to present a picture of the living Socrates. It is not Plato’s concern to get every historical detail right. He invents conversations between other people and Socrates that could never have happened in actuality. The famous drinking party or Symposium is a prime example of the liberties with history Plato was willing to take, but he does want to defend his master against the charges brought against him. To do so, he presents a plausible picture to the public. How could Plato hope to swing intelligent opinion behind Socrates if the Socrates he described did not resemble the man Athenians had known?

Socrates did not claim to have knowledge. In fact, Socrates frequently claimed that he did not know anything . . . except that he did not know anything. Of course, by this he meant that he did not know anything he had come to think worth knowing. Socrates had served as a loyal citizen of the city in the government and the army. He was a craftsman, a stonecutter, which was a highly skilled trade. Socrates saw, however, that these external “goods” were not enough, since they did not teach him how to live the good life. This integrity came at a price, since Socrates did not charge fees for his teaching or mentoring.

Socrates believed that the best way to learn was by examining the opinions of those who claimed to know. At the very least, an exposure of their ignorance would eliminate one way of going wrong. The man who debunks the inflated claims of a healing evangelist does not show that healing does not happen, but he does provide the useful service of showing that healing has not happened here and in this way. It is a negative path to a limited sort of knowledge. Socrates hoped in the process to come to a person who knows. It may also have occurred to Socrates that the communal quest for the truth like that pictured in the Symposium would bring answers. It might be that no one person could teach, but that many persons together could find the right questions.

Listening to Harry Chapin’s “Cat’s in the Cradle”

By Stephen H. Webb
First Things

Harry Chapin’s Cat’s in the Cradle is a maudlin song, meant to manipulate, and it hits me hard every time I hear it pop up, unpredictably and infrequently, on the radio. The song is a bit preachy, which is probably why it has been used in so many sermons, and why it has also been an easy target of parody and ridicule. It is about a father who is too busy to spend time with his son, who nonetheless admires him and wants to be just like him. When the son grows up, he is too busy with his own work and family to spend any time with his aging father. The son, in other words, has turned out just like his father, though in a way that the father regrets.

The irony of the song is a bit obvious, but it still packs a powerful punch, at least to guys my age. The song is partially autobiographical, because Chapin’s father, a musician, was on the road during much of his youth, but it is also about Chapin himself, since his wife is actually credited with writing the lyrics. She was worried that Chapin would not be around to help raise their newborn son, Josh. She wrote the song as a warning to her husband, and it has served as an effective wake-up call to fathers ever since. The song begins,

My child arrived just the other day
He came to the world in the usual way
But there were planes to catch and bills to pay
He learned to walk while I was away.

Chapin took his wife’s words and added a catchy melody that makes the sad lyrics almost bearable.

Cat’s in the Cradle was released in 1974, reached the top of the Billboard music charts, sold millions, and earned Chapin a Grammy nomination for Best Song. In other words, it struck a chord.

The 1970s were a divisive decade. What began in the 1960s as a fairly elite and limited rebellion against traditional moral standards became in the 1970s the social norm, with catastrophic results. College radicals failed in many of their political objectives—the backlash that began with Nixon’s election still has momentum today—but they succeeded in transforming American culture. Even teenagers raised in the Midwest with solid traditional values fought their parents over trips to the barber and what to listen to on the car radio. I know that from personal experience. Rebellion was in the air, and rock and roll provided the soundtrack.

My students, as we would have said back then, just can’t relate. Every year I teach a course for freshmen on Christianity and Popular Culture. I try to persuade my students, all of whom are usually Christian, that having faith should force them into a protracted and messy battle with popular culture, but I’ve seen that message make less and less sense with each passing year. One of the hardest things I have had to learn as a teacher is that my story is not their story. This is a good thing, of course, given how low the 1970s sank, but it is also something worth thinking about. My generation was raised in conflict. What do students today fight for, and what do they fight against?

When I tell my students that my father and I had constant battles over the length of my hair and the span of the bell bottoms on my favorite pair of purple plaid pants, they just laugh. They have been spared the ravages of a society trying to redefine itself through bad fashion. When I tell them that rock and roll was meant to tear families apart by promoting promiscuity and drugs, however, they think I am joking. They listen to the same music as their parents. Rock and roll has been made safe by Contemporary Christian Music. Rock is simply the way the world sounds.

Many of my students (I teach at an all-male college) tell me that their father is their best friend. That is great, but I wonder what they have lost when they do not experience fathers as a source of judgment and an obstacle to adolescent excess. As one of my students was talking about his father the other day, I noticed that he has a pierced tongue. When I asked if that bothered his father, he replied that his dad paid for it and thought it was “cool.” Other students have tattoos, although some of them said they had not told their mothers. Moms don’t want their sons to look like Aztec warriors, but dads are proud of the way their sons manage to keep on the cutting edge while still looking clean cut.

Other than piercings and tattoos, my students dress pretty conservatively, which means they wear the typical uniform of jeans and knit shirts with the logos of sports teams or running shoes. They find it hard to imagine a time when blue jeans were missiles in the war between the generations. I developed some of my intellectual cunning from the years I spent negotiating with my father over what was appropriate Sunday morning attire. First, I convinced him that wearing jeans to school was okay. Then I gradually earned permission to wear my school clothes to Wednesday night services. Finally, I fought for more casual wear for Sunday evenings, hoping that the slippery slope of permissiveness would free me from the drudgery of dressing up on Sunday mornings. It was a prolonged series of negotiations that should have made me a lawyer, not a theologian.

By the middle of the 1970s, most parents had given up defending the sensibilities of modesty and prudence, but they did not join teenagers in their flouting of convention. That trend would await another decade or two, when the baby boomers grew up and decided that they wanted to act and look just like their children.

I asked my students the other day if clothing should be used as a way of indicating the seriousness of a social situation. Most of them resisted this suggestion. Clothing, they said, doesn’t really mean anything. If that is right, I continued, why are so many of your shirts walking advertisements? After some interesting discussion, we decided that logos are needed to distinguish people by what they wear, but since so many shirts have logos on them, nobody risks standing out too much. But they continued to resist the idea that clothing could symbolize the sacred. I thought about this for a couple of days and came prepared with a question for the next class. Would they be comfortable with the nine Justices of the Supreme Court deliberating about constitutional issues in T-shirts and jeans? I finally found the place where they would draw the line.

Ordinarily, however, line drawing is not their forte. I envy their easygoing intimacy with their fathers, but I wonder about their lackadaisical approach to spirituality. Until the modern era, Christianity created the high culture of the West, but then it lost much of its creative potency as the arts, one by one, declared their independence from the Church. By the 1970s, popular culture was so alienated from Christianity that just going to public school during the week was a schizophrenic experience for those of us raised in conservative churches. The language of spiritual warfare came naturally to evangelicals because we were forced to decide, almost on a daily basis, whose side we were on. Most of us made a lot of bad decisions, but we learned that decisions had to be made, and that the worse decision was deciding not to decide.

Rock and roll, for example, was not a neutral musical genre. The music at my boyhood church was still stuck in the age of barbershop quartets. My parents thought Frank Sinatra was too racy, so they were hardly prepared for Elvis, Dylan, or the Beatles. Rock has to be played loud to be enjoyed, or at least that is what we told our parents, who were constantly yelling at us to turn it down. Rock deafens the senses, including the moral sense, with its incessant beat and garbled lyrics. My parents were not alone in taking me to revivals where I would be inspired to purge my record collection. Every album I burned in the backyard, putting who knows what pollutants into the air, cost me hours of agony and regret.

When the rumor that Bob Dylan had converted to Christianity reached Indianapolis in 1979, during my senior year in high school, I was overjoyed. That one of the greatest rock stars had become one of us was too good to be true. The three Christian albums he recorded, Slow Train Coming, Shot of Love, and Saved, spoke to my ears as much as my heart, because they brought together, for the first time, my faith and my musical taste. My students, however, have never heard of these albums, and they take it for granted that rock and Christianity can go hand in hand. They listen to Christian heavy metal bands that make me feel as out of it as my parents were thirty years ago. They do not think that the style of music has any effect on its message. Christianity is malleable to them. Faith can be poured into any cultural form without diluting or altering its essence.

Part of me, of course, envies their more peaceful adjustments to the modern world. Part of me too takes perverse pride in the battles my generation fought. Rock was divisive back then, but at least it wasn’t bland! I am actually very glad that my students do not have to fight my battles, because they were destructive, and little was gained. But those battles made me who I am, and I cannot shake the idea that struggle confers our most abiding sense of identity.

Students today do not resonate with the idea of being part of a generation. The 1960s generation took so much pride in their radical ideals that they practically invented the idea of a generation of young people set apart by the accident of chronology. Those of us who grew up in the 1970s had to wrestle with the self-important weight of the 1960s. We bore the brunt of our elder siblings’ experiments in altering reality. In my freshman class on Christianity and Popular Culture, I used to teach books about Generation X, which was defined by the emergence of new technologies, but all of that is old hat to today’s students. When I asked my freshmen to name their generation, the best we could come up with was the Security Generation, or the N-Security Generation, but even that did not seem to fit, since they do not feel all that threatened by overseas terrorists.

They are too secure, this generation. They are not forced to choose between popular culture and Christianity. And they are not rebelling against their parents.

That brings us back to Harry Chapin’s song. Fathers in the 1970s were too threatened by the rapid cultural changes to give their sons useful advice about how to negotiate adolescence, and anyway, it was an era when fathers went to work and left the child rearing to their wives. In the words of Cat’s in the Cradle:

My son turned ten just the other day
He said, “Thanks for the ball, dad, come on let’s play.
Can you teach me to throw?” I said, “Not today,
I got a lot to do.” He said, “That’s ok.”

It is emotionally hard for me to type those words on my computer screen. They are too real. We should thank God every day that most fathers (I hope!) have learned their lesson about neglecting their children.

Yet we should also not be too hard on the fathers of old. It is always difficult to be a father, and it must have been especially challenging in the 1970s. We teenagers did not make it easy for fathers to be fathers. We have to forgive our fathers as the first step in becoming fathers ourselves. And maybe in forgiving them, we won’t overreact to the point of giving our sons everything they want, rather than the essential things that they need.

I usually teach a section on marriage and sex in my Christianity and Popular Culture course, and I often tell my students that popular culture has robbed them of romance. Even the most modest young men today, I argue, have been so immersed in a culture of pornography that we have been inoculated against female beauty. We men (I always include myself in this discussion) have been taught to treat the female form in abstract terms and thus to use it as an ideal template by which to measure the women we meet. We no longer know the joys of sensual pleasure that springs not from visual attraction but from the yearning of true knowledge for another. We don’t know how to let a spiritual soulmate shape our desires by teaching us to see others the way God sees them.

Yet something happened the other day that made me think I have been too hard on my students. I often try to describe to them the way their ancestors, not all that long ago, would have chosen the mates of their children, a practice they associate today with some backward part of India. I try to help them see that the choice of a marriage partner should be based on wider considerations than romance alone. To focus this discussion, I ask them a hypothetical question. Suppose you were to be guided in your selection of a wife by one, and only one, of two factors, either your hormones or your parents. That is, would you let your parents pick your wife or would you rather trust your sensual desire, that spark of attraction that makes you light up with sexual longing?

In past years, my students were horrified at the thought of their parents choosing their marriage partners. This year was different. Many of them said they would trust their parents. In fact, more said they would trust their dads than their moms. They thought their moms would look for a good girl and disregard looks altogether, while they thought their dads would probably get the balance of moral and physical attributes just about right.

I found their conversation to be very moving, and wondered if my two young boys, when they reach the marrying age, will have that kind of trust in me. We lose something when we do not have to fight for what we believe, but what we have gained in father and son relationships is so much more important that I do not regret that my boys will never be able to relate to Cat’s in the Cradle. They will have other battles in their lives, but, I pray to God, they will not have to battle me.

How the city hurts your brain

By Jonah Lehrer
The Boston Globe

THE CITY HAS always been an engine of intellectual life, from the 18th-century coffeehouses of London, where citizens gathered to discuss chemistry and radical politics, to the Left Bank bars of modern Paris, where Pablo Picasso held forth on modern art. Without the metropolis, we might not have had the great art of Shakespeare or James Joyce; even Einstein was inspired by commuter trains.

And yet, city life isn't easy. The same London cafes that stimulated Ben Franklin also helped spread cholera; Picasso eventually bought an estate in quiet Provence. While the modern city might be a haven for playwrights, poets, and physicists, it's also a deeply unnatural and overwhelming place.

Now scientists have begun to examine how the city affects the brain, and the results are chastening. Just being in an urban environment, they have found, impairs our basic mental processes. After spending a few minutes on a crowded city street, the brain is less able to hold things in memory, and suffers from reduced self-control. While it's long been recognized that city life is exhausting -- that's why Picasso left Paris -- this new research suggests that cities actually dull our thinking, sometimes dramatically so.

"The mind is a limited machine,"says Marc Berman, a psychologist at the University of Michigan and lead author of a new study that measured the cognitive deficits caused by a short urban walk. "And we're beginning to understand the different ways that a city can exceed those limitations."

One of the main forces at work is a stark lack of nature, which is surprisingly beneficial for the brain. Studies have demonstrated, for instance, that hospital patients recover more quickly when they can see trees from their windows, and that women living in public housing are better able to focus when their apartment overlooks a grassy courtyard. Even these fleeting glimpses of nature improve brain performance, it seems, because they provide a mental break from the urban roil.

This research arrives just as humans cross an important milestone: For the first time in history, the majority of people reside in cities. For a species that evolved to live in small, primate tribes on the African savannah, such a migration marks a dramatic shift. Instead of inhabiting wide-open spaces, we're crowded into concrete jungles, surrounded by taxis, traffic, and millions of strangers. In recent years, it's become clear that such unnatural surroundings have important implications for our mental and physical health, and can powerfully alter how we think.

This research is also leading some scientists to dabble in urban design, as they look for ways to make the metropolis less damaging to the brain. The good news is that even slight alterations, such as planting more trees in the inner city or creating urban parks with a greater variety of plants, can significantly reduce the negative side effects of city life. The mind needs nature, and even a little bit can be a big help.

Consider everything your brain has to keep track of as you walk down a busy thoroughfare like Newbury Street. There are the crowded sidewalks full of distracted pedestrians who have to be avoided; the hazardous crosswalks that require the brain to monitor the flow of traffic. (The brain is a wary machine, always looking out for potential threats.) There's the confusing urban grid, which forces people to think continually about where they're going and how to get there.

The reason such seemingly trivial mental tasks leave us depleted is that they exploit one of the crucial weak spots of the brain. A city is so overstuffed with stimuli that we need to constantly redirect our attention so that we aren't distracted by irrelevant things, like a flashing neon sign or the cellphone conversation of a nearby passenger on the bus. This sort of controlled perception -- we are telling the mind what to pay attention to -- takes energy and effort. The mind is like a powerful supercomputer, but the act of paying attention consumes much of its processing power.

Natural settings, in contrast, don't require the same amount of cognitive effort. This idea is known as attention restoration theory, or ART, and it was first developed by Stephen Kaplan, a psychologist at the University of Michigan. While it's long been known that human attention is a scarce resource -- focusing in the morning makes it harder to focus in the afternoon -- Kaplan hypothesized that immersion in nature might have a restorative effect.

Imagine a walk around Walden Pond, in Concord. The woods surrounding the pond are filled with pitch pine and hickory trees. Chickadees and red-tailed hawks nest in the branches; squirrels and rabbits skirmish in the berry bushes. Natural settings are full of objects that automatically capture our attention, yet without triggering a negative emotional response -- unlike, say, a backfiring car. The mental machinery that directs attention can relax deeply, replenishing itself.

"It's not an accident that Central Park is in the middle of Manhattan," says Berman. "They needed to put a park there."

In a study published last month, Berman outfitted undergraduates at the University of Michigan with GPS receivers. Some of the students took a stroll in an arboretum, while others walked around the busy streets of downtown Ann Arbor.

The subjects were then run through a battery of psychological tests. People who had walked through the city were in a worse mood and scored significantly lower on a test of attention and working memory, which involved repeating a series of numbers backwards. In fact, just glancing at a photograph of urban scenes led to measurable impairments, at least when compared with pictures of nature.

"We see the picture of the busy street, and we automatically imagine what it's like to be there," says Berman. "And that's when your ability to pay attention starts to suffer."

This also helps explain why, according to several studies, children with attention-deficit disorder have fewer symptoms in natural settings. When surrounded by trees and animals, they are less likely to have behavioral problems and are better able to focus on a particular task.

Studies have found that even a relatively paltry patch of nature can confer benefits. In the late 1990s, Frances Kuo, director of the Landscape and Human Health Laboratory at the University of Illinois, began interviewing female residents in the Robert Taylor Homes, a massive housing project on the South Side of Chicago.

Kuo and her colleagues compared women randomly assigned to various apartments. Some had a view of nothing but concrete sprawl, the blacktop of parking lots and basketball courts. Others looked out on grassy courtyards filled with trees and flowerbeds. Kuo then measured the two groups on a variety of tasks, from basic tests of attention to surveys that looked at how the women were handling major life challenges. She found that living in an apartment with a view of greenery led to significant improvements in every category.

"We've constructed a world that's always drawing down from the same mental account," Kuo says. "And then we're surprised when [after spending time in the city] we can't focus at home."

But the density of city life doesn't just make it harder to focus: It also interferes with our self-control. In that stroll down Newbury, the brain is also assaulted with temptations -- caramel lattes, iPods, discounted cashmere sweaters, and high-heeled shoes. Resisting these temptations requires us to flex the prefrontal cortex, a nub of brain just behind the eyes. Unfortunately, this is the same brain area that's responsible for directed attention, which means that it's already been depleted from walking around the city. As a result, it's less able to exert self-control, which means we're more likely to splurge on the latte and those shoes we don't really need. While the human brain possesses incredible computational powers, it's surprisingly easy to short-circuit: all it takes is a hectic city street.

"I think cities reveal how fragile some of our 'higher' mental functions actually are," Kuo says. "We take these talents for granted, but they really need to be protected."

Related research has demonstrated that increased "cognitive load" -- like the mental demands of being in a city -- makes people more likely to choose chocolate cake instead of fruit salad, or indulge in an unhealthy snack. This is the one-two punch of city life: It subverts our ability to resist temptation even as it surrounds us with it, from fast-food outlets to fancy clothing stores. The end result is too many calories and too much credit card debt.

City life can also lead to loss of emotional control. Kuo and her colleagues found less domestic violence in the apartments with views of greenery. These data build on earlier work that demonstrated how aspects of the urban environment, such as crowding and unpredictable noise, can also lead to increased levels of aggression. A tired brain, run down by the stimuli of city life, is more likely to lose its temper.

Long before scientists warned about depleted prefrontal cortices, philosophers and landscape architects were warning about the effects of the undiluted city, and looking for ways to integrate nature into modern life. Ralph Waldo Emerson advised people to "adopt the pace of nature," while the landscape architect Frederick Law Olmsted sought to create vibrant urban parks, such as Central Park in New York and the Emerald Necklace in Boston, that allowed the masses to escape the maelstrom of urban life.

Although Olmsted took pains to design parks with a variety of habitats and botanical settings, most urban greenspaces are much less diverse. This is due in part to the "savannah hypothesis," which argues that people prefer wide-open landscapes that resemble the African savannah in which we evolved. Over time, this hypothesis has led to a proliferation of expansive civic lawns, punctuated by a few trees and playing fields.

However, these savannah-like parks are actually the least beneficial for the brain. In a recent paper, Richard Fuller, an ecologist at the University of Queensland, demonstrated that the psychological benefits of green space are closely linked to the diversity of its plant life. When a city park has a larger variety of trees, subjects who spend time in the park score higher on various measures of psychological well-being, at least when compared with less biodiverse parks.

"We worry a lot about the effects of urbanization on other species," Fuller says. "But we're also affected by it. That's why it's so important to invest in the spaces that provide us with some relief."

When a park is properly designed, it can improve the function of the brain within minutes. As the Berman study demonstrates, just looking at a natural scene can lead to higher scores on tests of attention and memory. While people have searched high and low for ways to improve cognitive performance, from doping themselves with Red Bull to redesigning the layout of offices, it appears that few of these treatments are as effective as simply taking a walk in a natural place.

Given the myriad mental problems that are exacerbated by city life, from an inability to pay attention to a lack of self-control, the question remains: Why do cities continue to grow? And why, even in the electronic age, do they endure as wellsprings of intellectual life?

Recent research by scientists at the Santa Fe Institute used a set of complex mathematical algorithms to demonstrate that the very same urban features that trigger lapses in attention and memory -- the crowded streets, the crushing density of people -- also correlate with measures of innovation, as strangers interact with one another in unpredictable ways. It is the "concentration of social interactions" that is largely responsible for urban creativity, according to the scientists. The density of 18th-century London may have triggered outbreaks of disease, but it also led to intellectual breakthroughs, just as the density of Cambridge -- one of the densest cities in America -- contributes to its success as a creative center. One corollary of this research is that less dense urban areas, like Phoenix, may, over time, generate less innovation.

The key, then, is to find ways to mitigate the psychological damage of the metropolis while still preserving its unique benefits. Kuo, for instance, describes herself as "not a nature person," but has learned to seek out more natural settings: The woods have become a kind of medicine. As a result, she's better able to cope with the stresses of city life, while still enjoying its many pleasures and benefits. Because there always comes a time, as Lou Reed once sang, when a person wants to say: "I'm sick of the trees/take me to the city."