Friday, August 28, 2009

Whatever Happened to the Work Ethic?

By Steven Malanga
City Journal

The financial bust reminds us that free markets require a constellation of moral virtues.
Values like thrift, which remained strong through the 1950s, eventually gave way to a culture of uncontrolled consumption and debt.

In Democracy in America, Alexis de Tocqueville worried that free, capitalist societies might develop so great a “taste for physical gratification” that citizens would be “carried away, and lose all self-restraint.” Avidly seeking personal gain, they could “lose sight of the close connection which exists between the private fortune of each of them and the prosperity of all” and ultimately undermine both democracy and prosperity.

The genius of America in the early nineteenth century, Tocqueville thought, was that it pursued “productive industry” without a descent into lethal materialism. Behind America’s balancing act, the pioneering French social thinker noted, lay a common set of civic virtues that celebrated not merely hard work but also thrift, integrity, self-reliance, and modesty—virtues that grew out of the pervasiveness of religion, which Tocqueville called “the first of [America’s] political institutions, . . . imparting morality” to American democracy and free markets. Some 75 years later, sociologist Max Weber dubbed the qualities that Tocqueville observed the “Protestant ethic” and considered them the cornerstone of successful capitalism. Like Tocqueville, Weber saw that ethic most fully realized in America, where it pervaded the society. Preached by luminaries like Benjamin Franklin, taught in public schools, embodied in popular novels, repeated in self-improvement books, and transmitted to immigrants, that ethic undergirded and promoted America’s economic success.

What would Tocqueville or Weber think of America today? In place of thrift, they would find a nation of debtors, staggering beneath loans obtained under false pretenses. In place of a steady, patient accumulation of wealth, they would find bankers and financiers with such a short-term perspective that they never pause to consider the consequences or risks of selling securities they don’t understand. In place of a country where all a man asks of government is “not to be disturbed in his toil,” as Tocqueville put it, they would find a nation of rent-seekers demanding government subsidies to purchase homes, start new ventures, or bail out old ones. They would find what Tocqueville described as the “fatal circle” of materialism—the cycle of acquisition and gratification that drives people back to ever more frenetic acquisition and that ultimately undermines prosperous democracies.

And they would understand why. After flourishing for three centuries in America, the Protestant ethic began to disintegrate, with key elements slowly disappearing from modern American society, vanishing from schools, from business, from popular culture, and leaving us with an economic system unmoored from the restraints of civic virtue. Not even Adam Smith—who was a moral philosopher, after all—imagined capitalism operating in such an ethical vacuum. Bailout plans, new regulatory schemes, and monetary policy moves won’t be enough to spur a robust, long-term revival of American economic opportunity without some renewal of what was once understood as the work ethic—not just hard work but also a set of accompanying virtues, whose crucial role in the development and sustaining of free markets too few now recall.

The American experiment that Tocqueville chronicled in the 1830s was more than just an effort to see if men could live without a monarch and govern themselves. A free society had to be one in which people could pursue economic opportunity with only minimal interference from the state. To do so without producing anarchy required a self-discipline that was, to Max Weber, the core of the capitalist ethic. “The impulse to acquisition, pursuit of gain, of money, of the greatest possible amount of money, has in itself nothing to do with capitalism,” Weber wrote in The Protestant Ethic and the Spirit of Capitalism. “Unlimited greed for gain is not in the least identical with capitalism, and still less its spirit.” Instead, the essence of capitalism is “a rational tempering” of the impulse to accumulate wealth so as to keep a business (and ultimately the whole economy) sustainable and self-renewing, Weber wrote. It is “the pursuit of profit, and forever renewed profit, by means of continuous, rational . . . enterprise.”

Weber famously argued that the Protestant Reformation—with John Calvin’s and Martin Luther’s emphasis on individual responsibility, hard work, thrift, providence, honesty, and deferred gratification at its center—shaped the spirit of capitalism and helped it succeed. Calvinism and the sects that grew out of it, especially Puritanism and John Wesley’s Methodism in England, were religions chiefly of the middle and working classes, and the virtues they promoted led to a new kind of affluence and upward mobility, based not on land (which was largely owned by the aristocracy) but on productive enterprises.

Nowhere did the fusing of capitalism and the virtues that made up the work ethic find a fuller expression than in America, where Puritan pioneers founded settlements animated by a Calvinist dedication to work. One result was a remarkable society in which, as Tocqueville would observe, all “honest callings are honorable” and in which “the notion of labor is therefore presented to the mind on every side as the necessary, natural, and honest condition of human existence.” Unlike in Europe, where aristocrats and gentry often scorned labor, in the United States, “a wealthy man thinks that he owes it to public opinion to devote his leisure to some kind of industrial or commercial pursuit, or to public business. He would think himself in bad repute if he employed his life solely in living.”

This thick and complex work ethic, so essential to the success of the early, struggling American settlements, became part of the country’s civic fabric. It found its most succinct expression in the writings of Benjamin Franklin, whose well-known maxims, now considered quaintly old-fashioned, recommended to citizens of the new country a worldview that promoted work and the pursuit of wealth. “Time is money” and “Never keep borrowed money an hour beyond the time you promised” and “Early to bed, early to rise, makes a man healthy, wealthy, and wise” voiced virtues that Franklin and his contemporaries viewed not chiefly as religious but as utilitarian. A reputation for honesty makes it easier to borrow money for new ventures, Franklin counseled. A man who displays self-discipline in his personal life inspires confidence in lenders and business partners. This constellation of virtues, which Weber described as “the ideal of the honest man of recognized credit,” is how one gets ahead in life.

Franklin’s best-selling writings had an enormous impact on America. His ideas, widely applauded, permeated popular culture and education. The leading grammar school textbooks of the nineteenth century, for example, by William Holmes McGuffey and his brother Alexander, inculcated children with the virtues of work and thrift. To dramatize the “Consequences of Idleness,” McGuffey’s Fourth Eclectic Reader told the story of poor George Jones, who frittered away his time in school and wasted the money his father had devoted to his education, winding up a poor wanderer. In fifth grade, students memorized Eliza Cook’s paean to labor, simply titled “Work,” which urged them to “Work, work, my boy, be not afraid; / Look labor boldly in the face.”

Schooled in such attitudes, America’s nineteenth-century youth embraced the rags-to-riches novels of Horatio Alger, Jr., who sold some 200 million books with plotlines that are a road map of the work ethic. In his first commercial success, Ragged Dick, Dick Hunter, 14 and homeless, impresses patrons with his honesty and industriousness and slowly rises in the world. When he teeters on the verge of losing everything because a thief pilfers his savings-account passbook, bank officials recognize him from his regular visits to make deposits, and they have the thief arrested. In a later novel, Bound to Rise, poor Henry Walton wins a biography of Ben Franklin for acing exams and, inspired by his life story, goes off to earn a fortune.

The work ethic even shaped American play. The most popular game of its time, “The Checkered Game of Life,” produced by Milton Bradley in the mid-nineteenth century and sold door-to-door, challenged players to travel through life and earn points for successfully completing school, getting married, and working hard, while avoiding pitfalls like gambling and idleness. In his patent application for the game, Bradley observed that it was intended to “impress upon the minds of youth the great moral principles of virtue and vice.” Its success spawned a whole genre. “Many games with similar moral thrusts followed,” observed Jennifer Jensen of the New-York Historical Society in an article called “Teaching Success Through Play.” These games “emphasized secular virtues such as thrift, neatness, and kindness.”

The work ethic also distinguished the northern colonies from the southern, and later helped the North win the Civil War. Many southern settlers came in search not of religious freedom but only of economic opportunity. Instead of founding villages or towns with a common civic life, southern settlers developed isolated, widely separated plantations. They cultivated a few staple crops using slave labor, instead of developing a diversified economy. They created a society where a relatively few plantation owners acted like an aristocracy. Rather than viewing all honest work as honorable, they developed what historian C. Vann Woodward calls the “Southern ethic,” which saw some work as fit only for slaves. In the end, these attitudes proved the South’s greatest vulnerability, as the North, shaped by the work ethic, brought to bear its industrial might against the narrow economy of the South, built precariously on tobacco and slave labor and a Cavalier rather than a Puritan ethic.

After the Civil War, this secularized version of the Protestant ethic served as a lodestar for millions of poor immigrants, many from countries with little experience of free markets and democracy. Their assimilation into a culture that they recognized not as Protestant but as American reinvigorated the country, helping to set late-nineteenth- and early-twentieth-century America on a distinctly different path from much of Europe.

Many of these immigrants, ironically, absorbed their Franklinesque code from the American Catholic Church. Key members of the church hierarchy—notably, New York’s brilliant, Irish-born first archbishop, John Hughes, who rose from poverty—lived by the ethic and understood its role in the country’s success. Hughes set as his task the moral and economic uplift of Gotham’s millions of poor Irish immigrants. He founded a network of some 100 Catholic schools that taught Irish children not just the three Rs but also a “faith-based code of personal conduct,” as William J. Stern wrote in City Journal (“How Dagger John Saved New York’s Irish,” Spring 1997). Hughes’s church was, as he put it, “a church of discipline.” He fostered residential schools that taught vocational skills and conduct to thousands of orphaned or abandoned Irish street children and sent them off successfully into American society. Catholic schools around the country copied his work, and many of them continue today to succeed even with at-risk kids.

By the end of the nineteenth century, the Irish had largely shaken off poverty and joined the American mainstream. Waves of Southern and Eastern European Catholics followed them, as well as Eastern European Jews—some 20 million immigrants between 1890 and 1925—who quickly replicated the success of the Irish in a country whose institutions emphasized and rewarded hard work, thrift, and self-improvement. Within a single generation, one study shows, the average early-twentieth-century immigrant family had achieved income and educational parity with American-born families, so that the children of these immigrants were just as likely to be accountants, engineers, or lawyers as the children of families rooted here for generations.

The breakup of this 300-year-old consensus on the work ethic began with the cultural protests of the 1960s, which questioned and discarded many traditional American virtues. The roots of this breakup lay in what Daniel Bell described in The Cultural Contradictions of Capitalism as the rejection of traditional bourgeois qualities by late-nineteenth-century European artists and intellectuals who sought “to substitute for religion or morality an aesthetic justification of life.” By the 1960s, that modernist tendency had evolved into a credo of self-fulfillment in which “nothing is forbidden, all is to be explored,” Bell wrote. Out went the Protestant ethic’s prudence, thrift, temperance, self-discipline, and deferral of gratification.

Weakened along with all these virtues that made up the American work ethic was Americans’ belief in the value of work itself. Along with “turning on” and “tuning in,” the sixties protesters also “dropped out.” As the editor of the 1973 American Work Ethic noted, “affluence, hedonism and radicalism” were turning many Americans away from work and the pursuit of career advancement, resulting in a sharp slowdown in U.S. productivity from 1965 through 1970. So great a transformation of values was occurring that, as George Bloom of MIT’s Sloan School of Management wrote in a 1971 essay on America’s declining work ethic, “It is unfortunate but true that ‘progress’ is becoming a bad word in virtually all sectors of society.”

Attitudes toward businessmen changed, too. While film and television had formerly offered a balanced portrait of work and employers, notes film critic Michael Medved in Hollywood vs. America, from the mid-1960s onward, movies and TV portrayed business executives almost exclusively as villains or buffoons. The era’s iconic film, the 1967 Oscar winner The Graduate, is a prime example in its tale of a recent college grad adrift and questioning adult society’s strive-and-succeed ethic. No character appears more loathsome than a family friend who counsels the graduate, “I just want to say one word to you—just one word: plastics. There’s a great future in plastics.” Such portrayals both reflected and strengthened the baby-boom generation’s attitudes. One 1969 Fortune poll, for instance, found that 92 percent of college students thought business executives were too profit-minded.

In this era, being virtuous became something separate from work. When the Milton Bradley Company reintroduced “The Checkered Game of Life” in a modern version called “The Game of Life” in the mid-1960s, it abandoned the notion of rewarding traditional bourgeois virtues like completing an education or marrying. What was left of the game was simply the pursuit of cash, until Milton Bradley, criticized for this version, redesigned the game to include rewards for doing good. But its efforts produced mere political correctness: in the new version, recycling trash and contributing to save an endangered species were virtuous actions that won a player points. Such gestures, along with tolerance and sensitivity, expanded like a gas to fill the vacuum where the Protestant ethic used to be.

The cultural upheavals of the era spurred deep changes in institutions that traditionally transmitted the work ethic—especially the schools. University education departments began to tell future grammar school teachers that they should replace the traditional teacher-centered curriculum, aimed at producing educated citizens who embraced a common American ethic, with a new, child-centered approach that treats every pupil’s “personal development” as different and special. During the 1960s, when intellectuals and college students dismissed traditional American values as oppressive barriers to fulfillment, grammar schools generally jettisoned the traditional curriculum. “Education professors eagerly joined New Left professors to promote the idea that any top-down imposition of any curriculum would be a right-wing plot designed to perpetuate the dominant white, male, bourgeois power structure,” writes education reformer E. D. Hirsch, Jr., in his forthcoming The Making of Americans: Democracy and Our Schools.

The bourgeois values, however, had helped to sustain Weber’s “rational tempering” of the impulse to accumulate wealth: they helped put the rationality in “rational self-interest,” or, as Tocqueville put it, “self-interest rightly understood.” When the schools and the wider society demoted them, the effects were predictable. In schools, for instance, the new “every child is special” curriculum prompted a sharp uptick in students’ self-absorption, according to psychologists Jean M. Twenge and W. Keith Campbell in The Narcissism Epidemic: Living in the Age of Entitlement. What resulted was a series of increasingly self-centered generations of young people displaying progressively more narcissistic personality traits, including a growing obsession with “material wealth and physical appearance,” the authors observe. Thus did the sixties generation spawn the Me Generation of the seventies. By the mid-1980s, a poll of teens found that more than nine in ten listed shopping as their favorite pastime.

The economic shocks that followed the tumultuous late 1960s, especially the devastating inflation of the 1970s, reinforced an emerging materialism. Thanks to the Johnson administration’s illusion that the country could finance massive social-welfare programs and a war without consequences, the U.S. by 1974 staggered under double-digit annual inflation gains, compared with an average annual gain of about 1 percent in the early 1960s. The inflation hit hardest those who had embraced the work ethic, destroying lifetimes of savings in unprecedented price spikes and sending the message that “saving and shunning debt was for saps,” Fortune observed. “The lesson seemed to be, buy, buy, buy, before the money visibly crumbling to dust in your hand vanishes completely.”

Once Fed chairman Paul Volcker’s tight-money policy tamed inflation in the early 1980s, America began to pick itself up. But it was a different country, one that had lost to some degree the “rational tempering” of the “pursuit of gain” that Max Weber had seen as the key to “forever renewed profit.” The corporate restructurings of the 1980s, prompted by a new generation of risk-taking entrepreneurs and takeover artists who used aggressive financial instruments with provocative names like “junk bonds” to buy and then make over big companies that failed to remake themselves, reordered corporate America, shaking it out of its 1970s complacency. But the plant closings, downsizings, and restructurings of the 1980s also stoked anxiety among workers, as the old ideal of lifetime employment at one paternalistic company gave way to a job-hopping career in a constantly changing business landscape. While the results were often salutary—innovation for companies and income gains for the most talented players—the “get it while you can” mentality that developed among some workers and investors found its ultimate expression in the “day traders” of the technology stock boom, speculators with a “right now” time horizon rather than long-term investors. When takeover-era titans Michael Milken and Ivan Boesky pleaded guilty to insider-trading charges, their confessions strengthened a growing sense that a new ethic had superseded the old standard of playing by the rules. The 1980s version of the Horatio Alger tales was not an inspiring story of uplift but the popular movie Wall Street, with Gordon Gekko’s infamous “greed is good” speech.

With government policy reinforcing the “get it now” mentality, a new era of consumption based on credit blossomed in the resurgent 1980s, and Americans turned from savers to debtors. Ostentatious displays of wealth grew more common. From 1982, the year that Volcker finally tamed inflation, to 1986, luxury-car sales doubled in America. The average age of a purchaser of a fur coat—that ultimate status symbol—declined from 50 to just 26 in the mid-1980s. To fuel such purchases, inflation-adjusted total U.S. consumer-credit debt rose nearly threefold, to $2.56 trillion, from 1980 to 2008, while the nation’s savings rate shrank from an average of about 12 percent of personal income annually in the early 1980s to less than 1 percent by 2005. Some middle-class Americans came to resemble not the thrifty bourgeoisie of the early Industrial Revolution but the landed gentry of that era who drained their real estate for cash to fund lavish living. One stark illustration of the change: by 2006, those who refinanced their mortgages were taking out in cash nearly a quarter of the equity they’d accumulated—compared with just 5 percent a decade earlier. A big reason Americans’ debt was growing, in other words, was that they were borrowing against their rapidly appreciating assets as fast as they grew.

The denouement of this transformation was the 2008 meltdown of world financial markets. America has certainly had its con artists, robber barons, and speculators before, but what distinguished the latest panic was that millions of mortgages belonging to ordinary Americans triggered it—mortgages that were foolhardy at best and fraudulent at worst. A typical case is Bradley Collin, a 27-year-old Minnesota housepainter with three kids. He decided to try to make a killing in real estate because, as he told the Minneapolis Star-Tribune last year, “I didn’t want to paint the rest of my life.” With the help of shady mortgage brokers, he and his wife simultaneously purchased four homes in new developments, intending to flip them for a profit. To buy the houses, the Collins had to make four separate mortgage applications, lie on each about their intentions, and hide each sale from the other three lenders, because no bank would have given them money to purchase four homes. When the local housing market stopped rising, the couple defaulted on their loans, abandoning the houses to the banks and helping further drive down their neighbors’ real-estate values.

The Collins were hardly alone. According to the FBI, reports of mortgage fraud soared tenfold nationwide from 2001 to 2007. No one knows precisely how deep the problem ran, but some mortgage servicers, examining portfolios of subprime mortgages that went bad in 2007, found that up to 70 percent of them had involved some kind of misrepresentation. Loans that required no verification of the borrower’s income infamously became known as “liar loans.” One mortgage lender who compared 100 of these loans with IRS tax filings found that in 60 percent of cases, the applicants exaggerated their incomes (or underreported them to the IRS). Occupancy fraud, in which investors intent on buying new homes and then quickly flipping them for a profit lied about their intentions, accounted for about 20 percent of all fraudulent mortgage applications. Since the mortgage meltdown began in 2006, builders in some regions have found that as many as a quarter of the buyers of the homes that they sold in new developments lied about their purposes.

This multitude of scams required the complicity of businesses that ultimately destroyed themselves and shattered an entire industry. The fall of America’s sixth-largest bank, Washington Mutual, which built an empire based on reckless lending, exemplifies these failings. As the housing boom heated up, WaMu raced after a piece of the action at all costs. Its supervisors chastised loan officers who tried to verify suspicious claims on mortgage applications. Executives gave loan officers flyers that said, “A thin file is a good file,” according to testimony by former employees. The lender set up phone banks, like penny-stock boiler-room operations, to sell home-equity loans. Ultimately, swamped by over $11 billion in bad loans, WaMu was seized by the federal government and sold to JPMorgan Chase, an object lesson in what Weber called the pursuit of “irrationally speculative opportunities,” which undermines capitalism rather than nourishing it.

Needless to say, this is not what Adam Smith had in mind. Smith laid the groundwork for the economic theories of The Wealth of Nations in his preceding book, The Theory of Moral Sentiments, which traces the evolution of ethics from man’s nature as a social being who feels shame if he does something that he believes a neutral observer would consider improper. Smith proposed that as societies evolve, they form institutions—courts of law, for instance—that reflect and codify these ethical perceptions of individuals, and that these institutions provide the essential backbone of any sophisticated commercial system.

Modern experiments in neuroscience have tended to confirm Smith’s notion that our virtues derive from our empathy for others, though with an important qualification: the ethics of individuals need reinforcement from social institutions and can be undermined by the wrong societal message, as neuroeconomist Paul Zak writes in Moral Markets: The Critical Role of Values in the Economy. When people find themselves bombarded by the wrong message—like the Washington Mutual employees whose supervisors constantly pushed them into riskier and riskier actions—some will resign in disgust, but others will gradually suppress what scientists call the brain’s “other-regarding” behavior and the shame that goes along with it and violate their own ethics.

This mechanism of deception pervaded the recent housing bubble; cheating to get mortgages became so commonplace that cheaters barely seemed to perceive that they were committing fraud. A vivid case in point is New York Times economics reporter Edmund Andrews’s remarkable confessional tale, “My Personal Credit Crisis.” Andrews relates how he obtained a mortgage under dubious circumstances, aided by a broker who encouraged him to lie on his credit applications and a lender that, when its underwriters caught his intended deception, nonetheless allowed him to apply for another, riskier kind of mortgage. Granted a loan so oppressive that he will eventually default, Andrews admits to feeling that he had “done something bad” but also feeling “kind of cool” for making such a big score. Even today, society continues to reinforce Andrews’s lack of shame: he received a contract to detail his credit woes in a provocatively titled book, Busted: Life Inside the Great Mortgage Meltdown, which was published this spring.

In the wake of the market crash, our national discussion about how to fix capitalism seems limited to those who believe that more government will fix the problem and those who think that free markets will fix themselves. Few have asked whether we can recapture the civic virtues that nourished our commerce for 300 years.

We’re not likely to find many churches preaching those virtues today. Though America is more religious than most industrialized countries, today’s pulpits hardly resound with the bourgeois work ethic. While John Wesley once observed that religion produces “industry and frugality,” and the American Congregationalist preacher Henry Ward Beecher declared that the way to avoid poverty was through “provident care, and foresight, and industry and frugality,” today the National Council of Churches, to which these denominations belong, advocates for a left-wing “social gospel” of redistributing wealth (see “The Religious Left, Reborn,” Autumn 2007). And though the Catholic Church once strove to assimilate generations of poor immigrants into American economic life, today its major social-welfare organization, Catholic Charities, has become an arm of the redistributionist welfare state (see “How Catholic Charities Lost Its Soul,” Winter 2000). Even our evangelical churches, whose theology most resembles that of the great Protestant reformers, have focused their energies primarily on social issues, such as fighting abortion or gay marriage, or even inveighing against welfare reform that encourages single mothers to return to work.

True, a few groups, including the Consumer Federation of America and the Institute for American Values, have launched a national campaign, modeled on World War II efforts to encourage savings, to reintroduce thrift into American life. But trying to teach adults about thrift or the patient accumulation of wealth through hard work, when they didn’t learn these things at home or in school, will be an uphill battle.

Could the schools do what they once did—create educated citizens inculcated with the ethical foundations of capitalism? That would require rededicating the schools to “making Americans,” as Hirsch proposes in his forthcoming book. Promisingly, a few public and private schools around the country have replaced the child-centered curriculum with one focused on learning about our culture and its institutions. Hirsch’s “Core Knowledge” curriculum, for instance, introduces kindergartners to the Pilgrims, Independence Day, and George Washington; first-graders to Ben Franklin and the concept of law in society; and second-graders to the Constitution as the foundation of our democracy. Other school reformers, according to David Whitman in Sweating the Small Stuff, have raised the achievement of low-income kids by using a “no excuses” model that teaches bourgeois “virtues like diligence, politeness, cleanliness, and thrift.” But these examples amount only to a tiny handful, swimming against the educational mainstream.

Late in life, Adam Smith noted that government institutions can never tame and regulate a society whose citizens are not schooled in a common set of virtues. “What institution of government could tend so much to promote the happiness of mankind as the general prevalence of wisdom and virtue?” he wrote. “All government is but an imperfect remedy for the deficiency of these.”

America in the twenty-first century is learning that lesson.

Tuesday, August 25, 2009

What the Trinity is For

By Fred Sanders
Scriptorium Daily

What is the Trinity for?

I hear this question all the time, in churches and classrooms. It comes from different kinds of people: From well-established Christians who have the basics of a life of discipleship figured out, are spiritually healthy, and are getting along just fine without thinking often of the Trinity. From apologists who wish there were one fewer “hard doctrine” to get past in trying to commend Christ to people. From sincere seekers who understand and accept most of what they know about Jesus, but can hear nothing but pseudo-algebra in the doctrine of the Trinity. From students who have cracked their poor brainpans on their first real attempt at understanding the doctrine, and aren’t eager for any further attempts. And above all, from the pragmatically minded evangelicals who just can’t see how understanding the Trinity would change anything about what they are already doing.

Why complicate a nice, simple religion with this Trinity stuff? What is the Trinity for?

But the first and clearest answer has to be that the Trinity isn’t ultimately for anything, any more than God is for the purpose of anything. Just as you wouldn’t ask what purpose God serves or what function he fulfills, it makes no sense to ask what the point of the Trinity is, or what purpose the Trinity serves. The Trinity isn’t for anything beyond itself, because the Trinity is God. God is God in this way: God’s way of being God is to be Father, Son, and Holy Spirit simultaneously from all eternity, perfectly complete in a triune fellowship of love.

The good news of the gospel is that God has opened up the dynamics of his triune life and given us a share in that fellowship. But all of that good news only makes sense against the background of something even better than the good news: the goodness that is the perfection of God himself. The doctrine of the Trinity is first and foremost a teaching about who God is, and God the Trinity would have been God the Trinity whether he had revealed himself to us or not, whether he had redeemed us or not, whether he had created us or not.

Obviously, these “whether or not” statements are counterfactual: they are about situations that are not the case. God has in fact made himself known, has redeemed his people, and, to say the most obvious thing, has created us. That being the case, what is the good of asking hypothetical questions about how things would stand if God had not done the things he has done? Indeed, isn’t it even ungrateful to forget, or to pretend to forget, God’s mighty acts? No, in this case, far from being ungrateful, it is an opportunity to become more grateful.

Hypothetical questions are useful tools for understanding how things really are, by imagining how they might have been otherwise. They can be used as mental cures for sick patterns of thought. If you are tempted to think that God’s triunity is something he puts on in order to reach some further goal, or to interact with the world, you can cure yourself of that tendency by thinking away the world and asking yourself: If there had been no world, would God have been Father, Son, and Spirit? If you are tempted to think of Christmas as the time when the Son of God first began to exist, you can cure yourself by asking: If the Son of God had not taken on human nature, would he still have been the Son of God?

The answer to these hypothetical questions is yes: God would have been Trinity with no world, and the Son of God did in fact pre-exist his incarnation. God minus the world is still God the Holy Trinity. In the words of the hymn by Frederick W. Faber:

When Heaven and Earth were yet unmade
When time was yet unknown,
Thou, in Thy bliss and majesty,
Didst live, and love, alone.

The emphasis in these excellent lines is on God’s self-sufficient “bliss and majesty.” Faber would be quick to point out that the final word, “alone,” is very different from “lonely.” Otherwise God could not “love, alone.” Indeed, God is the only one who can love alone, for Trinitarian reasons: God the Father loves God the Son in the love of God the Holy Spirit.

Is it too bold of us to declare what God was like, or what he was doing, before creation? It requires boldness, to be sure, but only the boldness of the New Testament. One of the characteristic differences between the Old Testament and the New Testament is that the New Testament is bold to make such statements. Look, for instance, at the way the New Testament takes a step further back with its declaration of salvation: Where God declares in the old covenant, “I have chosen you,” the new covenant announces that “he chose us in Christ before the foundation of the world.” The prophets do not make declarations about what happened “before the foundation of the world,” but the apostles do.

The main reason for this is that the coming of Christ forced the apostles’ thinking farther down, into the ultimate foundation of God’s ways and works. When Christ brought salvation, the apostles had to decide whether the life of Jesus Christ was one more event in the series of God’s actions, or whether, in meeting the Son of God, they had come into contact with something that was absolutely primal about God himself. Christ did not leave them the option of considering him just another prophet or servant of God. They even had to decide where to start in telling the story of Jesus: With his birth? With prophecies of his coming (as in Mark)? With a genealogy connecting him to Abraham (as in Matthew) or all the way back to Adam (as in Luke)? Ultimately, they knew that the best way to acknowledge Jesus as the eternal Son of God was to go back further than the foundation of the world and confess that he had been there prior even to that. That backward step beyond the foundation of the world is a step into the eternal nature of God. So the Old Testament starts with the foundation of the world: “In the beginning, God created the heavens and the earth.” But the story of Jesus starts before that, because the Son of God was already present by the time of the beginning: “In the beginning was the Word, and the Word was with God, and the Word was God.”

As a result, we are speaking from solid New Testament ground when we say that God was the Trinity from all eternity, or that God is Father, Son, and Spirit without reference to the creation of the world. Scottish bishop Robert Leighton (1611-1684), when his commentary on 1 Peter brought him to the phrase “before the foundation of the world” (1 Peter 1:20), elaborated on this fact:

Before there was time, or place, or any creature, GOD, the blessed Trinity was in Himself, and as the Prophet speaks, inhabiting Eternity, completely happy in Himself: but intending to manifest and communicate His goodness, He gave being to the world, and to time with it; made all to set forth His goodness, and the most excellent of His creatures to contemplate and enjoy it.

Imagining God without the world is one way to highlight the freedom of God in creating. Thinking away the world makes it obvious that God didn’t have to make a world. Creation was not required, not mandatory, not exacted from God, neither by any necessity imposed from outside, nor any deficit lurking within the life of God. The Bible does not directly answer the question “Why did God create anything at all?” but it does let us know what some of the most glaringly wrong answers to that question would be. It would be wrong to say that God created because he was lonely, unfulfilled, or bored. God is free from that kind of dependence.

Before the foundation of the world, God was Father, Son, and Holy Spirit. That’s just what it is to be God, according to his revelation.

Don’t Turn Citizens Into Subjects

By John Mark Reynolds
Scriptorium Daily

Government already has the biggest guns, and so we should not give it control over the scalpels as well. The reason is simple: anything big is likely to hurt us and needs something just as big to check it.

Government power in the United States is great, but it is a blessing of living in our nation that it is not as great as in some other lands. We are in less danger of tyranny here than in many other parts of the world. Tyrants do not always reduce their citizens to subjects by cruelty; sometimes they make them dependent on their “kindness.”

My fear is that if we look to government for even more services, we will reach a tipping point of dependence on the state. Some good will be done for us, but at too great a cost to the character of the citizens. Once we become dependent, it becomes difficult to regain liberty, even if the masters change in character. After all, history never stops moving, and pharaohs can arise that never knew Joseph.

One does not have to have “faith” or put much “hope” in the private sector in order to worry first about government. Big business can be just as cruel as government, but it does not have authority over the justice system. Whatever harm Big Business can do, it cannot declare war or sit in judgment over you. Wall Street can go to jail, but it cannot send you there.

This does not mean that Wall Street is to be trusted.

Most traditional American conservatives (as opposed to extreme libertarians) have supported some government regulation of big business as a check on the abuse and harm that wicked men in the private sector do. Left alone, companies were happy to sell us tainted food until they met government regulation. Pure food laws were a sane, limited response, and they enjoy bipartisan support.

We did not have government take over the production of food.

Medicine, insurance, and the legal profession are certainly three big areas that cry out for better laws to protect citizens against exploitation. The cost of medicine in the United States is too high and access to quality care is too limited partly because of bad practices that should be stopped.

The more government gets involved in medicine the more the money from big medical companies flows to corrupt government. Before government even considers writing more laws it should drive those involved in graft and corruption from its midst. Many of us no longer trust government when we see big bailouts for friends and donors of politicians.

Bet on it: if government takes more responsibility over health care, the same cruel executives that make insurance companies so unpopular will get jobs in the public sector. This time they will have the power of the state behind their barbarisms.

It is bad enough fighting a big private company with some small hope of help from your elected official, but it will be hopeless to fight when all the power is absolutely concentrated in government.

This does not mean doing nothing.

The free market eventually catches up with bad actors, but in medical care too many people are quickly harmed to allow this natural process time to act. Regulations themselves are dangerous, so they should address clear abuses (denial of coverage based on gender would be an easy example).

The United States has tried to balance its approach to medicine by pairing a mostly free market system with a strong social safety net. By the end of his political career, this is the system Ronald Reagan valued and strengthened. There are good reasons to fix some problems in the system, but it has produced innovation and quality medical care for the vast majority of Americans.

Failures abound, but when tinkering with the system we should first practice the Hippocratic wisdom of “doing no harm.” Congressman Paul Ryan, a conservative and bright rising star of good government, has proposed several ideas that would help without “doing harm.”

No solution will be perfect. Wicked men in the public and private sector will continue to misuse their power. Money will corrupt both sectors, but by keeping power over life and death somewhat dispersed, most of us have the best chance to be allowed our God-given right to “life, liberty, and the pursuit of happiness.”

How to Re-Frame the Conversation About Chastity

Christianity Today Book Review

Charity, community, and self-control.

Lies. Jessica Valenti talks about a lot of them in The Purity Myth: lies about statistics, lies about women, lies about sex. In fact, she talks so much about deception that the main truth she wants to advance gets pushed to the opening and closing pages of the book. That truth is that women are "more than the sum of our sexual parts," a message she desperately wants her readers to take to heart, rejecting the far more common claim that "a woman's worth lies in her ability—or her refusal—to be sexual."

How exactly does one instill a healthier sense of worth? Valenti thinks it's by taking away the shame in sex and "arm[ing] young women with the knowledge that sex should be a collaborative, pleasurable experience that has no bearing on whether they are ethical people." Except, of course, that "collaborative" and "pleasurable" are obviously deemed good, as opposed to competitive and unpleasant, selfish and painful—modes of experience that would presumably be unethical. And indeed, she's against both violent and unwanted sex, which is for her defined not just by "no" but by the absence of "yes." Sex isn't really amoral, then. No, her problem is that sexual morality or the lack thereof still has such bearing on (women's) worth.

On these grounds, Valenti spends much of the book attacking those who enforce and reinforce the ties between morality and worth, a nebulous group she calls the virginity movement. Yet for all her concern with the problem of defining worth in sexual terms, Valenti spends little time considering why women have so persistently bought into such a warped system of valuation. Perhaps it seems easier to argue that the reason bad things keep happening to us is our lack of freedom, rather than a combination of the broken world we live in and our own fallible wisdom in navigating that world. Valenti is noticeably reluctant to concede the role of women's agency in shaping the lives we lead, except when our choices yield good or desired outcomes.

Toward the end of her chapter on trusting women, she writes: "Imagine a world where moral turpitude for women was based on our making decisions for ourselves … . Imagine a world where women had nothing to be ashamed of." Yet as grand and inspiring a picture as Valenti would conjure, she presumes women wouldn't ever make free decisions for which they might later feel shame or regret, even simply for failing their own standards.

Ignoring the cases where women's choices lead to bad outcomes means that Valenti never has to give much thought to the underlying motives. This is a fatal oversight, as failed efforts to stop the global drug trade underscore. Even if you can make the case that an activity or system of exchange is damaging and unhealthy, you are unlikely to actually change things unless you provide participants with an alternative means of satisfying the need or desire that drove their participation in the first place. Fail to give poor farmers an alternative crop with equal or better returns, and why should they give up growing something just because it seeds problems elsewhere? Likewise, even if women can see that basing our worth on sex, or letting it be so based, is bad for us and society, it will take a strong woman to give up a guaranteed, if ultimately unfulfilling, source of worth and attention for a healthier life that may lack precisely what made sex-based worth acceptable.

Contrary to what Valenti thinks, this is why Janie Fredell (of the True Love Revolution) told the New York Times, "It takes a strong woman to be abstinent." Sure, there's the strength it takes to keep a rein on your libido (especially hard in one's restless twenties), but there's also the strength required to watch male attention drift past you, perhaps for good, without giving in to the fragmentation of self that uncommitted sex entails.

Abstinence carries a profound uncertainty, which, if you practice it long enough, will bring you to the limits of any self-control predicated merely on hopes that someone will someday end your wait. At stake is not merely the possibility of marriage but life-fulfillment itself. As Jenny Taylor notes in her book A Wild Constraint: The Case for Chastity, "Personal completeness now means sex." As that logic goes, to die a virgin is to have led an incomplete, unfulfilled life. Couple this view of success in life with our tendency to assign worth in sexual terms, and it's not hard to see why most pan the course of chastity. Who would want to take such a gamble?

Yet in dismissing chastity, those of Valenti's persuasion jettison one of the best available means we have of critiquing unhealthy sexual mores and their broader societal implications. In A Wild Constraint, Taylor draws our attention to the political significance of chastity for Josephine Butler, a 19th-century suffragette who campaigned against practices that forced women into leveraging sex for survival, whether through prostitution or pragmatic marriages. Butler's fight for increased vocational opportunities for women was directly tied to her fight against what Taylor calls "sexual tyranny." "If women were able—and permitted—to contain their sexuality, not just women but society as a whole would benefit … . A sense of the body in its social context was a prerequisite of the fight for women's rights."

Today, at least in industrialized, Western societies, the battle is more for women's emotional and psychological health, as seen in Valenti's framing of the issue. Yet abstinence is no less useful or radical here. Only in chastity does one fight for integrity of personhood as worth more than the fleeting hit of attention earned with entrée to one's body. If we are not ready to grant men things like access to our bank accounts or power of attorney, why would we give them free run of our bodies? Such a choice becomes possible only with a divided sense of self, whereby some parts are more valued and secured than others.

As summarized by Valenti (though she seems to rely mostly on a 2004 report for Representative Henry Waxman and the "damning" quotes favored by opposition websites like the "No More Money" project), current abstinence education texts don't place sufficient emphasis on this message of integrity. Nor, in many cases, does church instruction do better. Taylor cites an interview with two British evangelical women, one of whom is involved with a UK affiliate of Campus Crusade for Christ. Though the women represent a different church culture than America's, their critique will be familiar to one who's spent much time around Christian singles. "It's all about small rules, not the big picture," one of the women tells Taylor. "You don't admit you struggle with how you play the Christian culture game. It's then easy to sin because there are such separate worlds."

How can those who believe there is a case and a place for chastity then improve on how we explain it? I believe there are two ways. First, as I have suggested above, the practice of chastity can play a valuable role in fighting our tendency toward fragmentation. However, I do not think the church, in particular, has done an adequate job of explaining the biblical view of sex and marriage in terms of whole-self giving. With ignorance of the overriding principles and purpose of sexual self-control comes a tendency toward technical adherence—following the letter of a misunderstood and disrespected law rather than the wise and lofty spirit of it. We must do a better job of rooting our understanding of sex in the character of God and his image-bearing purpose for mankind. People may wrestle with the question of his goodness, but an honest fight with the real issues is better than brushing God off as a daft and irrelevant uncle.

Secondly, we must stop speaking of abstinence as if it has no post-marriage value. The fact is, we are talking about self-control—a virtue that matters as much to marital monogamy as it does to premarital chastity. And those are just the sexual applications! But when all we tout is abstinence, rather than sexual self-control, the connection to all other spheres of healthy restraint is lost—and with it the urgency and relevance of being disciplined people, of being adults.

If we could stress these two things in our talk of chastity—wholeness and the importance of sexual self-control—we might be able, as Taylor says, to "re-cast our bodies in terms of charity and community, rather than … self-gratification." I'd like to think even the secular Valenti could see some good in that.

Anna Broadway is the author of Sexless in the City: A Memoir of Reluctant Chastity and a contributor to Faith at the Edge.

Thursday, August 20, 2009

Bultmann and Tillich: Same Birthday, Same Problem

By Fred Sanders
Scriptorium Daily

Two of the most influential academic theologians of the twentieth century share today, August 20, as their birthday: Paul Tillich (1886-1965) and Rudolf Bultmann (1884-1976). What an odd coincidence. I wonder if they ever celebrated it together.

Both men were prolific, and their theological projects were very different: Tillich was above all a theologian of culture, seeking to interpret the symbols with which people address the object of their ultimate concern; Bultmann earned his formidable reputation as a Bible scholar with a command of the history of religions and a facility with all available critical methods.

Both men were cross-disciplinary in ironic ways: Tillich was actually a kind of continental philosopher who believed himself to be a theologian (his Systematic Theology is in fact a systematic Christian ontology, in which Schelling’s concept of being is determinative for every part); while Bultmann was actually a systematic theologian with a definite, Heideggerian account of saving faith to proclaim, though he thought he was a historical neutestamentlicher doing objectively descriptive work.

There is one meaningful place where their work overlaps: in the absence of Jesus and the presence of Christ. Both of them taught that the actual man Jesus Christ was an artifact of bygone history, with nothing to offer to faith. But they also taught, or I should say they primarily taught, that the whole point of Christianity was an existential encounter with the spiritual presence of Christ in the here and now.

Bultmann and Tillich taught this in different ways, but for essentially the same reasons. Tillich’s account is more striking. He wanted to avoid direct contact between faith and history, to protect saving faith from the dangers of history. One of his close co-workers, Langdon Gilkey, reported that Tillich frequently remarked to his classes: “I do not wish the telephone in my office to ring and to hear from some New Testament colleague: ‘Paulus, our research has now finally removed the object of your ultimate concern; we cannot find your Jesus anywhere.’” Tillich wanted a gospel that could survive the non-existence of the historical Jesus, or even the discovery of his still-dead bones. So he taught that the Christ, which he described as “the New Being,” appears to us principally not in Jesus himself, but in the biblical picture of Jesus as the Christ. Even if Jesus were a fictional character in an imaginary story called the gospel of Mark, he could bring about salvation, according to Tillich. “Suppose,” said Tillich in 1966, “the bearer of the Spirit had another name than Jesus and did not come from Nazareth, and the New Testament picture of Jesus is essentially a creation of Mark…then Mark was the bearer of the Spirit through whom God has created the Church.”

Bultmann’s account is more complex, but it comes to the same thing. The Christ who saves is the Christ who is preached to you: “Christ (insofar as he affects us) is the kerygma, because he is the Christ only as the Christ pro me, and as such he encounters me only in the kerygma.” In an insightful analysis of what Bultmann meant by this kind of statement, James Kay has helpfully distinguished among three referents of the term “Jesus Christ”: For Bultmann, the first meaning of “Jesus Christ” (JC1) is a mythic persona described in the New Testament using available gnostic-redeemer categories: a Son of God who descends to save, a Messiah, a God-Man, etc. But the second meaning of “Jesus Christ” (JC2) is the historical figure, a man whose career at a particular time and place the New Testament narrates. Indeed, the New Testament wants to say that JC1 really is JC2, and there are all kinds of interesting demythologizings and remythologizings to do there. Bultmann’s thought here is very subtle and easily misunderstood, but pursuing it would be a digression. Essentially he thought the New Testament was right to express its faith using mythological categories like JC1, but that we moderns would be wrong to perpetuate those outworn mythologies. The most important thing, though, is JC3, the contemporary proclamation of the message about salvation through Jesus. When you hear JC3, you have it all, as you are called out by the wholly other into a field of existential decision: saved.

Is there any necessary connection between JC3 and JC2? No, Bultmann said. There really was a JC2 back in the day, but to look behind the word of JC3 for something like a JC2 who still matters today would be to try to keep knowing Christ “according to the flesh” rather than “according to the Spirit.” Another question, then: When JC2 died and was buried, did he rise from the dead? What rose from the dead and is present today, said Bultmann, is JC3. So he could say “Jesus Christ rose from the dead,” but he could say it in a way that would be utterly untroubled by the discovery of the bones of the still-dead JC2.

For evangelicals who glory in the fact that Christ is alive and present to us now, it is shocking to see Bultmann and Tillich take that truth and run off in a false direction with it. Their astonishing over-emphasis on a purely spiritual “Christ present to me now” is exposed as the blunder it is when you reflect on the relationship between the present Christ and the historical Christ. Anybody with a solid grasp of the real resurrection and ascension can affirm that the historical Jesus is the same person as the present Jesus. The cavalier dismissal of the historical Jesus (Never existed! Invented by Mark! Still in his grave! Irrelevant for saving faith!) cannot be countenanced.

It is tempting to call the Boys of August 20 a couple of liberals. As a term of abuse or as a warning label, that makes some sense. But as a historically descriptive term, “liberal” is not quite right. Both Bultmann and Tillich were intentionally setting themselves against the classic liberal theology of the late nineteenth century. Liberalism had worked out a different solution to this problem. The great liberals tried to take the historical Jesus (JC2) and stretch his influence all the way down to our time by celebrating the greatness of his personality and exploring the historical forces he had set loose in the world. Jesus is present now, they said in a thousand erudite ways, in roughly the same way other great men of the past are still present. A thinker like Adolf von Harnack could start his book on The Essence of Christianity by saying, “mankind cannot be too often reminded that there was once a man of the name of Socrates. That is true; but still more important is it to remind mankind again and again that a man of the name of Jesus Christ once stood in their midst.” That groaning sound you hear is JC2 being stretched two millennia to become JC3, still with no real resurrection and ascension in between.

Thus classic liberalism, against which there were many justified reactions: the smart fundamentalism of men like James Orr and J. Gresham Machen, and the neo-orthodoxy of men like Bultmann and Tillich. They all knew classic liberalism was not the same thing as biblical Christianity, and they all sought to overcome it with something that was more in line with the message of Scripture. Tillich and Bultmann, with their neo-orthodox account of the presence of Christ, staked out a position that was better than the view of classic liberalism.

That’s the best thing I can say about their position, even on their birthday.

Thursday, August 13, 2009

Romance Requires Showing Up and Staying Apart

By John Mark Reynolds
Scriptorium Daily

God loved the world so much that He came Himself and did not just twitter about His feelings, but God loved reason enough that He gave us a book and did not just overwhelm us with His irresistible presence. There is a lesson there for humans created in His image as they think about using new media.

Love demands closeness, but often the beloved needs distance too. A true Romeo would not wish to kill Juliet by his impatient demands. A good relationship, deepest love, uses reason to moderate passion: too much closeness smothers and too much distance makes hearts wander.

The new technology is a wonderful tool to provide a sort of closeness while keeping our distance. That is a good thing. We can immediately share information and some kinds of experiences through the new media, but with some space for physical safety. We can become close while hiding important things about ourselves.

This good thing can, however, become too much of a good thing. Like a theme park so concerned about danger that it wraps its guests in bubble wrap, an “online church” would allow too much hiding and too much distance. Being there is risky, but for love to grow some increased risk will eventually be necessary. One way of viewing faith is as a compromise between being totally safe and secure and the demands of consuming passion. Faith risks, but does so based on the best available evidence.

Love without reason is blind, because without reason it can accidentally harm what it loves. Unfettered reason is impotent, because it seeks impossible certainty. Faith carefully and humbly takes a risk on love even though it sees the outcome only dimly.

Faith is reason’s risk on love. Part of that risk will be on-line and part, in some cases, will be in being present: body and soul. I have met dear friends through email, but like the old pen pals of my grandparents’ era, new media relationships carry limitations with their opportunities.

There is nothing new in this for Christians, since we have always embraced coming together physically and a distancing technology: the book. The book, after all, allows the dead to speak to us without any psychic! However, we are also people of the “church” . . . a gathering of those who share a like-minded passion. Nobody can be a Christian and avoid books or fellowship.

These general truths should guide our approach to new media. It would be foolish to reject new tools to speak to those who are distant, but it would be equally foolish to abandon person-to-person ministry. It is good to have Facebook friends, but also friends in three dimensions.

Why bother with “real” friends when on-line communication can be so satisfying?

Physically being present is necessary, because we are not just disembodied heads! In the Bible story, Adam, the first man, could walk with the spirit of God every day in the perfect garden of the newly created world. He was without sin, but God still describes this perfect man as alone. God knew people need other people.

Our knowledge about God’s action teaches us even more, because God Himself did not remain distant. He wanted a closer relationship with His children and so became the man Jesus.

God did not stay far away or stand apart from what it means to be human. In the Incarnation celebrated at Christmas, God became human, and we could see in Jesus what a good and noble life looked like. That is not the whole story, however. God came, but He did not stay in the flesh.

Why did Jesus leave? After His resurrection, He was so remarkable that no one who saw Him could do or think of anything else. If He did not hide from our view, nothing would be left to us but cursing or worship. God stays as near as He can while allowing human choice to play out.

Romance requires showing up and Christianity is romantic. The love of God forces us to love each other and those in love cannot be kept apart for very long! It is nonsense to ask if Church could be exclusively “on-line,” because those who love their spiritual fathers, mothers, brothers, sisters, and friends will demand to see them.

Reason requires calm detachment from passion and Christianity commends the development of this emotional room to be reasonable. The divine Logic of God (John 1:1) demands that we use our minds. Human beings, unlike mere animals, are capable of restraining their passions for the good of those they love. It is folly to reject the information and communication that new media can provide as tools to the creation of authentic community. The Bible can be on-line, but Christians cannot be. We can only see or hear what is on-line.

Of course, we are at the very beginning of this technological revolution and the wonders have only begun. Last year Roger Overton and I edited a book of essays, The New Media Frontier, with a group of friends and colleagues to try to understand the promise and peril of new media. We are gloriously excited about what might be created by the human imagination and this only increases our desire to really be together with fellow creators!

Monday, August 03, 2009

“Sin Boldly”

By Fred Sanders
Scriptorium Daily

Today (August 1) in 1521 is the day Martin Luther wrote the advice, “Sin boldly. But believe even more boldly in Christ, and rejoice.” Protestants have been apologizing, and simultaneously not apologizing, for it ever since. To understand what it means, you have to be familiar with three things: Luther, Melanchthon, and justification by grace through faith.

It was the summer of 1521. Luther had grown a big beard and changed his name to Junker Jörg (that is, George the Knight). He was hiding out in one of Elector Frederick’s spare castles. He had been excommunicated by the Pope since January, and for the two months since the Diet of Worms, he had been under an imperial ban that made him an outlaw, wanted dead or alive. He made good use of the time, translating the New Testament straight from Greek to German – which was also against the law.

At the end of a letter to his friend Philip Melanchthon, Luther admonished Philip:

If you are a preacher of mercy, do not preach an imaginary but the true mercy. If the mercy is true, you must therefore bear the true, not an imaginary sin. God does not save those who are only imaginary sinners. Be a sinner, and let your sins be strong, but let your trust in Christ be stronger, and rejoice in Christ who is the victor over sin, death, and the world. We will commit sins while we are here, for this life is not a place where justice resides. We, however, says Peter (2. Peter 3:13) are looking forward to a new heaven and a new earth where justice will reign. It suffices that through God’s glory we have recognized the Lamb who takes away the sin of the world. No sin can separate us from Him, even if we were to kill or commit adultery thousands of times each day. Do you think such an exalted Lamb paid merely a small price with a meager sacrifice for our sins? Pray hard, for you are quite a sinner.

Pecca fortiter, sin boldly! It is a remarkable word, written by Junker Jörg holed up in the Wartburg and sent to the pusillanimous Philip back in a Wittenberg that was being torn apart by radicals. Heiko Oberman called this statement Luther’s “most provocative words,” and Philip Schaff called it “the boldest and wildest utterance of Luther on justification.” There is a long and ignoble tradition, among Roman Catholic polemicists, of seizing on this word from Luther and urging it as evidence that the whole Reformation was just a pretext for a wild party. “See? Luther himself wanted people to be bold sinners! Why should we bother listening to this guy?” That kind of shadow-boxing is its own reward.

But even Luther’s charitable interpreters have their work cut out for them with this Delphic utterance. G. C. Berkouwer asked, “Was Luther justified in speaking as he did? … It seems to us that Luther might have been more cautious. In an overheated theological atmosphere we should strive for intelligible formulations.” He goes on to quote a historian who warns that “It is rather a sentiment of the moment, and as such neither to be celebrated nor to be regretted but to be understood.” Berkouwer does his best to understand Luther at this point:

He does not say ‘Sin till you are blue in the face,’ or ‘Sin for all you’re worth,’ but ‘Sin bravely.’ With this word – whatever the libertine may do with it – he intends to exorcise the terror of the believer who has discovered some sin in himself and has now lost sight of the grace of God. An abundance of grace can subdue the power of sin… In order to signalize the superabundance of grace, he contrasts it – Luther is a vehement man – with a thousand sinful enormities a day. His intention is not to yield quarter to Antinomianism but to upset a construction which would make sin and grace of equal weight, and therefore he exhorts the sinner to have courage.

That’s good: “to upset a construction which would make sin and grace of equal weight.” Melanchthon was indecisive, and highly prone to attacks of conscience and scrupulosity. Whatever Luther said to him about grace, Melanchthon was likely to weigh it in the scales against the common-sense necessity to do good works and to maintain a clear conscience. Luther, always an instinctively contextual theologian, knew that he couldn’t let Melanchthon do that, especially at a time when he was under so much pressure. Melanchthon was in danger of collapsing the gospel back into the law, and of letting his conscience become an idol standing in the place where God’s promises should stand. To Melanchthon he said, “sin boldly.” But later, when combating real antinomianism, Luther preached nothing but the moral demands of the law:

Our view has hitherto been and ought to be this salutary one: if you see the afflicted and contrite, preach Christ, preach grace as much as you can. But not to the secure, the slothful, the harlots, adulterers, and blasphemers.

Twentieth-century Lutheran theologian Dietrich Bonhoeffer decided that he had to deal with the “sin boldly” motto right up front in his book Discipleship (previously known as The Cost of Discipleship). After his classic opening blast against cheap grace, he argues that some propositions are true as conclusions but would be false as presuppositions: “Grace as presupposition is grace at its cheapest; grace as a conclusion is costly grace. It is appalling to see what is at stake in the way in which a gospel truth is expressed and used.”

What does it mean for Luther to say: ‘Pecca fortiter, sed fortius fide et gaude in Christo’ – ‘Sin boldly, but believe and rejoice in Christ even more boldly!’? So you are only a sinner and can never get out of sin; whether you are a monk or a secular person, whether you want to be pious or evil, you will not flee the bonds of the world, you will sin. So, then, sin boldly, and on the basis of grace already given! Is this blatant proclamation of cheap grace carte blanche for sin, and rejection of discipleship? Is it a blasphemous invitation to sin deliberately while relying on grace? Is there a more diabolical abuse of grace than sinning while relying on the gift of God’s grace? Isn’t the Catholic catechism right in recognizing this as sin against the Holy Spirit?

To understand this, everything depends on how the difference between result and presupposition is applied. If Luther’s statement is used as a presupposition for a theology of grace, then it proclaims cheap grace. But Luther’s statement is to be understood correctly not as a beginning, but exclusively as an end, a conclusion, a last stone, as the very last word. … ‘Sin boldly’ – that could be for Luther only the very last bit of pastoral advice, of consolation for those who along the path of discipleship have come to know that they cannot become sin-free, who out of fear of sin despair of God’s grace. For them, ‘sin boldly’ is not something like a fundamental affirmation of their disobedient lives. Rather, it is the gospel of God’s grace, in the presence of which we are sinners always and at every place. This gospel seeks us and justifies us exactly as sinners. Admit your sin boldly; do not try to flee from it, but ‘believe much more boldly.’

Bonhoeffer’s exposition is perfect, but note the change he has slyly introduced: “Admit your sin boldly.” Pecca fortiter is not a plan of action; it’s a script for a prayer of confession. When confessing sins to God, don’t excuse your sins, minimize them, or treat them as fictitious. Things like that don’t need forgiveness, or at least not very much. Instead, identify your sins and state them boldly. Face the fact that you are not sin-free, and that, in yourself, you never will be. Keeping a perfect conscience is just not a realistic part of the Christian plan. Learning how to get daily forgiveness from God: That’s the plan.

Philip Schaff, who called pecca fortiter Luther’s boldest statement, also said that it couldn’t be used as an argument against his doctrine of justification. “It loses all its force as an argument against him and his doctrine, first by being addressed to Melanchthon, who was not likely to abuse it, and secondly by implying an impossibility; for the fortius crede [believe boldly] and the concluding ora fortiter [pray boldly] neutralize the fortiter pecca [sin boldly].” And then Schaff makes a truly insightful observation: “Paul, of course, could never have written such a passage. He puts the antinomian inference, ‘Let us continue in sin that grace may abound,’ into the form of a question, and answers it by an indignant me genoito (Rom. 6:1). This is the difference between the wisdom of an apostle and the zeal of a reformer.”

The only thing I would add to this, as neither a wise apostle nor a zealous reformer, is that I am learning something very valuable from Luther as my young children get a little older (the oldest is approaching double digits). It is very tempting for me to think that I have completed my job as a Christian father when I have taught my kids how to be good. I think it is literally a temptation: it would be a parental sin, a sin of the foolish variety, to launch my children into adulthood armed with nothing but the advice not to sin. What they really need is the knowledge of how to deal with sin and guilt as they all-too-predictably acquire them. I don’t want them to be blindsided by the fact that they are sinners, or uninformed about what to do with consciences that rightly condemn them. They need to learn the Christian skill of taking it to God, of walking in the light, of believing Christ boldly, rejoicing, and praying boldly.

======
A coincidental aside: It was also on this day, August 1, but seventeen years later, that Luther cheered up poor old Philip again by chalking these words on his table:

Substance and words: Philip.
Words without substance: Erasmus.
Substance without words: Luther.
Neither substance nor words: Karlstadt.