Betty Smartt Carter

Two novels about finding—or failing to find—a structure of meaning in the mess and confusion of our lives.

I like Tim Farrington’s new novel, Lizzie’s War, but not because its Vietnam-era story is unique or surprising. Those were turbulent times, and turbulent times sweep writers up in their wake. Many novelists have now written about the struggles of soldiers in an unpopular, possibly unjust, war. What makes Lizzie’s War distinctive is the way it treats the battlefield and the homefront as parallel and equal fields of conflict. A soldier’s bitter struggles are no less grueling than his wife’s struggles at home. Eventually the marriage itself becomes a third battlefield, where husband and wife have to fight their way through years of distance and resentment in order to save their union.

Lizzie's War: A Novel

Tim Farrington (Author)

384 pages

$25.11

The Missing Person

Alix Ohlin (Author)

304 pages

$8.98

Farrington paints an emotional picture of the O’Reillys, a Marine family so absurdly loyal that they celebrate the anniversary of the Marine Corps’ founding every year with a green cake and a blaze of candles. Captain Mike O’Reilly is in Vietnam, leading his first solo company through some of the worst combat of the war so far. His son Danny knows Corps lore backward and forward and imagines battle in a glow of glory. For Danny’s mother, though, war is a rival, a cruel mistress that claims her husband, body and soul. What Liz O’Reilly fears most in life is a knock on the door, Marines in dress greens on the front step, waiting to tell her that Mike is dead or wounded. She’s always listening, always clinging to the edge of the normal and familiar.

On the other side of the world, Mike sees all the war’s absurdities: men dying because of bad decisions, hard-won victories negated by politics. Yet he knows that it’s his calling to lead men into battle, and he knows he’s good at it. How can he put his true thoughts into words in a letter home? How can he sum up the horror of war and his own mixed feelings without distressing his wife? He chooses to shield her from the truth, writing breezy letters that downplay the danger he’s in. To Liz, Mike’s letters become tokens of his infidelity.

The tables turn, though, when Liz finds herself in a parallel battle that she can’t share with Mike any more than he can share the war with her. Doctors tell her that the child she’s carrying won’t come to term. She decides to continue with the pregnancy anyway, but eventually gives birth to a daughter who only lives a little while. Without Mike there to support her, Liz depends on the comfort of a young priest, Father Germaine, another Vietnam vet with his own share of cruel memories. Germaine wants to love Liz but has to be content to do the work of God, giving last rites to Liz’s child and taking her older children under his wing.

A photograph of his own dying daughter in a young priest’s arms leaves Mike both anguished and jealous. It turns out that the hardest loss of all is the one he’s not there to share. And when he finally comes home, he still has a long way to travel into his wife’s heart. Sitting on a calm beach, Mike and Liz turn from their separate battles and face each other in battle posture, Mike protecting himself, Liz becoming the aggressor. She finds a new scar on his leg (one of many) and feels a flash of rage:

Her husband’s damaged body shocked her still—this changed, scourged, compromised thing he had brought home to her, flesh of her flesh. She had a sense of ongoing violation and even, strangely, of jealousy, at his wounds, at the violent intimacies of them, in which she had played no part.

She thinks of Mike’s refusal to talk about the war, she thinks of their dead daughter, and then she bares her anger (and fingernails) and gives him a wound of her own. “I want to hear the truth,” she says while he examines his bloody leg in awe. “That’s the point. I just want to hear the truth.”

As in his previous book, The Monk Downstairs, Farrington weaves themes of faith and divine love into the texture of his novel. The O’Reillys are not only a military family but also a devoutly Catholic family, intent on fighting for what’s good in the world, whether it’s the preservation of a company of weary soldiers or the sanctity of a child’s life. It’s their almost innocent impulse toward goodness that makes their story so meaningful. And Lizzie’s War is nothing if not “meaningful”—sometimes (I hate to say it) too much so, as Farrington dives headfirst into his narrative, letting us know exactly how and what to feel about his characters’ embattled lives.

Still, it’s good to believe that stories (and lives) mean something, even if the writer occasionally gives in to sentiment. A writer who has no patience for anything sentimental is Alix Ohlin, author of the debut novel The Missing Person. Ohlin writes about some sad things—a lost father, an estranged mother and son, an act of vandalism that ends in a tragic death—yet the style of The Missing Person is as dry as the Albuquerque landscape where it’s set.

Lynn Fleming is a floundering graduate student in New York. She’s trying to get on with her dissertation about feminist art and modernism but can’t seem to find the right jumping-off point. Her life consists mostly of sitting around depressed in Brooklyn and occasionally sleeping with her married academic advisor, who’s alternately sick of her and solicitous.

When Lynn’s mother calls to ask for help with her brother Wylie, an environmentalist crusader who’s recently gone off the deep end, Lynn heads home to Albuquerque. She’s immediately sorry she did. Her mother annoys her; Wylie considers her the enemy; and she’s bothered by memories of her dead father, the “missing person” whom she never really knew.

Two things happen that change Lynn’s attitude toward Albuquerque. First, in her mother’s house she discovers old paintings by a talented local artist and decides that this woman may be a hidden genius ripe for discovery—a catalyst for Lynn’s long delayed doctoral work. It fascinates her that her own father purchased the paintings thirty years before. Had he known this woman? Could he have been her lover? If Lynn can find the artist, will she reveal some secret knowledge that explains (i.e., resurrects) Lynn’s father?

While pursuing that mystery, Lynn begins an affair with one of Wylie’s friends, a plumber named Angus, who inducts her into the strange world of eco-terrorism. Getting involved with a small band of radicals, she discovers a comradeship and feeling of community that’s always eluded her. For a little while, it seems as if the different threads in Lynn’s life will tie together (providing us with a satisfying plot), but this illusion of continuity fades when Lynn sees things for what they are: her radical friends as ineffectual and self-centered, her search for her father in a feminist artist’s work as mostly wishful thinking. What she gains from her adventures is neither redemptive knowledge of the past nor meaning for the present; at best she gets a greater degree of self-knowledge, and maybe the beginnings of a better relationship with her mother and brother.

How does Ohlin want us to feel about all of this? Given her detached presentation, it’s hard to know. She has a poetic style and a love of odd details that bring people to life. But in making her characters so palpable, she sometimes makes them repulsive (Angus always smells like ammonia, yet we’re supposed to believe that Lynn feels madly attracted to him). In refusing to manipulate her story, Ohlin leaves the reader with too little. The most heroic character in the book, Wylie, stays too far from the center: we finish without ever really understanding what his beliefs mean to him. Like his sister, we feel disconnected from him and discover little to change that.

Which leads me to why, in a funny way, The Missing Person works. Set in a desert landscape, it mirrors the world’s desertion of its heroine. Lynn lacks a father, a close family, a community, and for each of those losses receives something less than what she wants. If her story doesn’t seem to mean much, that may be the way the author wants it. By leading us toward mysterious connections between people and events, only to abandon them, Ohlin seems to suggest that things are fundamentally unconnected—that although we long for a deep structure of meaning that links the living and the dead, the past and the present, in some enduring fashion, our hope is in vain, and perhaps we’d best begin to acknowledge the bleak truth and get on with our lives.

As an opponent of forced meaning in novels, I sympathize with Alix Ohlin. I like her dry style and her lack of sentiment. She seems less predictably optimistic, less manipulative with our emotions, and therefore more honest than many writers. On the other hand, isn’t storytelling all about finding relationships between things? Isn’t that why we write and read novels—to prove to ourselves and each other that the world means something?

It’s not just sentiment, I think, that makes a novel like Lizzie’s War ultimately more appealing: It’s the author’s hopeful vision. Farrington portrays life not as we experience it, but as it looks beyond our experience, in a place where events and people do tie together in mysterious and even sacred ways. That transcendent viewpoint trumps even style, though we can always hope (being foolishly optimistic, I guess) to find more novelists who will give us both.

Betty Smartt Carter is a novelist who lives in Alabama.

James L. Guth, Lyman A. Kellstedt, John C. Green, and Corwin E. Smidt

Religion and the Bush Doctrine.

During the past four years a growing number of political analysts have connected the emerging “Bush Doctrine” in foreign policy to the influence of evangelical Protestants. For example, one recent review claimed that

The influence of Christian evangelicals now extends to many essential matters of foreign policy, quite apart from the Middle East. Dogmatic, unilateralist, and radically nationalistic, this influence ignores international law and is particularly hostile to international organizations.1

Indeed, it is hard to find a critique of administration foreign policy in publications such as The New Yorker, The New York Review of Books, the Atlantic, or the New York Times without a similar complaint.

Such assertions arise in part because of perceptions that conservative evangelicals are involved in virtually every aspect of American politics, from campaigning for George W. Bush in the 2004 election to mounting the recent “Justice Sunday” rally backing the president’s judicial nominees. What is missing, however, is any systematic evidence that evangelicals—or other religious communities for that matter—actually support or oppose the Bush Doctrine.

In fact, such assertions fly in the face of much of the existing research. Scholars have found little evidence that religion is a major factor shaping public attitudes toward foreign policy. True, a few researchers (including the authors) have shown that religion is a powerful predictor of attitudes toward the Israeli-Palestinian conflict, and once contributed to anti-communist sentiment, probably stiffening America’s posture toward the former USSR.2 But that was about it. Has the situation really changed? Is religion now influencing the public’s understanding of the United States’ role in the world?

To answer these questions, we use the fourth quadrennial National Survey of Religion and Politics, conducted at the University of Akron in the spring and fall of 2004 and sponsored by the Pew Forum on Religion and Public Life. This survey of a national random sample of 4,000 respondents asked a range of religious questions seldom available in other surveys and, fortunately, also had a large battery of foreign policy questions. It is just this sort of evidence that has been lacking in the debate over the role of religion in foreign policy.

To go to the core of recent arguments, we examine the backing that America’s diverse religious communities provide for the Bush Doctrine: the president’s stress on military strength, a preference for unilateral rather than multilateral action, a willingness to engage in pre-emptive war (as in Iraq), and a tilt toward Israel in the Middle East.3 To measure this support, we use five items: an approval rating for Bush administration foreign policy, an assessment of whether the Iraq war was justified, whether pre-emptive war is ever justified, whether the United States should stress unilateral or multilateral action in international affairs, and, finally, whether America should favor the Israelis over the Palestinians. Although these questions tap different aspects of foreign policy, people respond to the package in consistent ways. In the jargon of social science, the questions scale nicely, forming a single dimension.4

To simplify presentation in the accompanying table, we report the percentage of each religious group that falls in the top half of public support for the Bush Doctrine. Thus, a score above 50 percent marks a group more favorable than average; a score below 50 percent, one more opposed.
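
For readers who want the mechanics, here is a minimal sketch of how such an additive scale, its reliability, and the "top half" percentages might be computed. This is an illustration only, not the authors' code; the item names and data frame are hypothetical.

    import pandas as pd

    # Hypothetical names for the five survey items, each coded so that
    # higher values indicate stronger support for the Bush Doctrine.
    ITEMS = ["fp_approval", "iraq_justified", "preemption_ok",
             "unilateralism", "favor_israel"]

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # Standard reliability formula: alpha = k/(k-1) * (1 - sum of
        # item variances / variance of the summed scale). Footnote 4
        # reports .75 for the authors' five-item scale.
        k = items.shape[1]
        return (k / (k - 1)) * (
            1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

    def pct_top_half(df: pd.DataFrame, group_col: str) -> pd.Series:
        # Percent of each religious group scoring above the full-sample
        # median on the summed scale, the table's reporting convention:
        # above 50 means more favorable than average.
        scale = df[ITEMS].sum(axis=1)
        top = scale > scale.median()
        return df.assign(top=top).groupby(group_col)["top"].mean() * 100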

The religious groups listed represent, first of all, America’s historic religious traditions. Although recent critiques of administration policy have concentrated almost exclusively on evangelicals, other traditions may also have distinctive foreign policy views. And, as every student of American religion knows, there are bitter theological divisions within the major religious traditions. We have therefore divided evangelicals, mainline Protestants, and white Catholics into traditionalists, centrists, and modernists based on adherence to classic Christian orthodoxy.

Finally, for all the religious categories, we look at three distinct subgroups: citizens (all respondents), voters (those who voted for president in 2004), and activists (voters who also engaged in at least two other political activities) to consider the impact of greater political engagement on attitudes.

We find vast religiously correlated differences among citizens in support for the Bush Doctrine. As the first column shows, Latter-day Saints are most positive, with 82 percent falling in the top half of the scale. Aside from the Mormons, evangelicals as a group do, in fact, provide disproportionate backing for the president’s policies, as critics contend. Interestingly, Hispanic Protestants, largely evangelical in theology, also exceed the sample average. Mainline Protestants follow, barely scoring on the positive side, and white Catholics are split right down the middle. Virtually all other religious groups (including Jews) are much less favorable toward administration policy, with black Protestants, the agnostic/atheist coterie, and other non-Christians (Muslims, Hindus, Buddhists, etc.) concentrated toward the bottom of the scale.

In addition to these differences among traditions, we find striking divisions within the larger Christian traditions. In each, traditionalists are most in favor of the Bush Doctrine, centrists less so, and religious modernists dissent in large numbers. (The same divisions can be seen even within smaller traditions, such as Hispanic Protestants and Catholics, Jews, and black Protestants, but the sample numbers are too small to report with confidence.) Altogether then, both membership in a religious tradition and theological traditionalism within the Christian traditions have important consequences for foreign policy attitudes.

Of course, politicians don’t exhibit equal solicitude for the views of every citizen: they are much more attuned to voters and, especially, to political activists. The scores for voters (column 2) and activists (column 3) reveal some interesting findings. Voters overall are actually a bit more supportive of the Bush Doctrine than the citizenry at large (52 percent), but activists are less favorable (only 44 percent).

The same basic religious patterns hold among voters and activists that we saw among citizens generally, but with some important modifications. First, for both evangelical and Catholic traditionalists endorsement of the Bush Doctrine rises as political engagement increases. (Among mainline traditionalists it goes up among voters, but retreats among activists.) In contrast, for mainline and Catholic modernists the president’s backers decline in strength as engagement increases, a pattern that also appears among the smaller faiths, and especially in the secular and agnostic/atheist groups. Thus, religious divisions over foreign policy exhibited by citizens generally are even wider among voters and, especially, activists.

The cause of these patterns is a complicated issue that we cannot fully address here. We can, however, identify three important factors, all of which have some influence. First, there may be a doctrinal basis for these differences. Thus, evangelicals’ distinctive posture may reflect the influence of dispensational theology, biblical literalism, Christian exclusivism, or perhaps moral dogmatism—”black or white” thinking. Conversely, the absence of such beliefs—or the presence of liberal religious or secular perspectives—may explain opposition to the president’s policies.

Second, religious leaders may have directed their flocks toward or away from the Bush Doctrine. Here, too, evangelicals provide a good example, given the strong support many clergy voiced for the Iraq war, and their suspicions about international institutions such as the United Nations. On the other side, the criticism that many mainline and Catholic clergy, including the Pope, directed toward facets of the Bush Doctrine may have attenuated support in those communities, at least among those hearing the cues.

Finally, foreign policy attitudes may simply be an artifact of partisanship and ideology. For example, evangelicals are a core GOP constituency and naturally endorse policies adopted by their conservative president and party leadership. Other religious groups may react in much the same fashion, depending on their own location in the current party lineup. In this context, it is worth noting that support for the Bush Doctrine matches very closely the share of the vote each religious group gave the president in 2004.

We did a modest test of these possibilities by incorporating measures of religious doctrine, attention to religious cues, and partisanship into a statistical analysis. All else being equal, religious doctrine makes a substantial contribution to support for the Bush Doctrine: Biblical literalists, dispensationalists, believers in the existence of Satan, and those who see salvation exclusively in Jesus score higher on the scale. And moral dogmatism plays a role: citizens who argue that there is a single standard of right and wrong for all times and places are much more likely to support the president.
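
As a rough sketch of what such an "all else being equal" test looks like, one might fit a multivariate regression of the scale on the doctrine, cue, and partisanship measures. The variable names and synthetic data below are invented for illustration; this is not the Akron survey or the authors' specification.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 1000  # synthetic respondents, purely illustrative

    df = pd.DataFrame({
        "literalist": rng.integers(0, 2, n),        # doctrine measures
        "dispensationalist": rng.integers(0, 2, n),
        "moral_dogmatism": rng.integers(0, 2, n),
        "hears_war_cues": rng.integers(0, 2, n),    # clergy cue measure
        "republican_id": rng.integers(0, 2, n),     # partisanship
    })
    # Fabricated outcome so the example runs end to end.
    df["doctrine_support"] = df.sum(axis=1) + rng.normal(0, 1, n)

    # Each coefficient estimates that factor's contribution to support
    # for the Bush Doctrine with the other factors held constant.
    fit = smf.ols("doctrine_support ~ literalist + dispensationalist"
                  " + moral_dogmatism + hears_war_cues + republican_id",
                  data=df).fit()
    print(fit.summary())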

Religious cues also make an independent contribution, largely reflecting the policy stance of the “governing” authorities in each tradition. Among evangelicals, those whose ministers preach on the Iraq war and terrorism are more supportive of the Bush Doctrine, as is the case among Hispanic Protestants. Among virtually all other religious groups—including mainline Protestants and white Catholics—those hearing pastoral discourses on these topics are less supportive of the president, sometimes substantially so. This effect is often even greater among political activists than among voters.

Finally, partisanship and ideology also shape assessments of the Bush Doctrine, even aside from the impact of religious doctrine and leadership cues: Republicans and conservatives score high on the scale, Democrats and liberals, much lower. In this regard, remember that President Bush actively courted evangelicals and other traditionalists before and during the 2004 campaign, and foreign policy was part of the pitch. Senator Kerry and the Democrats countered with appeals to other religious groups, apparently with some success.

In sum, American religious groups—and not just evangelicals—do indeed hold distinctive views on the Bush Doctrine. Evangelicals and traditionalists of all sorts are the strongest adherents, while the non-religious, religious modernists, and minority faiths are the most negative. These divisions become sharper as political engagement increases, and theology, religious leaders, and political identifications all play a role in deepening the chasm.

Our findings challenge the conventional wisdom on religion and foreign policy attitudes. Religion was strongly linked to such attitudes in 2004, in a pattern very similar to that seen in the presidential vote. Indeed, support for the Bush Doctrine appears to be another part of the much-discussed “polarization” of American politics. Were these patterns to persist beyond the Bush administration, they would represent a significant change in American political alignments.

Our findings raise even more provocative questions for Christians seeking to live out their faith in the world. Does the Bush Doctrine reflect the goals of security, peace, and democracy, as the president insists, or does it embody unilateralism, aggression, and religious zealotry, as his critics claim? Can pre-emptive and anti-terrorist wars be consistent with the Christian concept of a just war, or are they destructive without the hope of redemption? Is the United States dedicated to spreading God’s gift of freedom to all humankind, or has it wrongly identified its own selfish interest with Divine Purposes? Do traditional proponents of the Gospel now worship the nation state, or are the modern interpreters of faith incapacitated in the face of evil? These are questions for all Christians to ponder.

James L. Guth, Lyman A. Kellstedt, John C. Green, and Corwin E. Smidt have collaborated on many projects, including (with Margaret M. Poloma) The Bully Pulpit: The Politics of Protestant Clergy (Univ. Press of Kansas).

1. Brian Urquhart, “Extreme Makeover,” New York Review of Books, February 24, 2005, pp. 4-5.

2. See our Religion and the Culture Wars (Rowman and Littlefield, 1996), chapter 17 for further discussion of the links between religion and foreign policy attitudes.

3. The Bush Doctrine is summarized in James M. McCormick, “The Foreign Policy of the George W. Bush Administration,” pp. 189-223 in Steven E. Schier, ed., High Risk and Big Ambition: The Presidency of George W. Bush (Univ. of Pittsburgh Press, 2004).

4. For the cognoscenti, the alpha reliability coefficient for the scale is .75.

John Wilson

Writers are a thrifty lot. They like nothing better than to salvage an old piece for a new readership. (Sometimes even for the same readership, as Erle Stanley Gardner did with bits of boilerplate in his Perry Mason books.) The latest moves in the stem cell debate reminded me of something I wrote for Christianity Today several years ago. I am using it again here, slightly altered.

Remaking Eden: Cloning and Beyond in a Brave New World

Lee M. Silver (Author)

317 pages

$2.72

In 1997, when a mention of embryonic stem cells would elicit blank stares from all but a handful of readers, a Princeton University biologist, Lee Silver, published a remarkable book that addressed head-on the issues raised by the prospect of “engineering life.” The title of Silver’s book is instructive: Remaking Eden: Cloning and Beyond in a Brave New World. Let’s be done with the old superstitions that would have us in thrall to a fictitious divine Creator. We—human beings—are as gods, and we had better get on with the job.

Silver must be a superb teacher; his explanations of “reprogenetic technologies” are exceptionally lucid. He is certainly an unabashed enthusiast, exulting that “we have gained the power to control the destiny of the species,” and he impatiently dismisses the fears and moral scruples that might hinder the march of “research” in any way. Hence his book offers an invaluable opportunity for the reader to see these issues through the eyes of the typical mainstream scientist, whose collective authority our national opinion-setters invoke in countless references to that infallible oracle, “science.”

After all, as Silver remarks while brushing aside arguments from the Vatican about the status of human embryos, “Most people do not want to admit that their views are based on spiritual beliefs because in an advanced technological society like ours, with its foundation in science, arguments based on faith alone are not given much credibility. Scientific arguments are required for a cloak of respectability.”

And to have a little fun, to tickle knowing secularists and provoke hidebound believers, Silver introduces his first chapter with an epigraph from Genesis 1 and begins his epilogue with a verse from Revelation: “I am Alpha and Omega, the beginning and the end, the first and the last.” Clever!

But surely this mocking appropriation of religious language is rare in serious science writing? Well, no. Of course there are many scientists who don’t go in for that sort of thing, but many others relish the opportunity to take a jab at the pious and the faithful.

Such mockery can turn up in the most unexpected places. Consider, for example, the extremely influential 1966 book, Adaptation and Natural Selection, by George Williams, one of the preeminent evolutionary biologists of our time. Unlike Silver’s book, written for a popular audience, Adaptation and Natural Selection was intended in the first instance for Williams’ peers and students. Here is the very last paragraph in his book:

Perhaps today’s theory of natural selection, which is essentially that provided more than 30 years ago by Fisher, Haldane, and Wright, is somewhat like Dalton’s atomic theory. It may not, in any absolute or permanent sense, represent the truth, but I am convinced that it is the light and the way.

This lightly mocking appropriation of Scripture ends the book on an urbane note: no blunderbuss blast at the dunderheaded creationists but rather an artfully ironic allusion that flatters the reader: We’re in the same club, you and I.

Not all appropriations of religious language in science writing are intended to mock. In the same year that Lee Silver’s Remaking Eden appeared, the distinguished cosmologist Lee Smolin published a book called The Life of the Cosmos. Smolin is not a religious believer. In his conclusion, he compares the universe to a city, “an endless construction of the new out of the old. No one made the city; there is no city maker, as there is a clockmaker.”

It follows that “there never was a God, no pilot who made the world by imposing order.” But this is no cause for despair, for existential angst: “Nietzsche now also is dead,” Smolin writes. Instead of brooding, he wants us to celebrate the evolutionary “logic” at the heart of the cosmos: “the logic of life is continual change, continual motion, continual evolution.”

Still, Smolin is not as naïvely utopian as Silver. Here are the last sentences of Smolin’s book:

All we have of natural law is a world that has made itself. All we may expect of human law is what we can negotiate among ourselves, and what we take as our responsibility. All we may gain of knowledge must be drawn from what we can see with our own eyes and what others tell us they have seen with their own eyes. All we may expect of justice is compassion. All we may look up to as judges are each other. All that is possible of utopia is what we make with our own hands. Pray let it be enough.

Pray? And to whom shall those prayers be directed?

Books & Culture is about to celebrate a notable birthday. Ten years ago, the July 17, 1995 issue of Christianity Today featured a preview of B&C, with articles and reviews by Roberta Bondi, Frederica Mathewes-Green, Mark Noll, Cornelius Plantinga, Jr., Will Willimon, and Philip Yancey. In September of that year, the first issue of the new magazine appeared. Flash forward a decade. With our next issue, September/October 2005, we’ll mark the magazine’s 10th anniversary. We’re planning a feast—a feast of words and pictures—for that occasion, and we’d love to include reflections from our readers on those first ten years. But don’t delay. By the time you receive the issue you’re reading, we’ll already be well into the editorial cycle for September/October. You can reach us via email at bceditor@booksandculture.com.

Eugene McCarraher

Meet Mark C. Taylor, the virtuoso of Nietzschean boosterism.

Say you’re a theologian in the religion business who’s concluded that your company’s oldest and most trusted product doesn’t really exist. What do you do after the death of God? You could lie to the customers and stockholders, continue writing copy, and ruefully await retirement. But if you’re more imaginative, you could turn your crisis into an opportunity, as consultants like to say, and spin God’s death as a new form of life, an “entrance of divinity fully into the human.” You could “re-tool” and jump to another firm—English, philosophy, or perhaps a start-up in something-or-other studies.

Or, like Mark C. Taylor, you could become an entreprofessor, a broker in the emerging intellectual markets, trading in some of the hottest stocks in cultural capital. Pooling your dwindling fortunes in theology and philosophy with venture capital from postmodernism, you nimbly navigate the volatile and bubbling markets in profundity, hang out with the rich and famous, and after a while you’re a pioneer in internet education, adulated in the Sunday New York Times. As long as the bubbles don’t burst, and as long as the old business doesn’t revive, you’re as safe as a tenured academic—which, of course, you are already.

Lest anyone recoil from these remarks as ad hominem, consider that Taylor himself (a professor of humanities at Williams College) blithely endorses the opportunism they evoke. “For the canny player,” he writes at the end of Confidence Games, life is a “game of poker.” Since “one can never be sure that the chips can be redeemed”—redemption being a game for losers and suckers—”the best strategy is to keep the game going as long as possible.” And since the game is, according to Taylor, increasingly played as a “complex adaptive system” of global information networks, “we” need to “cultivate an appreciation for the resources and limitations of many religious traditions.” In short, don’t invest too heavily in any one company—diversify. You can’t be saved, but you’ll live longer.

That’s a sinister notion, however winsome Taylor makes it seem, and it epitomizes the super-cool sophistry that characterizes Confidence Games. A consummate bourgeois bohemian (or Bobo, to borrow from David Brooks), Taylor embodies the merger, or perhaps the now-unmistakable fraternity, of countercultural iconoclasm and late-capitalist business culture. (Call it the hipness unto death.) So even if it’s read only by a few thousand academics, Confidence Games is symptomatic of the tony nihilism that pervades the American professional and managerial classes.

Taylor tells us that Confidence Games marks the culmination of his search for a “philosophy of culture” after God’s demise. Not that he ever accepted the “secularization” narrative now discredited among scholars: in Erring (1984), he advanced an “atheology” that rejected transcendent divinity and relocated it in the processes of nature and culture. “Religion never disappears,” he now writes; rather it “takes different forms.” Those forms, he contended in Imagologies (1994) and The Moment of Complexity (2001), are increasingly those of deconstruction, cybernetics, and advanced communications technology. For Taylor, telecommunications represents an End of History without Francis Fukuyama’s mandarin disdain for the Last Man’s consumerism. “The net,” he proclaimed in Imagologies with a characteristically aphoristic flourish, “wires the world for Hegelian Geist.”

That geist has been decidedly capitalist since 1989, and it moved Taylor, in the late 1990s, to join with the financier Herbert Allen in founding Global Education Network (GEN), an online “service provider” of education in the humanities and sciences. A century after Thorstein Veblen scoured the Victorian pecuniary pieties regnant amidst “the higher learning,” Taylor represents a brash new style in academic profiteering, fusing post-Christian religiosity with the glitzy banalities of info-capitalism. As Taylor observes, “the distance between Haight Ashbury and Silicon Valley is not as great as it often appears.” Tom Frank has labeled this sort of convergence “the conquest of cool,” and it names the transformation of Hip into the latest style of corporate hegemony.

Confidence Games is certainly brisk and truthful enough to seduce the uninformed or the intellectually fashion-conscious. Religion, art, and economics form an “intricate interplay.” (Stroke chin, furrow brow, don’t insist on distinctions.) Since religion “never disappears” but assumes new forms, God has, in our time, been “reborn as the market.” Once ploddingly industrial and material, that market has evolved “from a manufacturing to an information economy” driven by computerized networks of growing speed, complexity, and scope. These vertiginous webs of information comprise a “self-organizing system” whose incessant metamorphosis generates its own forms of chaos and order. Drawing on “complexity studies,” recent financial theory, and investor-philosophers like George Soros, Taylor argues that markets are not arenas of individual agents but elastic, adaptable, and “self-reflexive” webs of interrelationship.

All attempts to impose order from outside, whether political or religious, will inhibit and pervert the creative energy that surges through life, and especially through contemporary capitalism. Indeed, the very instability and evanescence of the info-capitalist order can enrich our lives if we embrace rather than fear uncertainty and ephemerality. If we welcome unreservedly the volatility of financial and technological change, life can be “a confidence game in which the abiding challenge is not to find redemption but to learn to live without it.” Postmodern jouissance and market calculation form a lucrative partnership, sponsored by Hayek and Nietzsche. As Zarathustra spoke, “joy is everlasting flow,” and economics morphs into a gay and not a dismal science.

It’s hard not to swoon to Taylor’s often lissome and buoyant prose, and there’s much to savor in this exuberant affirmation of contemporary capitalism. We get elegant and illuminating accounts of window design, Times Square architecture, the origins of NASDAQ, and the fall of the gold standard. We follow the intellectual history of markets from John Calvin and Adam Smith to Robert Shiller. We witness the conflation of aesthetics, providence, teleology, and economics in Hutcheson, Kant, Schiller, Hegel, Marx, and Simmel. We trace the cultural history of money, from its religious origins, to its psychoanalytical status as projected anality, to its accelerating dematerialization. What’s more, Taylor is savvy enough to know his derivatives from his IPOs. Like the protagonist of William Gaddis’ JR—who’s cited at the outset of almost every chapter—Taylor has “picked up the lingo of investors, traders, and LBO experts.” (That’s leveraged buy-out, for you incognoscenti.)

But Taylor is most engrossing when he borrows, not from neo-disciplines like “complexity studies,” but from vintage thinkers who’ve already explored the religious nature of economics. He covers the work of Georges Bataille, whose ruminations on gift exchange and unproductive expenditure still don’t receive their due. He elucidates Marx’s perception of money as the perpetually mobile “god among commodities,” who bestows, mediates, and incarnates value. It was the historical materialist, Taylor reminds us, who discerned that money was fundamentally “a mental form,” irrelevant and even hostile to materiality. And Taylor reprises Simmel’s Philosophy of Money (1900), a ponderous masterpiece in which money becomes “a coincidentia oppositorum worthy of Nicholas of Cusa,” reconciling all worldly oppositions and establishing “an inconceivable unity of being.”

Alas, Taylor has also inhaled deeply of the intellectual helium that inflates the “New Economy,” and a fusillade of obfuscation permeates the text. There’s the fustian of “complexity”: “growing complexity,” “increasingly complex,” “infinitely complex.” (“Complexity” and its cognates rank with “interesting” and “problematic” as tokens of mediocrity.) There’s the Everything-Has-Changed School of Portentous Social Commentary: “When computers are networked, everything changes”—they get, well, more complex. Did you know that reality and image are becoming indistinguishable? Taylor muses, without a trace of facetiousness, that “the Vegas copy [of Venice] might prove to be as good as, if not better than, the European original.” (Think about sex in this regard—that’s all I’m writing.) There’s even a We-Learn-from-the-Students scene, as Taylor and some dot-com whiz hang with undergrads from The-Kids-Are-Alright Academy, replete with backpacks and bottled water.

This is brain candy for middlebrows, and it makes Confidence Games a learned but utterly conventional artifact of New Economy literature. Other samples include Ray Kurzweil’s wonder at “spiritual machines”; George Gilder’s libertarian celebration of info-technology as “the overthrow of matter”; Michael Lewis’ paeans to youthful and avaricious techno-geeks; Bill Gates’ heralding of “friction-free capitalism.” Heir to Italian Futurism and ’60s “social forecasting,” it’s a heady and intoxicating genre, swelling the imagination with a frisson of movement, a synaptic romance of micro-circuitry, and a utopian promise of global communion. Bogus but beguiling, it demonstrates that an unrequited eschatological hope has always fueled the engines of accumulation.

Yet if Taylor had his way, it would remain unfulfilled, for in good postmodern fashion he rejects any final resolution or transcendence of our historical maelstrom as an authoritarian closure of possibility. Taylor’s essentially neo-liberal affirmation of financial markets as self-reflexive and self-correcting reads, to me, like a micro-waved rehash of Smith and Hayek, and it dovetails nicely with his nihilist celebration of immanence as the unbounded play of desire. So Augustine, it seems, was wrong to desire a home for his restless heart. “The endless rustle of desire,” Taylor writes, renders impotent and illusory all attempts at salvation and permanence. Thus can Taylor affirm the plenitude of life, and conclude with a rapturous vista of middlebrow gusto: “the interplay of light and darkness,” he writes, is “inescapably disruptive, overwhelmingly beautiful, and”—you guessed it—”infinitely complex.”

What does it take to write with such insouciance about failure, suffering, and death? I don’t think it’s flippant to respond: tenure, medical insurance, and a pension, the oblivious possession of which provides the Bobo set with security to neglect some intractable material and social realities. Focusing on the “financial-entertainment complex” (a nice phrase, I must admit), Taylor obscures the persistence of the military-industrial complex, which remains indispensable and even central to the corporate economy. It’s simply not true that information has “displaced” manufacturing: motion pictures, radio, and amusement and recreation services comprise 2 percent of GDP, for instance, while manufacturing makes up 20-25 percent. (What’s more, business journalist Doug Henwood, writing of the 1990s, has characterized productivity growth in the financial and services sectors as “underwhelming.”) It’s also worth recalling that the “dematerialized” economy rests on a very concrete infrastructure of computers and fiber-optic cables, the production, use, and disposal of which entail enormous amounts of coal, oil, and uranium. And besides, what is “dematerialization,” anyway? Aren’t all those cyber-blips still matter, however “insubstantial” they seem?

To be even more pointedly materialist, Taylor’s economy is unpopulated by office temps, dishwashers, cable installers, Filipino assembly-line workers who destroy their eyesight soldering computer circuits. There are no adjuncts who staff the courses that the tenured are too busy or proud to teach, no janitorial staff who dump their trashcans and vacuum their carpets, no domestic laborers who clean their kitchens, bedrooms, and toilets. There are no sweatshops, no overworked and underpaid counter help, no factory operatives or data-entry clerks without pensions or health insurance. In short, because they don’t show up in the network—except perhaps as the cost of doing business—most of the world’s population doesn’t merit Taylor’s attention.

This bone-deep solipsism, increasingly endemic to the suburban middle class, follows directly from an inability to acknowledge our humble and fragile materiality, the substance of which involves us, on this side of paradise, in painful and exploitative bonds as well as connections of felicity and flourishing. As Barbara Ehrenreich has observed, “to be cleaned up after is to achieve a certain magical weightlessness and immateriality.” Such indifference to the world without quotation marks enables palaver about the capitalist economy as an “information-processing machine” of “complex adaptation.” In the same vein, exotic pedantry about the joy of untrammeled desire conceals the coercive nature of capitalist markets and workplaces; marveling at the insubstantiality of money deflects attention from the commodification of activities once performed without pecuniary exchange; pabulum about “webs” and “processes” camouflages the mundane indignities and brutalities of class, power, and war.

Structured by the imperatives of profit and loss, this is the world of immanence in which postmodern sages would have us place our confidence—which is to say our faith—and for all his declamation against “fundamentalism,” Taylor’s confidence game seems to me one of the most credulous of faith-based initiatives. It’s a rigged and merciless game, deliverance from which requires faith in a rigorous but forgiving transcendence who cancels debts, announces jubilee, and redeems all restless desires. Taylor’s Nietzschean boosterism is but a modish apologia for the sway of power, and it deserves a resounding vote of no confidence.

Eugene McCarraher is professor of humanities at Villanova University. Next year, he will be a fellow of the American Council of Learned Societies and will complete his second book, The Enchantments of Mammon: Corporate Capitalism and the American Moral Imagination.

John Utz

Sherlock Holmes returns—again.

To a degree perhaps unmatched by any other fictional character, Sherlock Holmes continues to capture our imagination. Sir Arthur Conan Doyle’s 56 short stories and four novels featuring Holmes have never been out of print and continue to sell in countless editions. The New Annotated Sherlock Holmes, Volumes 1 and 2, released last year, distills and applies Sherlockian scholarship to the short stories with all the seriousness of an annotated Bible. (A third volume, with the novels, is forthcoming.) And this is just to mention Doyle’s original tales. Even before Doyle put down his pen in 1927, an army of Holmes imitators arose, including such execrable upstarts as Picklock Holes, Hemlock Jones, Holmlock Shears, and Sherlaw Kombs.

The most popular Baker Street spinoffs are the pastiches in which Holmes himself appears as protagonist. Many a mystery writer has tossed off a Holmes story in affectionate homage or parody or both. The latter-day vogue for the novel-length pastiche began with Nicholas Meyer’s 1974 bestseller The Seven-Per-Cent Solution, in which Holmes meets Sigmund Freud. Also notable in this vein is Laurie King’s popular series about Mary Russell, a prodigious young woman who apprentices herself to Holmes and eventually marries him, despite being forty years his junior. In their latest adventure, Locked Rooms, just published, Holmes and Russell travel to San Francisco, where they encounter Dashiell Hammett. And new entries keep coming: Mitch Cullin’s A Slight Trick of the Mind and Caleb Carr’s The Italian Secretary also appeared earlier this year.

Even given this cultural obsession with the great detective, it might come as something of a surprise that the winner of the 2001 Pulitzer Prize for fiction would write a novella with Sherlock Holmes as its protagonist. Writers of serious literature rarely risk forays into genre fiction, let alone derivative genre fiction. But in Michael Chabon’s hands, a Sherlock Holmes story becomes an opportunity for much deeper reflection—on the profound human needs that drive us to detective fiction in the first place, and the real problems that lie outside our powers of comprehension.

Chabon is a tremendously gifted writer. He combines an unapologetically lyrical prose style with acute observations and uncannily apt figures of speech. Every page he writes contains memorable turns of phrase that are at once delightfully imaginative and so deadly accurate that they seem to have been inevitable. Yet, for all his intelligence and imagination, he is not a typical literary novelist. He has made no effort to hide his love for popular arts, particularly comic books and old-fashioned “adventure” stories—indeed, he lent a hand to the screenplay for Spider-Man 2 and edited McSweeney’s Mammoth Treasury of Thrilling Tales in 2003. And the first novel he published after winning the Pulitzer was an outlandish fantasy for children, called Summerland.

The Amazing Adventures of Kavalier & Clay, the sprawling epic for which he won the Pulitzer, has comic books at its heart. In it, two young Jewish men, one from Brooklyn and the other his cousin, who has recently fled the Nazi invasion of Czechoslovakia, invent a comic book hero called the Escapist. In the days before Pearl Harbor, the Escapist wages a kind of proxy battle against the Axis powers, delivering an “immortal haymaker” to the jaw of Adolf Hitler. But for Josef Kavalier, the artist who has left his family behind in Prague, the joys of such symbolic action against the Nazis soon seem hollow:

The surge of triumph he felt when he finished a story was always fleeting, and seemed to grow briefer with every job. This time it had lasted about a minute and a half before turning to shame and frustration. The Escapist was an impossible champion, ludicrous and above all imaginary, fighting a war that could never be won.

Kavalier & Clay is about a great many things, but it is Chabon’s daring admission of the limits of art as symbolic action—in the midst of a passionate celebration of the comics and, by extension, of art more generally—that gives that book real significance. This complex gesture of simultaneous celebration and apology is repeated in Chabon’s latest work, The Final Solution: A Story of Detection. The book’s title echoes the famous Holmes story, “The Final Problem” (in which Doyle killed off his fictional creation, only to be forced by popular clamor to resurrect him nine years later), but it also clearly refers to the Third Reich’s genocidal plan of the same name. But if The Amazing Adventures of Kavalier & Clay pitted popular arts against political action, The Final Solution, though more concise, aims at something greater. Chabon pays his dues to the mystery genre, but he is also concerned with mystery itself, and the meanings that we project against it.

The book takes place in England in 1943, where we find the elderly Holmes cultivating honey bees in the bucolic Sussex countryside. His idyll is disturbed when he observes a boy walking along the railroad tracks on the perimeter of his property with an African parrot perched on his shoulder. Because Holmes fears that the boy may electrocute himself on the third rail, he races from his house to intercede, but he fails to elicit an answer from the boy. We soon learn that young Linus Steinman is a Jewish orphan recently placed with an English family, the parrot his only connection to the past. The trauma that Linus has undergone has seemingly rendered him mute, but the parrot speaks in his stead, voicing endless German lieder and, more provocatively, strings of numbers.

Chabon doesn’t adhere strictly to the classic Sherlock Holmes formula. Whereas Doyle’s tales were consistently narrated by Watson, the very model of Victorian rectitude, Chabon doesn’t allow us such stable footing. Here the setting and perspective shift from chapter to chapter, and though most of the narration favors the perspective of “the old man” (Chabon never refers to him by name, but his identity is clear), there are also four chapters told from the perspective of other characters, including one that divulges the thoughts of the parrot. Moreover, the crime isn’t presented as a straightforward puzzle for Holmes to solve. Indeed, Holmes does not agree with the local authorities on what the mystery actually is. We learn off-handedly in the third chapter that one of the characters we have just met has been “struck in the back of the head with a blunt object,” but Holmes isn’t much interested in this case. Instead, he is troubled that the boy’s parrot has gone missing in the process. He will aid the authorities “To find the boy’s parrot. … If we should encounter the actual murderer along the way, well, then it will be so much the better for you.”

Though Holmes pretends to be reluctant about playing the game one last time, he is clearly more alive when conducting this investigation than he was making honey. Though he fears dying in a meaningless or absurd manner, he is pleased by the idea of expiring in the midst of an investigation—”to amount to no more in the end than a single great organ of detection, reaching into blankness for a clue.” Chabon hints here that Holmes should be taken not merely as a representative of the mystery genre but, more abstractly, as a figure for the human need for comprehension, for answers. For what is Sherlock Holmes but an engine for the making of meaning itself, the fashioning together out of disparate bits of seemingly meaningless ephemera an answer, an end to mystery? In typically beautiful prose, Chabon proffers this description of the process:

A delicate, inexorable lattice of inferences began to assemble themselves, like a crystal, in the old man’s mind, shivering, catching the light in glints and surmises. It was the deepest pleasure life could afford, this deductive crystallization, this paroxysm of guesswork, and one that he had lived without for a terribly long time.

Over the course of his investigation, the old man enjoys such pleasures several times, and in the end order has been restored. But the great detective is left troubled by some key questions that go unanswered:

One might conclude … that meaning dwelled solely in the mind of the analyst. That it was the insoluble problems—the false leads and cold cases—that reflected the true nature of things. That all the apparent significance and pattern had no more intrinsic sense than the chatter of an African gray parrot.

The question, of course, is whether there is an intrinsic sense to that chatter, and if so, what does it signify? Although The Final Solution involves a murder, and the uncanny boy and his parrot hold our attention until the final page, the real mystery that underlies the book is less about machinations of plot than the limits of human understanding, and our deeply held desire that our efforts at comprehension will win the day against meaninglessness. In the end, Chabon has crafted a gem that satisfies on many levels, embedding within a conventional mystery story a profound reminder of the depths to which mystery can reach.

John Utz teaches literature and writing at Duke Divinity School.

Karl W. Giberson

The science of love.

Science is often at odds with common sense. In fact, some would read the history of science as the steady retreat of commonly held misperceptions about the world in the face of controversial but ultimately compelling scientific explanations. Did not the moving earth have to displace the commonsense stationary earth? A bizarre quantum physics replaced the intuitive classical physics; relativistic time and space replaced their everyday counterparts; and so on. Albert Einstein was once challenged by a critic, upset that his theories flew in the face of common sense. The great scientist was dismissive: “Common sense is a body of prejudice laid down in the mind prior to the age of eighteen.”

There is, to be sure, some truth in this simple picture of an uninformed common sense steadily retreating in the face of scientific advance. But the reality is much more complex, and there are some interesting counterexamples. I suspect that the current enthusiasm for multiple universes will eventually wane, with opinion returning to the traditional commonsense view; likewise, the genetic determinism of some scientists will give way to the old-fashioned idea that parenting, friendships, and life experiences are critically important.

But the most striking counterexample to the simplistic picture of “science trumping common sense” would have to be the early 20th-century conviction that physical affection, human contact, and love were irrelevant to infants. For a rather long period of time, the psychology of early childhood went completely off the rails and ran at right angles to common-sense notions of childrearing.

Alas, this particular departure from common sense was not so benign as Galileo’s discussion about the motion of the earth. Far from it. This misunderstanding resulted in the death of tens of thousands of children, victims of a profound confusion about the nature and importance of love. Unknown to the science of the time was a central “mystery” that is still being unraveled—namely, that little children need lots of love. They need to be held, hugged, kissed; they need someone to play peek-a-boo with them and swing them in a circle. There is something in these natural, primitive activities that strengthens little children in mysterious ways, making their immune system more robust, giving them the strength to fight off childhood illnesses.

Beloved, let us love one another. For love is of God, and everyone that loveth is born of God, and knoweth God. He that loveth not, knoweth not God, for God is love. 1 John 4:7,8

The data supporting this are both horrible and incontrovertible. Consider the Hospital of the Innocents in Florence. In two decades in the middle of the 18th century, this orphanage took in more than fifteen thousand babies. Ten thousand of them died before they reached their first birthday. Nineteenth-century America witnessed similar tragedies. More than half of the unhappy orphans assigned to an institution in Buffalo between 1862 and 1875 died before the age of one.

Convinced that the deaths were the result of infections spread by touch, the homes developed sophisticated procedures to reduce the chances that the babies would get germs of the sort that might be spread by hugging, rocking, or that most ghastly and irresponsible act of germ warfare—kissing. One hospital devised a special box with inlet sleeves that would allow an attendant to interact with the child—change a diaper, for example—without actually touching the child. Similar boxes are used today by technicians who handle dangerous chemicals.

The sterile environments recommended for medical reasons, which must surely have horrified some of the caregivers, fit nicely with the prevailing wisdom in psychology. In the early 20th century, the president of the American Psychological Association, John B. Watson—famous as the founder of behaviorism—warned of the “Dangers of Too Much Mother Love,” insisting that responsible parents refrain from kissing and hugging their children, lest they become emotionally needy or—horrors—get germs. Watson’s bestseller on raising children was praised by everyone from Bertrand Russell to Parents Magazine.

But still the children kept dying, germs or no germs.

We know that we have passed from death to life, because we love. 1 John 3:14

The dark world of child psychology was deeply and clearly in need of a revelation. And like another revelation about love 2,000 years ago, this one was heralded by a voice crying in the wilderness. The lone voice calling American psychology to repentance was that of Harry Harlow, an eccentric psychologist who spent most of his controversial career at the University of Wisconsin.

Harlow’s story is told with elegance and passion by Deborah Blum in Love at Goon Park. It is a tale of love or, more accurately, the absence of love and the tragic consequences that ensue when love does not flow naturally and freely into the nooks and crannies that Mother Nature has provided for this most basic of human emotions.

Love at Goon Park chronicles the exposure of this shocking and demoralizing state of affairs as it slowly gave way to our modern celebration of parental love. Credit for this overdue scientific revolution goes largely to Harlow, whose work appears in just about every introductory psychology text. You may recall the touching photos of a baby monkey clinging to an artificial cloth-covered “mother.” Harlow’s highly original experiments on baby primates revealed an unimaginably profound need for love—a love that could only be communicated by touch. Forced to choose between a cloth mother that felt “maternal” and a wire mother with a supply of milk, baby monkeys always chose the former, abandoning her only momentarily to feed.

Harlow fought an entrenched establishment led by luminaries like John Watson. He needed powerful weapons to dislodge the near-universal scholarly consensus that parental love for children should be checked. Harlow’s intellectual weaponry came in the form of highly illuminating experiments on primates. What happens when a baby is raised with no love? What happens when a baby is raised in total isolation? What happens when comforting sources of love are removed? And so on.

There is no fear in love. But perfect love drives out fear. 1 John 4:18

Harlow led psychology away from the paradigm of clinical sterility that had (mis)guided a century of research into child-rearing. Given the tragic state of children in “scientifically informed” institutions, there can be no doubt that a great many lives were saved by the work of Harlow and his colleagues. The life-saving revelations came with a price: Harlow’s primate subjects were treated with extreme cruelty—not gratuitously, but by the very design of his experiments. Was it worth it? Read Love at Goon Park and decide for yourself.

There is, however, a much deeper question here than Harlow’s experimental procedures. How was it that something as natural and commonsensical as the importance of love for children could be so thoroughly misunderstood by the scientific community? Picture a sophisticated, well-educated, high-society mother listening to classical music while her baby cries in the next room. An expensive table lamp illuminates the pages of the book she is reading—a parenting book warning against the dangers of giving her baby too much attention. She chides herself for the primordial instinct that tells her to go to her child, pick him up, and offer some comfort against the terrors of the night. Juxtapose this image with that of an illiterate rural farmer’s wife comforting her newborn at her breast. She is completely ignorant of the scholarly consensus that her actions will ultimately undermine her child’s development. It does not occur to her that her actions require any thought, for she is simply doing what comes most naturally. She is doing what every mother would do, unless instructed by science to do otherwise.

There are lots of “natural” behaviors, of course, and certainly no case to be made that indulging tendencies simply because they come “naturally” is a good idea. Middle-aged college professors should not be encouraged to go chasing after attractive freshman coeds just because it seems “natural.” But the profound love that parents have for children, a love that almost always requires sacrificial and altruistic behavior to put into practice, is precisely the kind of love that has consistently been promoted, celebrated, even demanded by Christianity. God became incarnate because he loved the world; God, as revealed in Jesus, is a God of love; Jesus commanded his followers to love; Christians are to be known by their love; and so on. Never mind how far short we fall and how often we fail to love as we should; we all know that Christianity calls us to embrace a profound, all-encompassing love—a love of the sort that, when received by infants and baby monkeys, literally gives life.

Science does not always advance by boldly going where nobody has gone before. Science sometimes advances by finally getting to where everybody has already been.

And now abide faith, hope, love, these three; but the greatest of these is love. 1 Corinthians 13:13

Karl W. Giberson is editor-in-chief of Science & Spirit magazine, editor of Science & Theology News, and professor of physics at Eastern Nazarene College.

Copyright © 2005 by the author or Christianity Today/Books & Culture magazine. Click here for reprint information on Books & Culture.


Donald A. Yerxa

Taking the long view.


My parents’ library was not exactly well stocked, unless sentimental devotional literature was your fancy. But below the Funk & Wagnalls encyclopedia (purchased in weekly installments at the grocery store) was a thick book that captivated me as a young boy: H.G. Wells’ The Outline of History. There it all was: the history of everything, so it seemed. Learn that, and you’ve mastered history!


Maps of Time: An Introduction to Big History

David Christian (Author), William H. McNeill (Foreword)

University of California Press

664 pages

$84.69

Universal history, the grand project of a single overarching story of the past, appealed both to my curiosity and to an untutored conviction that the past had to be a story full of meaning. As I became better acquainted with the methods and practices of academic historians, I realized that my childhood fascination with universal history was at best naïve. Not only does universal history lie beyond the grasp of the historian’s method, it is also beyond the historian’s appointed task. Because the past has no shape other than what we impose on it, I learned that no reputable historian should presume to write such a speculative story. Still, I would on occasion pause to peruse the impressive multi-volume works by Toynbee and the Durants and privately marvel at the boldness of their vision; could it be that universal history was not such an intellectually bankrupt enterprise as I had been taught to believe?

In recent years some of the impulses behind universal history seem to be undergoing rehabilitation. In part in response to globalization, historians are scrambling to find the conceptual tools (units of investigation, periodization schemes, etc.) to make better sense of humanity’s past. This is an extraordinary intellectual development still very much in its initial stages. What is already clear is that historians and scholars from other disciplines, particularly the natural sciences, are asking questions that force us to examine the past in very large chunks.

Indeed, what makes this venture more respectable than the old universal history is an increasingly sophisticated set of tools borrowed from the sciences. One of the most fruitful uses of the new scientific evidence is in the exploration of how climate and environmental factors have influenced history. As was the case with universal history, for decades historians were suspicious of anything that smacked of environmental determinism, especially since early practitioners of this approach—viz., Ellsworth Huntington, whose Civilization and Climate (1924) created a hierarchy of civilizations based upon climatic advantage or disadvantage—came dangerously close to racist conclusions. But many of these suspicions melted away with the enormous success of Jared Diamond’s Pulitzer Prize-winning Guns, Germs, and Steel: The Fates of Human Societies (1997) and his current bestseller Collapse: How Societies Choose to Fail or Succeed.

Also notable in establishing the new universal history were Brian Fagan’s Floods, Famines, and Emperors: El Niño and the Fate of Civilizations (1996) and The Little Ice Age: How Climate Made History, 1300-1850 (2000). Now Fagan has completed a “climate and history” trilogy with his most ambitious offering to date: The Long Summer. Drawing from an impressive array of studies (from standard archaeological finds and tree-ring analyses to deep ice-core samples, the study of pollen grains (palynology), and amazing forensic techniques that one might expect to see on an episode of CSI), Fagan assesses the impact of various climatic shifts on the long sweep of history.

Careful at the outset not to overstate his case, Fagan cautions against a simplistic view that climate drove history. But he makes a convincing argument that climate change has been “a major historical player,” and certainly not a benign one. For example, Fagan explains how the sudden shift of the Gulf Stream forced people in the ancient Near East to switch from hunting and gathering to early forms of agriculture; how the implosion of the Laurentide ice sheet in Canada c. 6200 bc triggered a rapid rise in the world’s oceans that separated England from the continent; and how another rise in the waters of the Mediterranean around 5600 bc led to a catastrophic flood that transformed the Euxine Lake into the Black Sea in only two years. Fagan’s scientific toolkit serves him well when he recounts the interplay between climate and prehistoric peoples and early civilizations. His chapters on the early Mesopotamians, Hittites, Egyptians, and Mayans are riveting. Fagan is less convincing, however, when he reaches the more familiar historical terrain of the Romans. Here he seems to exaggerate the impact of climate and undervalue the complex roles played by leaders and cultural forces—always a danger when dealing with the past from such a lofty vantage point. To be fair, this book is primarily about prehistory, and Fagan’s treatment of Rome is almost a postscript. Still, we may wonder whether, if our knowledge of those earlier societies were more extensive, we might not find his accounts of them equally unsatisfactory.

But Fagan’s bold attempt to use climate to understand history seems downright timid compared to the project of “Big History,” the attempt to offer nothing less than a grand unified story of natural and human history. In Maps of Time, Australian historian David Christian, now based in San Diego, has given us a state-of-the-art instance of Big History, a work of breathtaking synthesis wherein Fagan’s prehistory is but a blink of the eye—well, actually, a couple of chapters. It’s all here: big bang cosmology, the formation and drift of galaxies, the origins of the Earth, the origins and evolution of life and the biosphere, human evolution, prehistory, the emergence of agriculture, settled communities, agrarian civilizations, global networks of exchange, the birth of the modern world, and the “great acceleration” of the 20th century.

Christian has done his homework. He freely draws from the best and brightest science writers, economists, sociologists, and world historians. A very truncated list includes such luminaries as Stephen Hawking, Steven Weinberg, Paul Davies, Lynn Margulis, Lee Smolin, E.O. Wilson, Freeman Dyson, Stephen Jay Gould, Ernst Mayr, William and J.R. McNeill, Alfred Crosby, Jared Diamond, Robert Wright, Eric Wolf, Anthony Giddens, Joel Mokyr, Daniel Headrick, Charles Tilly, Geoffrey Parker, and, of course, Brian Fagan.

This catalogue of topics and sources does not do justice to the sophistication of the argument in Maps of Time. For example, Christian’s chapter “Globalization, Commercialization, and Innovation” is not only a brilliant synthesis of the period from 1000 to 1750; it also contains the best treatment of Europe’s distinctive role in the modern world I have ever encountered. His discussion of the changing topography of global exchanges, whereby Western Europe moved from the margins to the hub of exchanges within the Afro-Eurasian world zone, is especially helpful.

Yet such insights do not satisfy Christian’s ambitions. In a brief appendix, he reflects on the “endless waltz of chaos and complexity” at the very core of Big History. Drawing from the theoretical work of Ilya Prigogine and Isabelle Stengers, he offers a thermodynamic explanation for why similar patterns operate at multiple scales. None other than William McNeill, the doyen of world history, hails this as “the supreme achievement” of the book. Indeed, in his foreword, McNeill calls Maps of Time an “intellectual masterpiece,” likening it to the breakthroughs of Newton and Darwin. Christian’s account of the rise of complexity (increased order) in an overall entropic cosmos is clearly motivated in part by a desire to steer clear of religious explanations for the emergence of complex and durable patterns. And with all due respect for the judgment of William McNeill, we may want to wait a few decades before anointing the next Newton.

What is gained when we attempt to render the past on such expansive canvases? For Fagan the payoff is didactic. By taking a very long view—and it’s precisely here that he’s most persuasive—he is able to argue that civilizations arose during a remarkably long summer, one of the longest periods of relatively stable climate on record. We have no idea when this summer will end. But we do know that the greater the complexity of human societies, the more vulnerable they become to climatic events. They cannot swing with the climatic punches. While the developed world has gained a measure of security from short-term events, it would be foolish to suppose that we are now immune from disastrous climatic changes. It is not just drought and famine that should concern us. Fagan reminds us how vulnerable the crowded coastlines of the world, where millions live and work, are to changes in sea level brought about by climatic shifts.

For Christian the rewards are theoretical and even border on the religious. In the same appendix that delighted McNeill, Christian provides an intriguing discussion of how the concept of emergent properties, borrowed from complexity theory, can be useful for historians using a very wide lens. Emergent properties are features or rules that emerge at one level of complexity but are not present at others. For example, one could say that at one level all humans—indeed, all living organisms—are molecular beings, but certainly chemistry cannot come close to exhausting the complexity of human cognition or interaction. What all this suggests is that history on a very large scale is more than the sum of any number of local histories. This has significant implications, as one of Christian’s colleagues, Marnie Hughes-Warrington, has noted recently. The hierarchical notion that large-scale history is simply derivative of small-scale histories is called into question, as is the presumption that the latter has more methodological rigor and greater access to historical meaning. I would add that an emergentist approach to historical inquiry would imply that methods and explanations appropriate at the smaller scales may well be overly reductionistic and unsatisfactory when history is done at larger scales.

Tellingly, Christian admits that in Maps of Time he is doing much more than synthesizing a lot of scholarly work. He is, in fact, composing what amounts to a modern, scientific creation myth. In doing so, he consciously addresses a fundamental need of humans to raise and offer answers to the big questions. The academic disciplines have failed in this regard, offering at best fragmented accounts of reality. By carving up the intellectual world into separate disciplines we have made it all but impossible to offer a unified account for why things came to be the way they are. Worse still, the studied avoidance of the big questions results in an academic culture wherein many historians find the meaning of the past in things like a 16th-century miller’s testimony before the Inquisition, a massacre of cats in Paris in the late 1730s, or the journal of a Maine midwife in the late 18th and early 19th centuries. Creative, even brilliant, as these historical excursions are, they testify to how deflated the notion of meaning becomes when we ask too little of the past.1

And this brings us back to that discredited notion of universal history. With the appearance of books like The Long Summer and especially Maps of Time, it is fair to ask a number of questions: Should universal history still make us blush with embarrassment? Is it really too pretentious for historians to raise the big questions? Or is the pretension primarily a function of the answers one gives? Does history as it is presently practiced exhaust what we can know about the past? What are the limits of historical inquiry?2 If humans have a basic need to render the chaos of the past—yes, maybe even all of it—into some sort of coherence, who should do the heavy lifting? Philosophers? Science writers? Theologians? Historians? This is not to argue for the enduring value of what Wells, Toynbee, or the Durants wrote. Far from it. But perhaps Louis O. Mink got it right when he claimed in 1978 that “the concept of universal history has not been abandoned at all, only the concept of universal historiography.” The big questions never do go away.

Donald A. Yerxa is professor of history at Eastern Nazarene College and editor of Historically Speaking: The Bulletin of the Historical Society.

1. I am indebted to Bruce Mazlish, who made a similar observation in the service of a very different argument in a 1999 review essay, “Big Questions? Big History?”, History and Theory, Vol. 38, No. 2 (May 1999), pp. 232–48.

2. For a provocative examination of this question, see Constantin Fasolt, The Limits of History (Univ. of Chicago Press, 2003).

Copyright © 2005 by the author or Christianity Today/Books & Culture magazine. Click here for reprint information on Books & Culture.


Robert L. Millet

The memoir of a disaffected Mormon.


In Leaving the Saints, Martha Beck, popular “life coach” and author of a regular column in Oprah Winfrey’s monthly magazine, recounts her disillusionment with the Church of Jesus Christ of Latter-day Saints, in which she was raised. Her memoir commences with her journey back to Utah (after receiving undergraduate and graduate degrees at Harvard), where Beck and her husband John reunite with family and friends and become employed at Brigham Young University, and concludes with their painful exodus from BYU and the Church some five years later. Latter-day Saints and those somewhat familiar with the faith who choose to read the book will cringe, roll their eyes, and even chuckle, while those who are thoroughly unacquainted with the faith will cringe, roll their eyes, and chuckle. But for different reasons. Beck is a fine writer who blends her eloquent prose with a nifty wit. So if one is not terribly concerned with what really took place, this book is a good read: it would make for great fiction.

Beck seeks to equate weird anomalies in Mormon culture with the norm. For example, the “niceness” of Mormon folk is really only the “top layer” of the LDS lifestyle; Latter-day Saints are robotic and Pollyannaish; they “make the Trapp Family Singers look like Hell’s Angels.” LDS women believe that making cakes from scratch will lead to a higher reward hereafter than using a mix. When not baking cakes, they keep “grinding away at the one occupation recommended for Mormon females: breeding well in captivity.” All Mormons agree with the current president of the Church that mothers should not work under any circumstances. And their menfolk? “Men [at BYU] must … wear socks, on the premise that the hair on human ankles can be thought of as an extension of pubic hair.” Mormon speakers often get weepy when talking about “stockpiling ammunition for the Apocalypse.” This is plain ole nonsense.

How did Beck arrive at this point? She tells us. During a period of excessive frustration over what she perceived as stifled academic freedom at BYU, she retired to the university library to read all she could find on Sonia Johnson, the former Mormon who had fought the Church so vigorously over the Equal Rights Amendment during the 1970s. To Beck’s utter amazement, there was nothing there. All of the articles, essays, and press releases on Johnson were nowhere to be found; they had all been carefully and mysteriously removed from the library, Soviet-style. This is ludicrous. Imagine the futile effort required to do such a thing. Imagine what a story the discovery of such a cover-up would have made. Thankfully, the stolen goods have been returned since then; the browser can read of Sonia Johnson to his heart’s content!

Further, Beck speaks of a sneaky group of sleuths innocuously known as the “Strengthening the Membership Committee” but in reality—so Beck says—”a squad of investigators who work for the Church. Very hush-hush. A lot of ex-CIA guys.” This is paranoia at its best (or worst). The Strengthening Church Members Committee is actually made up of a small group of general Church authorities who evaluate polemical materials against the Church and find ways to reach out to the disaffected. But Beck seems to be a magnet for improbable happenings. Consider her account of a time when she decided to have her hair cut short. A good Mormon woman, she notes, always has long, curled hair until middle age. Hence, she relates, “The stylist checked my left hand for a wedding ring, then reported my request to the owner of the salon, who asked me to call my husband to ascertain that I had his permission to change my hairstyle.” Get out of town! I’ve never heard of anything like that in 57 years of Church membership.

Memoirs by disaffected members of this or that religious group appear regularly, generally to little notice. But apart from her platform, which guarantees an audience, and her superior skills as a writer, there’s another reason Beck’s memoir has received far more attention than the typical product of the deconversion genre. Her father, Hugh Nibley, is known throughout Mormondom as perhaps the most significant LDS apologist of the 20th century and one of our finest social commentators. In Leaving the Saints, Beck accuses her father of academic fraud and sexual abuse. Beck claims to have learned from a strange man in a tweed sport coat (called “Tweedy,” a kind of Mormon Deep Throat) that her father’s books are a hoax and that a good 90 percent of his footnotes are totally made up. The problem for Beck, of course, is that the books are still in print, still available for examination. If they weren’t checked properly thirty years ago, they can be checked today. Further, I know personally many if not all of the source checkers; they are outstanding academics from such BYU departments as Ancient Scripture, Asian and Near Eastern Languages, Law, the Library, English, and Classics.

But at the very heart of this book is Beck’s effort, in a Provo motel room, to confront her father about a kind of Egyptian ritual abuse (which she claims took place between ages five and eight) and elicit from him an admission of the heinous deed, to allow him to rid himself of the guilt before he dies. (Hugh Nibley did, by the way, pass away in late February of 2005.) Beck artfully weaves this uncomfortable scene of confrontation throughout the narrative.

How trustworthy a narrator is she? About herself Beck appears to be candid to a fault. She speaks of bouts of anorexia, thoughts of suicide, and insomnia mixed with nightmares. Yet one matter was completely left out of the book—a terribly important omission. Contrary to the impression she seeks to leave with the reader, Martha and John Beck did not leave BYU and the Church merely because of the “purge” of campus dissidents and a spirit of paranoia pervading the institution. They left because they both chose to come out of the closet as practicing homosexuals, which lifestyle is in violation of the BYU Honor Code and the teachings of the Church. Beck has since chosen to strike out against a Church that maintains a moral standard with which she obviously disagrees. As Hugh Nibley observed to his daughter in a conversation recorded in this book, so often “people leave [the Church], but they can’t leave it alone. Always attacking, always lashing out, because you can’t get away from the fact that it’s the Lord’s work.”

On February 22, 2005, Beck’s siblings issued a statement that said in part:

We are saddened by the book’s countless errors, falsehoods, contradictions, and gross distortions. … Martha’s most egregious accusation—that our father molested her over several years and the family covered up the crime—is not true. While salacious accusations sell books, the reader should know that in this case it simply did not happen. These allegations dishonor real abuse survivors who lose credibility and suffer increased anguish when false accusations are exposed. … Intellectual honesty is a fundamental value of the Nibley family, and sadly we do not see that tradition reflected in Leaving the Saints.

As one of my colleagues recently pointed out, there are only so many options when it comes to evaluating Beck’s claims of abuse at the hands of her father: (1) it happened, just as she said it did; (2) she was sexually abused, but by someone else; (3) she was not abused, but believes she was; and (4) she was not abused, and knows she was not. I have served for many years in a pastoral capacity, and so I am prone to come to full attention when a person claims to be a victim of vile or perverse activity. I am not one to make light of such accusations, and in many cases the truth can never be determined with certainty. In this instance, however, given the statement of Beck’s own family, the massive misrepresentations throughout the book, and the noble character of Nibley, I choose to disbelieve the accusations of abuse.

Hugh Nibley’s funeral was held on March 2, 2005, in the Provo Tabernacle. The messages by friends and Church leaders lauded a beloved champion of the faith. What touched me most deeply, however, were the remarks of Hugh’s children, seven of whom were present (Beck did not attend). Each one of them paid moving tribute to “Daddy.”

According to Peggy Fletcher Stack of the Salt Lake Tribune, Beck “felt her father’s presence for two hours Thursday morning [the day he died]. ‘He was so beautiful, full of love and joy,’” Beck said. And then came this fascinating remark: “‘I hope I can live the rest of my life to honor his memory, as paradoxical as that seems.’” Paradox is seeming contradiction. Beck’s book is certainly more than a seeming contradiction. It is a slap in the face of one of Mormonism’s greatest intellectuals and yet another roadblock to a religious tradition seeking to be better understood in a world that is desperately in need of understanding.

Robert L. Millet is Richard L. Evans Professor of Religious Understanding at Brigham Young University and the author of A Different Jesus? The Christ of the Latter-day Saints, just published by Eerdmans. Readers may contact robert_millet@byu.edu for a more extensive version of this review.

Copyright © 2005 by the author or Christianity Today/Books & Culture magazine. Click here for reprint information on Books & Culture.


Andrew P. Morriss

On misdiagnosing the problem.


Economics is the science of understanding choice in conditions of scarcity. We have scarce resources available to us (even Bill Gates, for whom time rather than money is likely the most constraining factor, experiences scarcity) and must decide how to spend those resources. Should I work more to earn money for a vacation at the beach or spread my leisure over the course of the year? Should the country buy guns or butter? Should we invest more in the search for the perfect mate or marry our current sweetheart? Economics has given us powerful insights into all these specific questions and many more. We know that incentives matter, that higher prices mean lower demand and greater supply, and that markets are the most effective means of allocating goods and services to their highest valued uses.


The Paradox of Choice: Why More Is Less

Barry Schwartz (Author)

Harper Perennial

306 pages

$13.59


Preference, Belief, and Similarity: Selected Writings (Mit Press)

Amos Tversky (Author), Eldar Shafir (Editor)

MIT Press

1040 pages

$65.00

Despite these successes, economic reasoning generally and markets in particular have been under attack for centuries for getting choices “wrong.” An early wave of criticism, only now receding, centered on markets’ inability to get prices “right.” Religious critics argued (and some continue to do so) that market prices varied from the “just price” and that excessive interest charges constituted usury. Karl Marx attacked the market economy for its extraction of “surplus value” from workers. Communists in the early 20th century attempted to substitute central planning for market-determined outcomes; some attempted to do so by substituting administratively set prices for market prices to “correct” the price mechanism. Even in predominantly market economies such as the United States, regulators have “adjusted” regulated prices to accomplish various ends: regulated utility prices, for example, traditionally included extensive subsidies for favored groups, paid for by higher charges for the less favored. Green critics of markets today seek to adjust prices of commodities such as oil to include “social” costs. For example, the green Left sees markets as undercharging consumers for the cost of their purchases—my consumption of a gallon of gasoline imposes costs on the rest of society that I do not pay, leading me to consume “too much” gasoline. Although the green critique is that prices are too low, while the red critique is that workers are paid too small a share of the price, these criticisms share a common theme: markets do not work because the prices of goods and labor are “wrong.”

The problem with this type of critique is the lack of a consistent alternative to replace market prices. Calculating a “just” price turns out to be extraordinarily difficult. If the price is below market, then producers will make too little to satisfy demand, and if the price is above market, producers will produce more than demanded. The result is either rationing by means other than price or warehouses full of unwanted goods. Today, a similar problem plagues the green Left’s attempts to calculate “social prices” for commodities. There is simply no non-arbitrary manner of calculating prices other than making use of the market. Of course not everyone is convinced, and one can still find plenty of advocates of just prices, surplus value, and social pricing. Nonetheless, price-based criticisms of market economies are no longer real threats to the dominance of markets.
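The shortage-and-surplus arithmetic here is easy to make concrete. What follows is a minimal sketch, assuming invented linear demand and supply curves (nothing below comes from the books under review): an administered price beneath the market-clearing level leaves buyers short, while one above it leaves goods unsold.

    # A minimal numeric sketch of the shortage/surplus logic described above.
    # The linear demand and supply curves are illustrative assumptions,
    # not anything drawn from the books under review.

    def quantity_demanded(price):
        return max(0, 100 - price)  # assumed demand curve: Qd = 100 - p

    def quantity_supplied(price):
        return max(0, price)        # assumed supply curve: Qs = p

    for price in (30, 50, 70):      # below-market, market-clearing, above-market
        qd, qs = quantity_demanded(price), quantity_supplied(price)
        if qd > qs:
            print(f"price {price}: shortage of {qd - qs} (rationing by queue or favor)")
        elif qs > qd:
            print(f"price {price}: surplus of {qs - qd} (warehouses of unwanted goods)")
        else:
            print(f"price {price}: market clears at quantity {qd}")

Under these assumed curves, a cap at 30 leaves buyers wanting 70 units when only 30 are produced, and a floor at 70 calls forth 70 units when only 30 are wanted; only the market price of 50 clears.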

As a result of the failure of price-based critiques of market economics, and of the resounding success of market-based economies relative to the competition once the collapse of the Soviet empire revealed the truth about socialist economies, critics of markets have had to develop a new set of objections. Rather than focusing on a divergence between some hypothesized “true” price and observed prices, and thus a process flaw in the market mechanism, modern market critiques focus on humans as market actors, postulating flaws in human nature that prevent markets from functioning efficiently. (To distinguish them from the price critiques, let’s call these the “psychology-based criticisms.”) Since the observation that human beings lack perfect reasoning is so obviously true as to be trite, the psychology-based critiques of market actors are much more powerful than the process criticism of the price critics. The two recent books reviewed here make strong cases for these psychology-based criticisms.

The first argument rests on the claim that markets provide too many choices and that making all those choices causes us psychological pain. In The Paradox of Choice, Barry Schwartz, a psychology professor at Swarthmore College, relates an experience familiar to many of us: he set out for the mall one day to buy jeans and found himself enmeshed in a lengthy process of choosing among a bewildering array of options. Did he want “slim fit, easy fit, relaxed fit, baggy, or extra baggy? … stonewashed, acid-washed, or distressed? … button-fly or zipper-fly? … faded or regular?” The problem with this array of choices, which could in theory allow him to get a pair of jeans that closely matched his style, color, and fit preferences, was that he now had to invest in making the decision. “Before these options were available, a buyer like myself had to settle for an imperfect fit, but at least purchasing jeans was a five-minute affair. Now it was a complex decision in which I was forced to invest time, energy, and no small amount of self-doubt, anxiety, and dread.”

At this point, many of the readers of this article will be thinking what I thought when I first read this account: Thank God I am not a psychology professor at Swarthmore! The ability to buy jeans without experiencing even a small amount of self-doubt, anxiety, or dread, something I had taken for granted before reading this book, now appears to be a major accomplishment. Nonetheless, Schwartz does set out a compelling case, built on psychological research and experiments, for the possibility that increased choice makes decisions more complex and so more difficult. As a result, Schwartz argues that people are better off at times with restricted choice.

Consider the following example, used by Schwartz. Until recently, American universities offered comparatively few curricular choices to students: “a largely fixed course of study, with a principal goal of educating people in their ethical and civic traditions. Education was not just about learning a discipline—it was a way of raising citizens with common values and aspirations.” Today, in contrast, colleges offer an extensive array of courses (he notes Princeton has more than 350 that satisfy its general education requirement alone; the much smaller Swarthmore has about 120). There certainly is little common learning going on among either Princeton or Swarthmore students, spread out among these disparate offerings. While recognizing the benefits of the expanded choices offered, Schwartz also argues that the intellectual freedom comes at a price: “Now students are required to make choices about education that may affect them for the rest of their lives. And they are forced to make these choices at a point in their intellectual development when they may lack the resources to make them intelligently.”

Choices do have consequences, and to the extent we are offered meaningful choices we will not only reap the benefits but also bear the costs of our choices. Schwartz identifies some more subtle problems with expanded choice: with more options, the information costs of decision making go up; the psychological costs of commitment rise; and the disappointment of not having the best increases. (Schwartz does a fine job of explaining the underlying psychological literature on which he bases his case for restricting choice, making it understandable for a lay audience without leaving out important details.)

The power in Schwartz’s critique is the resonance it has with the feeling of despair we may experience, at least briefly, when stepping into a massive shopping center and the overwhelming array of varieties of the item we seek turns a five-minute visit into an hour of comparison shopping. But is the problem really “too much” choice being generated by the market? In the case of jeans, it is easy to make fun of the vast array of options available if one is a fashion-impaired professor like myself and, presumably, Schwartz. For some consumers, such as my 16- and 12-year-old daughters, the expansion of jeans-options is a great benefit. The 16-year-old wears only boot-cut, traditional Wranglers—a negative fashion statement that clearly states her preference for the company of horses and her self-image as a cowgirl. My 12-year-old favors jeans with embroidered butterflies, fashionable colors, and accessory belts. The 16-year-old cares only about finding the appropriate size. The 12-year-old enjoys the hunt for the perfect pair of jeans. I find it remarkable that both can satisfy their preferences for the type of jeans and shopping experience. One buys hers from Western wear catalogues, the other from the mall. Each gets exactly what she seeks.

Many of the examples Schwartz gives fall into this category—choices he doesn’t value and so finds not worth the effort to make. It may be that he is correct, that expending effort on choosing the right pair of jeans is a waste of intellect. Yet it is also a chance for people to learn about making choices in an environment in which the wrong choice carries few consequences. The act of choosing, so painful to Schwartz, may bring joy to another with different preferences. Moreover, the market provides the means for those who dislike clothing shopping to avoid choices. Many clothing stores provide salespeople who know the stock and fashion trends. I know from experience that confessing ignorance to such a salesperson will enable me to quickly accumulate a reasonable wardrobe. Thus the market can solve the problem of choice while still providing the power of choice to those who seek it in a particular area. Once we recognize the diversity of human desires that prompts the provision of both boot-cut and embroidered jeans, the market’s responses to those desires are more understandable.

The second set of psychological criticisms comes from the work of Amos Tversky, accurately described by the introduction to Preference, Belief, and Similarity—a massive volume of his selected writings—as “a towering figure in the field of cognitive psychology and in the decision sciences.” He would surely have shared the Nobel Prize in economics won by his frequent collaborator Daniel Kahneman in 2002 had he not died of cancer in 1996. (The Royal Swedish Academy of Sciences does not award posthumous prizes.) Tversky carefully and thoroughly documented numerous instances in which people make recurrent and systematic errors in judgment, and offered a theoretical framework (known as prospect theory) to explain why such errors exist. This collection brings together papers spanning his career, which amply display both Tversky’s insights and his skill as a writer. Although the papers are technical ones, they are among the most accessible such writing I have ever encountered, and most readers will be able to understand the important points even if some of the details of the underlying experiments or psychological theories are obscure.

To give but one example, Tversky and Kahneman’s 1974 paper “Judgment under Uncertainty: Heuristics and Biases” examines a series of common mistakes in reasoning. Consider the following description of an individual: “Steve is very shy and withdrawn, invariably helpful, but with little interest in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.” When asked to rank a list of professions (e.g., librarian, farmer, airline pilot) by the probability that Steve belongs to each, people regularly rank “librarian” above “farmer” because they associate the description of Steve’s personality with librarians. But there are so many more farmers than librarians (at least there were in 1974) that the correct ordering is farmer first, librarian second. Because information about the base rate of librarians and farmers does not affect our stereotypes of personalities, people overestimate the likelihood that Steve is a librarian. Tversky and Kahneman ran several clever variations on this experiment and found a general problem in reasoning: people ignore base-rate information and make decisions based on representativeness. Even worse, when subjects were given descriptions that conveyed no relevant information at all, they made similar errors.
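The base-rate arithmetic deserves to be spelled out, because it is exactly what intuition skips. Here is a minimal sketch, assuming invented counts and probabilities (Tversky and Kahneman report no such figures; every number below is a hypothetical chosen for clarity), of how a much larger population of farmers swamps even a strong resemblance to librarians.

    # Hypothetical Bayes-style calculation for the "Steve" problem.
    # All counts and probabilities below are illustrative assumptions.

    librarians = 200_000            # assumed number of librarians
    farmers = 3_000_000             # assumed number of farmers (a far larger base rate)

    p_desc_given_librarian = 0.50   # assumed: half of librarians fit the description
    p_desc_given_farmer = 0.10      # assumed: only 1 in 10 farmers fits it

    # Expected number of people fitting the description in each group.
    fitting_librarians = librarians * p_desc_given_librarian   # 100,000
    fitting_farmers = farmers * p_desc_given_farmer            # 300,000

    # Though the description "sounds like" a librarian, the sheer number
    # of farmers means Steve is three times as likely to be a farmer.
    p_librarian = fitting_librarians / (fitting_librarians + fitting_farmers)
    print(f"P(librarian | description) = {p_librarian:.2f}")   # prints 0.25

Under these assumptions the description is five times likelier to fit a given librarian than a given farmer, yet the posterior odds still favor farmer three to one; judging by representativeness alone reverses the correct ordering.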

The more than 1,000 pages of articles reprinted in Preference, Belief, and Similarity document numerous examples of flaws in human reasoning that conclusively establish that people are remarkably bad decision makers under a startlingly wide range of circumstances. (Indeed, those looking for a catalog of tricks to use at cocktail parties can generate a good-sized list while reading this book.) How we decide is influenced, Tversky found, by how the problem to be decided is framed for us.

This insight can be used as the basis for a powerful critique of market economies. If the actors in the marketplace are systematically making mistakes in judgment, then how can the market outcome derived from their decisions be “optimal,” “efficient,” “just,” etc.? Tversky’s critique is powerful because, like Schwartz’s, it taps into a deep sense of unease. Before us is spread a vast array of choices on everything from retirement investments to the type of spread to put on our morning toast (butter? regular margarine? what about trans fats? the new cholesterol-lowering spreads?). How can we be certain we’ve made the right choice? Some mistakes are unimportant, but when even the choice of what to spread on our toast affects our long-term health, some level of serious consideration seems warranted. It would be so much easier if someone else would make the choice simpler for us.

I think one reason both Schwartz’s and Tversky’s critiques strike a chord is our guilt over the material abundance that our vast number of choices reveal. We have many choices because we are rich; even the poorest American today lives a life of unimaginable luxury compared to Americans in the 19th century. Diseases are vanquished, fresh fruit and vegetables are available year round, uncountable books are available for free at public libraries, and clean water pours from taps. Can it be right to have such luxury when, as our grandparents regularly remind us, they walked uphill both ways three miles in the snow to fetch water every day? Or, more seriously, when so many in the world lack food and medicine, should we have the choice of fifty styles of jeans?

It is right, and we should. We have this abundance of material goods and choices in part because we live in a society that has made some crucial choices itself. Our ancestors chose a regime of property rights and rule of law that led to material abundance beyond their wildest dreams. With that abundance we have our own choices to make: should we buy those $100 jeans or drop the money in the mission box at church? We are fortunate to have the choice but, to paraphrase Spider-Man in last summer’s blockbuster film, “with many choices comes great responsibility.” We are ultimately responsible for our choices and, as flawed human beings, we (or at least I) often make bad choices.

Schwartz’s and Tversky’s accounts are attractive to many in our society because they are completely secular critiques of the problem of choice. There are too many choices, and we feel bad because it is too hard to make so many. Schwartz’s prescription is that we should make fewer choices and be happier in those we do make, not that we take responsibility for the choices we make. Tversky relocates the blame to unconscious decision rules. (Although these papers do not address many policy prescriptions, others have relied on Tversky’s work to justify a variety of interventionist policies.) Market forces and psychological characteristics beyond our control are to blame for offering us too many choices and limiting our capacity to make good choices.

This secularism is ultimately the major flaw in the psychological critiques of the market. God gave humans free will to allow us to make choices as a moral matter. Because of our fallen nature, we often make bad choices with that freedom. To pin the blame on the market or on our psychological makeup is to blur our responsibility for our choices and to attempt to wriggle free from accountability for those choices. In contrast, Christianity asks us to make choices according to some relatively straightforward but difficult-to-live-up-to principles. With the vast array of easy, bad choices before us, that’s a tough thing to do. But we can’t evade that responsibility as easily as these accounts would lead us to believe.

Just as the price critiques of the market ultimately foundered on their inability to provide a plausible account of what prices should be, the psychological critiques founder on their inability to offer a convincing alternative account of human nature. Economics succeeds at explaining much of human behavior in making choices because of the power of its simple and straightforward assumptions about human nature (e.g., when the price of something goes down, people want more of it). Enriching the economic model with the insights of psychological research, as Schwartz and Tversky have undoubtedly done in their careers, offers the potential to extend that explanatory power to previously inexplicable cases. The temptation to embed our own preferences for the good into others’ lives—while we “fix” those choices that the enriched theory predicts people will get “wrong”—is almost overwhelming. It must be avoided if we take individual freedom seriously. Just as we cannot know the “just” price, we cannot know the “just” number of varieties of jeans (or anything else) that ought to be on sale. What we can know through resort to prayer, study, and meditation on our religious obligations is how to make our choices when next we walk into the mall to buy jeans. Focusing on getting our own choices right, something Schwartz’s and Tversky’s insights can help with, is surely going to be difficult enough that we can leave others’ choices to others.

Andrew P. Morriss is Galen J. Roush Professor of Business Law and Regulation at Case Western Reserve University School of Law and Senior Associate, PERC—The Property and Environment Research Center.

Copyright © 2005 by the author or Christianity Today/Books & Culture magazine. Click here for reprint information on Books & Culture.


Mark Packer

Collective memory personalized.


Jewish-born artists flourished in the United States throughout the 20th century. A survey of their work reveals a rich and diverse range of styles and attitudes toward Jewish themes, from unconscious identifications with, to explicit dissociations from, Judaism and Jewish culture. This has engendered an intriguing puzzle that has haunted art historians for close to a hundred years: Is there such a thing as “Jewish art”? If so, what are its defining characteristics?


Raphael Soyer and the Search for Modern Jewish Art

Samantha Baskind (Author)

The University of North Carolina Press

280 pages

$10.72

By midcentury, the number of Jewish-born artists attracting market and critical attention had proliferated. Some were figurative painters. Others made their reputations as abstract expressionists. A few warmly avowed their Jewish heritage, while many feared rejection if they did so too explicitly. To complicate matters further, by century’s end a new generation of Jewish-born painters had come to maturity meditating on the Holocaust. In comparison with their immediate predecessors, their work reflects far less anxiety about reconciling Judaism with the demands of a secular society, and more with making sense of a mechanized world where murder can instantly reach epic proportions.

It would seem, then, that with each new generation and every new movement and style, any hope of resolving the enigma about Jewish art grows fainter. Yet, the scholarly mission to define Jewish art continues. Several new books address these questions with a freshness of presentation that merits close attention.

Fixing the World: Jewish American Painters in the Twentieth Century, by Ori Z. Soltes, is fairly brief but beautifully illustrated with color plates that trace the full scope of Jewish painting throughout the 20th century, especially in New York. The sweep of Soltes’ study is nothing less than epic. Beginning with 19th-century Europe, he examines paintings by Moritz Daniel Oppenheim and Camille Pissarro before turning his attention to 20th-century New York. Soltes presents a lavish series of plates that includes work by artists many of whom you probably never knew were Jewish: Philip Guston, Morris Louis, and Larry Rivers, for example. His central premise is that Jewish art is motivated by the ancient religious injunction to fix the world—in Hebrew, tikkun olam. What holds these disparate stylists together in a single, identifiable category, Soltes argues, is their passionate concern for bringing justice to a fallen world.

Soltes is right on at least one point. Social commentary, even pungent critique, is often found in the work of Jewish artists. But he is not entirely convincing when arguing that tikkun olam is necessary or sufficient for identifying art as Jewish. Beginning in the mid-19th century, for example, many non-Jewish painters, such as Courbet and Daumier, assumed the role of social commentator, calling attention to the hypocrisy and injustice they perceived in the contemporary social order. On the other hand, there is barely an intimation of tikkun olam in paintings by some of the Jewish artists included in Soltes’ book, such as Modigliani, Samuel Halpert, and Louis Lozowick. Soltes’ emphasis on tikkun olam as the defining theme of Jewish art thus strains credulity at times. The chief merit of his study consists rather in its panoramic and well-illustrated survey, and the aesthetic insight he brings to individual paintings when he describes their purely visual elements. At least as he handles the subject, the Jewishness of the artists included in this volume appears to be little more than an incidental fact about them.

A far more comprehensive response to the question about Jewish art may be found in Masterworks of the Jewish Museum. That there exists something called Jewish art is left beyond doubt by the introductory essays and arrangement of material from the museum’s extraordinary collection. In response to the question, “What, then, are the defining characteristics of Jewish art?” the text provides a pluralistic answer through magnificent reproductions and insightful essays that explore four thousand years of religious, historical, cultural, stylistic, and political themes. The richness and diversity of works included in the book represent only a very small sample of the 28,000 individual pieces in the museum’s permanent collection. This contextual perspective alone should serve as an advisory to scholars who attempt a more simplified answer to questions about what constitutes Jewish art, especially those who concentrate primarily on New York.

This is a potential danger facing Samantha Baskind, author of Raphael Soyer and the Search for Modern Jewish Art. Baskind focuses on just one painter from the 20th century, using very few color plates and only a handful of black-and-white illustrations of his work. An artist less well known to the general public, Raphael Soyer (1899-1987) arrived in America with his family at age ten. Throughout his life, he remained profoundly self-conscious and insecure about his situation: a Russian-born, heavily accented Jewish immigrant living in New York when anti-Semitism was still an active part of the local culture. In comments he made throughout his career, Soyer overtly disavowed the Jewishness of his art, preferring to identify himself instead as “a New York painter.” It was not until 1970, when he agreed to illustrate a collection of stories by Isaac Bashevis Singer, that Soyer consciously took up Jewish themes in his work. By then, he was 71 years old. Nevertheless, Baskind argues that Soyer’s art was explicitly Jewish throughout his life, his disclaimers on this point notwithstanding.

It is well to beware of art historians who audaciously claim to understand the work of their subjects better than these artists understood themselves. Baskind thus assumes a formidable burden of proof when she states that she will not take Soyer at his word when he claims that his art is not Jewish. But much to her credit, she does avoid hubris by paying meticulous attention to the evolution of Soyer’s style—which reflects his personal struggle, endured over an entire lifetime, with what it actually means to be Jewish.

The intellectual problem that art historians encounter when trying to define Jewish art is a difficulty for Jewish artists too when they bring the spiritual dimensions of their souls to artistic expression. Jewish art is so profoundly hard to define because there is no obvious or logical sense to being Jewish in the first place. Is someone a Jew as a matter of birth, even if that individual does not understand herself to be a Jew? Is someone Jewish because another identifies him as such, whether it is a Jew or anti-Semite handing out the labels? If one does recognize him- or herself to be Jewish, what constitutes the identification? Is it religious? Cultural? A sense of belonging to a Jewish community, or political kinship with Israel? And how are such modes of identity to be expressed aesthetically? Must tikkun olam play an essential part? How about the Holocaust, or Jewish religious observance and history?

To the best of my knowledge, no one has yet been able to answer these questions. But perhaps the resistance of the problem to aesthetic, philosophical or historical solution is part of the mystery, and thus the very sense, of what it means to be Jewish. Jews are the people who have to live and struggle with the question of what it means to be Jewish. To non-Jews who take up the problem, this is an intellectual puzzle at best or a political question at worst. But the personal and existential struggle with the meaning of the question, like Jacob wrestling with the mysterious stranger who eventually names him Israel, is part of what it is to have a Jewish identity at all.

Baskind understands this dimension of the problem as it manifests itself in the work of Raphael Soyer. This is an issue the artist wrestled with throughout his life, sometimes most poignantly in the manner in which he chose to avoid it. If it took seventy years for him to make a genuine discovery of his Jewishness, then Soyer was clearly ahead of many of us. As demanding as her burden of proof is, Baskind does her subject justice when she surveys Soyer’s conflicts with questions about his Jewish identity, and shows how the magnificent art he bequeathed to us as the legacy of this struggle was itself a contribution to tikkun olam.

In the end, though, I must return to Masterworks of the Jewish Museum as my aesthetic and historical compass. New York in the 20th century certainly posed unique challenges and availed extraordinary opportunities to artists of Jewish birth. But what about the preceding 39 centuries, and other Jewish communities scattered across the globe? What are they? Chopped liver?

To reduce the entirety of Jewish art to that produced in 20th-century New York is to miss a major point. Demanding that a single criterion be satisfied to identify art as Jewish in the first place misses a bigger point still. As Masterworks makes clear, whatever else it may be, Jewish art is collective memory personalized, consisting of diverse struggles with and meditations on the existential significance of Jewish identity, tragedy, joy, and transcendence. Four thousand years of such art cannot be reduced to a single definition or category of meaning. By itself, the collection of the Jewish Museum testifies magnificently to this fact.

Mark Packer teaches in the Departments of Art History and Philosophy at the University of South Carolina Upstate in Spartanburg.

Copyright © 2005 by the author or Christianity Today/Books & Culture magazine. Click here for reprint information on Books & Culture.

