Friday, February 1, 2008

How to Talk About Books You Haven't Read



Carrying this book around recently I’ve caught more than a little flak, not least from my kids, who once thought of me as a literary intellectual, or at the very least as a guy who espoused the virtues of reading. Hey, really, I told them — as well as my wife and the guy sitting next to me on the subway — no kidding, it’s a serious book, written by a professor of literature who’s also a psychoanalyst. A French professor/shrink, no less, who’s written books on Proust, Maupassant, Balzac, Laclos and Stendhal, among other canonical heavyweights. So lay off.

It seems hard to believe that a book called “How to Talk About Books You Haven’t Read” would hit the best-seller lists in France, where books are still regarded as sacred objects and the writer occupies a social position somewhere between the priest and the rock star. The ostensible anti-intellectualism of the title seems more Anglo-Saxon than Gallic, an impression reinforced by the epigraph from Oscar Wilde: “I never read a book I must review; it prejudices you so.”

Bayard’s critique of reading involves practical and theoretical as well as social considerations, and at times it seems like a tongue-in-cheek example of reader-response criticism, which emphasizes the reader’s role in creating meaning. He wants to show us how much we lie about the way we read, to ourselves as well as to others, and to assuage our guilt about the way we actually read and talk about books. “I know few areas of private life, with the exception of finance and sex, in which it’s as difficult to obtain accurate information,” he writes. There are many ways of relating to books that are not acknowledged in educated company, including skimming, skipping, forgetting and glancing at covers.

Bayard’s hero in this enterprise is the librarian in Robert Musil’s “The Man Without Qualities” (a book I seem to recall having read halfway through, and Bayard claims to have skimmed), custodian of millions of volumes in the country of Kakania. He explains to a general seeking cultural literacy his own scheme for mastery of this vast, nearly infinite realm: “If you want to know how I know about every book here, I can tell you! Because I never read any of them.” If he were to get caught up in the particulars of a few books, the librarian implies, he would lose sight of the bigger picture, which is the relation of the books to one another — the system we call cultural literacy, which forms our collective library. “As cultivated people know,” Bayard tells us, “culture is above all a matter of orientation. Being cultivated is a matter of not having read any book in particular, but of being able to find your bearings within books as a system, which requires you to know that they form a system and to be able to locate each element in relation to the others.”

Musil’s librarian is a purist, but a perusal of the reviews in this and other publications would probably yield, if only we had the proper instruments, many less extreme examples of literate nonreading. Book reviewers generally imply that they have read the entire oeuvre of the author under discussion, as well as those of his peers, and I have no doubt they will continue to do so. You’d think Nicholson Baker’s “U and I” (a short book I read in its entirety), in which the younger novelist writes a kind of critique of John Updike based on his admittedly fragmentary and incomplete reading, would have cured us of the omniscient stance in book reviewing. But I don’t see many phrases like “From what I’ve read about ‘Moby-Dick’ ...” or “the part of ‘Finnegans Wake’ that I tried to read ...” in the review pages. Bayard, though, regards such disclaimers as understood. He doesn’t blame us for fudging, and he doesn’t want us to blame ourselves.

He proposes, and employs, a new set of scholarly abbreviations to go along with op. cit. and ibid.: UB: book unknown to me; SB: book I have skimmed; HB: book I have heard about; and FB: book I have forgotten.

For Bayard, who is well served by Jeffrey Mehlman’s fluid and elegant translation, skimming and sampling are two of the most common forms of reading behavior, particularly with regard to Proust. Paul Valéry, in his funerary tribute in La Nouvelle Revue Française, makes a virtue out of his admittedly sketchy knowledge of Proust by claiming: “The interest of the book lies in each fragment. We can open the book wherever we choose.” Bayard defends skimming as a mode of reading. “The fertility of this mode of discovery markedly unsettles the difference between reading and nonreading, or even the idea of reading at all. ... It appears that most often, at least for the books that are central to our particular culture, our behavior inhabits some intermediate territory, to the point that it becomes difficult to judge whether we have read them or not.”

Lest the reader, or the nonreader, think that Bayard underestimates the power of reading, he proposes that we are all essentially literary constructs, defined by our own inner libraries: the books we’ve read, skimmed and heard about. “We are the sum of these accumulated books,” he writes. (And make no mistake about it, this prof is far more literate and widely read than he pretends to be.)

After anatomizing the different types of nonreading, Bayard addresses the social implications in a section called “Literary Confrontations.” I commend his advice for meeting an author and being forced to say something about his or her new book: “Praise it without going into detail.”

The funniest section in the book describes the encounter between the anthropologist Laura Bohannan and an African tribe, the Tiv, among whom she has been living. She tries to read “Hamlet” to them in the hopes of demonstrating the universality of the story, but the way in which the tribe rejects those parts of the tale that don’t square with their own cultural traditions — they don’t believe in ghosts, for instance — renders the attempt ludicrous.

Bayard proposes the term “inner book” to designate “the set of mythic representations, be they collective or individual, that come between the reader and any new piece of writing, shaping his reading without his realizing it.” This notion coincides with Stanley Fish’s concept of “interpretive communities” of readers, although Bayard’s own inner book may be more indebted to home-team text destabilizers like Derrida and Lacan. Indeed, Bayard sounds more French in the later pages as he employs phrases like “consensual space” and dissolves the boundaries and false oppositions between reader and writer and book into one big sloppy pool of écriture.

To what end? Bayard finally reveals his diabolical intent: he claims that talking about books you haven’t read is “an authentic creative activity.” As a teacher of literature, he seems to believe that his ultimate goal is to encourage creativity. “All education,” he writes, “should strive to help those receiving it to gain enough freedom in relation to works of art to themselves become writers and artists.”

It’s a charming but ultimately terrifying prospect — a world full of writers and artists. In Bayard’s nonreading utopia the printing press would never have been invented, let alone penicillin or the MacBook.

I seriously doubt that pretending to have read this book will boost your creativity. On the other hand, reading it may remind you why you love reading.

Wednesday, December 12, 2007

I Am America (And So Can You!)




Books are for pantywaists. Or at least that's how "Stephen Colbert," the excitable commentator played to rock-star perfection by Stephen Colbert, viewed them before he became a published author. Now comes the flip-flop, as Mr. Colbert brings the gale-force power of his promotional talents to the hawking of "I Am America (And So Can You!)," a booklike object with his face plastered on its cover. Books are still for pantywaists, but now they're for souvenir-seeking denizens of what is modestly called the Colbert Nation.

The fans are primed because the energy level of Mr. Colbert's television show is soaring. "The Colbert Report" -- with a title that's eponymous, the way Mr. Colbert prefers everything -- currently beams with irrational exuberance. The show is sharp and innovative in ways that could have followed it to the coffee table, but that hasn't happened. The full-monty Colbert television brilliance doesn't quite make it to the page.


"I Am America (And So Can You!)" certainly has its moments. ("You Can't Hurry Love -- but you can certainly take the shortcut. Instead of paging through Match.com, try flipping through the family photo album.") They expand upon the Colbert persona, that of a self-loving loudmouth perched on the famous fine line between stupid and clever. The book is divided into chapters on big topics ("The Family," "Religion," "The Media," "Race") and stresses the exclusive Colbert pedigree of its thoughts on each of them. "You won't find these opinions in any textbook," he says, "unless it happens to be one I've defaced."

"America (the Book)," the "Daily Show" spinoff that is the prototype for "I Am America," was also the collective effort of television staff writers trying to replicate their on-the-air style. But it was neither inspired by nor tethered to a single stellar character. That gave it room to maneuver through a wide range of subjects, as well as a gleeful, anything-goes spirit of adventure. The narrower "I Am America" sticks to ravings suitable for a mock Colbert memoir and further limits its range by avoiding explicit talk of government or politics -- though it culminates in a reprint of Mr. Colbert's blistering political speech delivered at the 2006 White House Correspondents' Dinner.

"I Am America" describes "heroes" as "people who did not skip ahead" to that speech "but read the book from start to finish as intended." Heroism aside, to experience the speech in print is to understand what "I Am America" is missing.

Mr. Colbert and his staff write for a particular character with impeccable, deadpan delivery, and there is no book-worthy equivalent of what happens when the real McCoy gets near a microphone. The printed speech falls surprisingly flat. Neither this chapter nor the rest of "I Am America" is helped by little red annotations in the margins, though these, too, mimic a tactic that happens to be funny on TV.

Still, the sharp-elbowed Mr. Colbert will deservedly work his way toward the top of best-seller lists, no matter what he has to do to current competitors like Alan Greenspan, Ann Coulter, Oprah Winfrey, Eric Clapton or Mother Teresa. His book may not replicate a winning formula, but it's certainly a valentine to his proven success. Its tone is typically dictatorial (this, to him, means a person whose book is dictated), as when it warns readers that "no image of me should ever be removed from this book for any purpose, including, but not exclusively: book reports, decorating walls, or placing in your wallet to imply our friendship." Not for nothing does this book's reproduction of Leonardo da Vinci's Vitruvian Man include Colbert eyeglasses and enlarged testicles as bonus features.


Among the funnier sections is the "Higher Education" chapter. It includes what purports to be Mr. Colbert's college application essay, featuring ripe malapropisms, overuse of a thesaurus ("the apex, pinnacle, acme, vertex, and zenith of my life's experience") and the lying claim that his great-great-uncle's name is on a building at Dartmouth. There are also fake course selections with student annotations, among them "Ethnic Stereotypes and the Humor of Cruelty" ("A professor will tell you a bunch of hilarious jokes, and you're not allowed to laugh") and "Dance for Men." ("Go ahead. Break your mother's heart.") Heterosexuality that protests too much is a big part of the official Colbert attitude.

A science glossary is another high point. (On cloning: "No free labor source is worth all of this trouble.") And it well suits Mr. Colbert's opposition to all forms of progress. (The smallpox vaccine "may have saved a few thousand lives, but it also destroyed the magic amulet industry.")

The "Sex and Dating" chapter also heavily emphasizes science, since Mr. Colbert is in some ways the Tom Lehrer of his day. Mr. Lehrer's sharp satire and erudite academic stunts, like his classic musical rendition of the periodic table, are forerunners of Mr. Colbert's subversive whiz-kid humor. "I often think back fondly on the memories I haven't repressed," the book says in this sneaky spirit.

When it refers to the American family as "a Mom married to a Pop and raising 2.3 rambunctious scamps" or to a cat named Professor Snugglepuss, "I Am America" gets lazy. The same goes for a sophomoric crack about why books are scary: "You can't spell 'Book' without 'Boo!'" And this book is capable of better wit than: "Now I'm not the smartest knife in the spoon." But it doesn't take the smartest knife in the spoon to understand the point of this undertaking. If "I Am America (And So Can You!)" had nothing but its title, its Colbert cover portrait and 230 blank pages instead of printed ones, it would make a cherished keepsake just the same.

Thursday, April 12, 2007

Evolution for Everyone: How Darwin's Theory Can Change the Way We Think about Our Lives

Sociable Darwinism
Just as in the classic clashes of nature, where every mutational upgrade in a carnivore’s strength or cunning is soon countered by a speedier or more paranoid model of antelope, so the pitched struggle between evolutionary theory and its deniers has yielded a bristling diversity of ploys and counterploys. The heavy-handed biblical literalism of creationist science evolves into the feints and curlicues of intelligent design, and the casual dismissiveness with which scientists long regarded the anti-evolutionists gives way to a belated awareness that, gee, the public doesn’t seem to realize how fatuous the other side is, and maybe it’s time to combat the creationist phylum head on. And so, over the last few years, scientists have unleashed a blitzkrieg of books in defense of Darwinism, summarizing the Everest of supportive evidence for evolutionary theory, filleting the arguments of the naysayers or reciting, yet again, the story of Charles Darwin, depressive naturalist extraordinaire, whose increasingly pervasive avuncular profile has lofted him to logo status on a par with Einstein and the Nike swoosh.

David Sloan Wilson, an evolutionary biologist at Binghamton University, takes a different and decidedly refreshing approach. Rather than catalog its successes, denounce its detractors or in any way present evolutionary theory as the province of expert tacticians like himself, Wilson invites readers inside and shows them how Darwinism is done, and at lesson’s end urges us to go ahead, feel free to try it at home. The result is a sprightly, absorbing and charmingly earnest book that manages a minor miracle, the near-complete emulsifying of science and the “real world,” ingredients too often kept stubbornly, senselessly apart. Only when Wilson seeks to add religion to the mix, and to show what natural, happy symbionts evolutionary biology and religious faith can be, does he begin to sound like a corporate motivational speaker or a political candidate glad-handing the crowd.

In Wilson’s view, Darwin’s theory of evolution by natural selection has the beauty of being both simple and profound. Unlike quantum mechanics or the general theory of relativity, the basic concepts behind evolutionary theory are easy to grasp; and once grasped, he argues, they can be broadly applied to better understand ourselves and the world — the world both as it is and as it might be, with the right bit of well-informed coaxing. Wilson has long been interested in the evolution of cooperative and altruistic behavior, and much of the book is devoted to the premise that “goodness can evolve, at least when the appropriate conditions are met.” As he sees it, all of life is characterized by a “cosmic” struggle between good and evil, the high-strung terms we apply to behaviors that are either cooperative or selfish, civic or anomic. The constant give-and-take between me versus we extends down to the tiniest and most primal elements of life. Short biochemical sequences may want to replicate themselves ad infinitum, their neighboring sequences be damned; yet genes get together under the aegis of cells and reproduce in orderly fashion as genomes, as collectives of sequences, setting aside some of their immediate selfish urges for the sake of long-term genomic survival. Cells further collude as organs, and organs pool their talents and become bodies. The conflict between being well behaved, being good, not gulping down more than your share, and being selfish enough to get your fair share, “is eternal and encompasses virtually all species on earth,” he writes, and it likely occurs on any other planet that supports life, too, “because it is predicted at such a fundamental level by evolutionary theory.” How do higher patterns of cooperative behavior emerge from aggregates of small, selfish units? With carrots, sticks and ceaseless surveillance. 
In the human body, for example, nascent tumor cells arise on a shockingly regular basis, each determined to replicate without bound; again and again, immune cells attack the malignancies, destroying the outlaw cells and themselves in the process. The larger body survives to breed, and hence spawn a legacy far sturdier than any tumor mass could manage.

As with our bodies, so with our behaviors. Wilson explores the many fascinating ways in which humans are the consummate group-thinking, team-playing animal. The way we point things out to one another, for example, is unique among primates. “Apes raised with people learn to point for things that they want but never point to call the attention of their human caretakers to objects of mutual interest,” Wilson writes, “something that human infants start doing around their first birthday.” The eyes of other apes are dark across their entire span and thus are hard to follow, but the contrast between the white sclera and colored iris of the human eye makes it difficult for people to conceal the direction in which they are looking. In the interdependent, egalitarian context of the tribe, the ancestral human setting, Wilson says, “it becomes advantageous for members of the team to share information, turning the eyes into organs of communication in addition to organs of vision.” Humans are equipped with all the dispositional tools needed to establish and maintain order in the commons. Studies have revealed a deep capacity for empathy; a willingness to trust others and become instant best friends; and an equally strong urge to punish cheaters, to exact revenge against those who buck group rules for private gain.

Of course, even as humans bond together in groups and behave with impressive civility toward their neighbors, they are capable of treating those outside the group with ruthless savagery. Wilson is not naïve, and he recognizes the ease with which humans fall into an us-versus-them mind-set. Yet he is a self-described optimist, and he believes that the golden circles of we-ness, the conditions that encourage entities at every stratum of life to stop competing and instead pool their labors into a communally acting mega-entity, can be expanded outward like ripples on a pond until they encompass all of us — that the entire human race can evolve the culturally primed if not genetically settled incentive to see our futures for what they are, inexorably linked on the lone blue planet we share.

Toward the end of the book he offers a series of evolutionarily informed suggestions on how we might help widen the geometry of good will, beginning with the italicized, boldface pronouncement that “we are not fated by our genes to engage in violent conflict.” Our bloody past does not foretell an inevitably bloody future, and violent behaviors that make grim sense in one context can become maladaptive in another. “The Vikings of Iceland were among the fiercest people on earth, and now they are the most peaceful,” he observes. “In principle, it is possible to completely eliminate violent conflict by eliminating its preferred ‘habitat.’ ” For their universal appeal and basal power to harmonize a crowd, he recommends more music and dancing and asks, “Could we establish world peace if everyone at the United Nations showed up in leotards?” He also believes that the world’s religions should be tapped for their “wisdom.” This is a fine idea in the abstract, but given current events and the fissuring of the world along so many theo-sectarian lines, I wish we could forgo the sermon and just strike up the band.

Sunday, February 4, 2007

A triumvirate for our time: 'The President, the Pope and the Prime Minister: Three Who Changed the World'

The basic moral drive of Western civilization in the twentieth century has been appeasement. The civilization that has come to define what modernity actually is spends a lot of its time apologizing for its sins and accommodating the grievances of complainants. The West—and these days particularly America and the English-speaking parts of it—is where the rest of the world looks for cures for disease, charitable help, technical innovation, and the creation of respect for human rights. One might expect some recognition for such benefits, but everywhere we find complaints about our overbearing ways. What we seem to have done is to teach other cultures how to make capital out of grievances against us they never knew they had because they had previously taken such events as the small change of a violent world. We in the West are scourged for our involvement in slavery, for example, but get no gratitude (nor even recognition) for the fact that we alone abolished it. Again, much of socialist collectivism is the abandonment of Western economic dynamism in favor of allowing electorates to vote themselves rich with other people’s wealth. The ultimate paradox is that much of this demand for appeasement comes from within Western civilization itself. What we might usefully call the academico-media complex is loud in its demands that whites should apologize to blacks, Christians to Muslims, the English to the Irish, imperial powers to those they colonized, men to women, and so on. No doubt many regrettable actions were performed by members of these abstract classes, as is true of all human groups, but only among Europeans and Americans do we find this remarkable propensity to beat one’s own breast.

Appeasement is the attempt to pacify someone aggrieved, and it usually requires concessions to the aggrieved. If the grievance is genuine, then appeasement might plausibly be thought what any moral agent ought to do, though there is a strong case for following the muscular advice of the Victorian Benjamin Jowett: “never apologize, never explain.” To behave better is always best of all. Appeasement got its bad name as a political vice in the 1930s, when a set of politicians, not cowardly exactly but not very smart either, thought that apologizing to the Germans for the Versailles Treaty and handing over to them chunks of other people’s territory might pacify Hitler. When in 1956 Anthony Eden in Britain came a cropper in refusing to appease Nasser, the popular view that it is always better to negotiate than to stick to your guns became part of conventional wisdom.

Appeasement became the standard Western response to the world, a moral sickness all the more sinister because it came dressed up as a form of moral generosity. It particularly dominated Anglospheric politics in the 1970s. At this time, a set of rulers who had been elected on radical free market policies were thrown off course by the oil crisis, and they turned into high-taxing redistributionist Keynesians who thought they could manage economies. None indeed was as disastrous as Edward Heath in Britain (who sold out British sovereignty to a European bureaucracy), but Nixon and Carter in America, Muldoon in New Zealand, and Fraser in Australia all left a lot of trouble behind them. The conventional wisdom explained to them that governments should subsidize the economically incompetent, that there was no option but to accept that half the world should be tyrannized by vicious Communist regimes and that the overriding imperative of policy should be helping those in less fortunate circumstances, whatever the reasons might have been that had got them into that state.

It is against this background that we should understand John O’Sullivan’s book on Reagan, Thatcher, and Pope John Paul II, a lucid and brilliant account of the 1980s when, perhaps briefly, the appeasement tendency of our civilization was checked by three politicians with a firm grip on the realities of the human condition. Since the heritage of these figures has, in the case of British politics at least, fallen into confusion, and the truths to which they held firm seem less clear today, understanding their achievement is at the heart of political wisdom, and it could hardly be done more effectively than O’Sullivan has done it.

The sheer improbability of these three getting to the top of their various trees is itself a lesson in the importance of personality in history; it almost seems providential, and hints of such a view may be detected in O’Sullivan’s narrative. John Paul was a Pole, the first ever to occupy the papacy; Thatcher was a woman in a largely male club, and Reagan an aging Hollywood actor, much mocked as a lazy political simpleton. They all won against long odds, and indeed they won partly because they opposed the conventional wisdom of their time. Keynes and containment were looking pretty stale as inflation and disorder started getting out of hand.

John Paul was an orthodox Catholic in a Church in which a kind of Ostpolitik had developed, a policy recommending accommodation with rather than resistance against entrenched Communist power. Even liberation theology was coming to seem a necessary adjustment of Catholicism to modern times. Reagan was a conservative in a political tradition long thought to be basically liberal, while Thatcher became an advocate of the free market where clever economists all knew that an economy was an instrument of political policy.

The success of these three conviction politicians was thus a tribute to the power of moral honesty against a feeble pragmatism. All of them took risks, and all of them were victims of assassination attempts. Their capacity for rising above such difficulties, their “grace under pressure,” merely dramatized their special qualities. And the result of their courage and honesty was the collapse of the “evil empire,” and the creation in Britain and the United States (and in other countries that followed their lead) of the conditions necessary for economies that could deal with inflation and unemployment.

O’Sullivan has the advantage of writing from the inside, being a Catholic who was part of Margaret Thatcher’s policy unit, and who, in the 1990s, became editor of National Review, the political fortnightly that, though based in New York, had close and abiding ties to the Reagan White House. His inside knowledge of politics means that, like the figures in his story, he has a very clear idea of the balance between principle and pragmatism. Margaret Thatcher, for example, was determined to resist the attempt of British unions to dictate government policy, but she did not accept the challenge from Arthur Scargill’s Miners’ Union until the power stations in Britain had stocks adequate to resist a long strike. She knew, as indeed all O’Sullivan’s characters did, when to temper principle with discretion.

The story he has to tell is, of course, a terrific one. From the drama of assassinations to the subtleties of diplomatic subterfuge, he brings out very clearly the dizzying set of problems at different levels of abstraction with which a contemporary ruler must simultaneously juggle. And this means that he is acutely aware of one feature of politics that is almost unknown to the average news reader: namely the centrality of off-stage problems in dealing with the matter in hand. Thus Reagan can hold his ground at the Reykjavik summit not only because he knows what he is doing, but also because his position in the polls back home is secure. He did not need a diplomatic triumph.

The world-historical event at the center of O’Sullivan’s story is the collapse of the Soviet Union. The remarkable thing is that until the signs of that collapse became inescapable, most Western commentators (academics, journalists, economists—especially economists) did not believe it possible. They illustrated Machiavelli’s view that the basic illusion in politics is the belief that the way things are is the way they are always going to be.

Another reason for this remarkable obtuseness in what O’Sullivan calls “liberal opinion” flowed from an illusion even more specific than that which so impressed Machiavelli: namely, that social systems can be judged by their officially declared intentions. Communists may have been murderous thugs, but they meant well, and ought not to be confused with really bad people, such as Fascists and racists. Ultimately they believed in social justice at the end of the rainbow. Hence they couldn’t be all bad. But this is merely to be taken in by the public relations departments of Marxism and its cognates, indulging a lazy moralism of international benevolence and combining it with an uncritical realism about the status quo. Such simplicities persuaded the Galbraiths and the Samuelsons of Harvard and the bien pensants of the academico-media complex that they were tough realists above the absurd moral passions of Reagan and Thatcher. Here in the imagined sophistication of these apologists for the Soviet Union was a replay of the Emperor’s New Clothes, and I agree with O’Sullivan in thinking that such obtuseness should not be forgotten. In new versions, it is with us still.

The author’s most remarkable argument is that Reagan was actually a kind of unilateralist, with more in common with the peace movements of Europe (and the women of Greenham Common) than his image (at that time) of a war-mongering cowboy would indicate. The point of his much-derided proposal to develop the Strategic Defense Initiative was ultimately the complete removal of nuclear weapons and an end to Mutual Assured Destruction. But it was certainly true, as everyone quickly realized, that it was also a direct strike at Soviet superpower pretensions by demonstrating that the Soviets could not match American defense spending. Reagan simply abandoned détente and destroyed the basis for the Cold War between the superpowers.

O’Sullivan’s account of these events relates personal relations—how Reagan and Gorbachev responded to each other—to the complex maneuvering of diplomatic summitry. Indeed, in the later stages of his story, the tragic Gorbachev virtually joins the three principals as one who also changed the course of history. In this story, Reagan’s remarkable offer to share SDI technology with the Russians in no way mitigated the terror Soviet leaders experienced in recognizing that the game was up. In a 1980s in which millions were preoccupied with the spread of nuclear missiles across Europe in the wake of the Soviet deployment of SS-20s, personal chemistry, diplomatic daring, and calculations about how each move would play in terms of Western electorates became part of the multidimensional chess on which all our fates depended.

Recognizing how deadly was the threat of the SDI to their power—however derided it was as mere science fiction in the Western media—the Soviets put their faith in a bold strategy to get them off the hook. They appealed to Reagan’s idealism, offering an end to all nuclear weapons—as long as it was combined with a complete ban on all real testing of the SDI. It was so dramatic a move that some of Reagan’s advisers thought they should accept, and of course it was just the kind of thing to play well in Western opinion. But Reagan himself was cautious: “I’m afraid,” he remarked about Gorbachev to his team over lunch, “he’s going after SDI,” and he refused. All the goodwill built up beforehand in good personal relations could not survive that bleak moment in East-West diplomacy.

Out in the real world, however, the legitimacy of Communist power was in any case crumbling throughout Eastern Europe. The Polish case had been simmering since 1980, when Solidarity came into being, an initiative carefully nurtured by John Paul. Over time it became almost a shadow regime under Communist rule. The Soviets considered and rejected the option of invasion, but in December 1981 martial law was declared and Jaruzelski took over the country. Dissident movements were, however, becoming more confident all over Eastern Europe, and in Russia itself. A line of caretaker geriatrics succeeded each other in the Soviet Union until Gorbachev took over with a mandate to reform the system. This event, too, illustrates how central personality is to politics.

Perhaps the most charming element in O’Sullivan’s story is the Thatcher-Reagan relationship, which cannot be cheapened even if it does totter on the edge of being a romcom. Here were two figures who shared moral instincts, being both patriots and decent people, who admired each other, who spoke honestly to each other, and, while sympathizing with and helping each other politically, also never forgot that their basic responsibility was to stand for the national interests of their respective countries. O’Sullivan’s account of the Falklands War comes from the inside, and it makes for an exciting narrative. Fortunately, British and American interests ran largely parallel during these years, and the experience has given fresh life to the idea of the Anglosphere as a distinct moral and political realm in world politics.

The President, the Pope and the Prime Minister ends with a coda on the reputations of these figures, and necessarily so, because political wisdom depends on truth in politics, and subsequent judgments have varied. Reagan was travestied by journalists and academics at the time, but is now recognized for his achievements. Margaret Thatcher, by contrast, has suffered a public relations disaster, because the necessary hardships involved in turning Britain around were mythologized as a reign of greed and selfishness. This distortion ushered in the decade of New Labour politics that has undone much of her achievement. That, of course, is politics, but it is depressing to see how extensively malign orthodoxies can blot out an entire historical experience.

Tuesday, January 23, 2007

THEY CALL ME NAUGHTY LOLA: Personal Ads From The London Review of Books

For some of us, self-deprecation is the olive in the martini of romance. It’s that little something extra — a blast of salt and texture in a pool of cool velvet. The practice of demoting oneself has a counterintuitive power: it takes a truly secure person to self-flagellate. But in “They Call Me Naughty Lola,” a highly jaunty collection of personal ads from The London Review of Books, the intense self-deprecation among lovelorn Brits is less like an olive resting comfortably at the bottom of a martini glass and more like a peacock that has set itself on fire to flag down passing motorists. One ad runs, “Official greeter and face of Dalkeith Cheese Festival, 1974, seeks woman to 50 who is no stranger to failure, debt-consolidating mortgages and wool.” Another states: “Your buying me dinner doesn’t mean I’ll have sex with you. I probably will have sex with you though.” A third comes from the pen of a woman able to “start fires with the power of her premenstrual tension.”

When the benchmark for self-flagellation is set this high, several interesting things happen. Even the rare instances of self-puffery take on a dark or twisted aspect. (“Romance is dead. So is my mother. Man, 42, inherited wealth.”) The deprecation sometimes starts to cover an area larger than the self. (“I like my women the way I like my kebab. Found by surprise after a drunken night out and covered in too much tahini.”) Finally, some of the ads become self-referential and meta (“How can I follow that? Man, 47. Gives up easily. Box No. 9547.”), if not outright jokes. (“117-year-old male Norfolk Viagra bootlegger finally in the mood for a bit of young totty.”) In his introduction, David Rose, the advertising director who assembled the collection, addresses this last tendency when he writes that “for some LRB advertisers, meeting a partner is no longer even the main objective of placing a personal ad. ...They’re a frolic, a bit of whimsy. ...The silliness, in this sense, becomes a sleight of hand, a trick done with mirrors to disguise the machinery beneath the stage.”


Indeed, the tricksy nature of this collection, despite its laugh-out-loud gems, is perhaps what keeps these ads from engaging us emotionally or getting under our skin. (For a more affecting if less amusing look at a similar topic, see Sara Bader’s 2005 book on classified ads throughout America’s history, “Strange Red Cow,” which includes items like this one from an 1865 issue of The New York Herald: “J.A.R. — Sarcasm and indifference have driven me from you. I sail in next steamer for Europe. Shall I purchase tickets for two, or do you prefer to remain to wound some other loving heart? Answer quick, or all is lost. Emelie.”)


Given that loneliness and the search for love can be two of life’s most heartstring-pulling topics, you’d ideally finish a collection like this feeling slightly disquieted. But in the end, these ads are probably more effective as literary style exercises than as portraits of longing. Note, for instance, the pitch-perfect genre parody at work in “Blah, blah, whatever. Indifferent woman. Go ahead and write. Box No. 3253. Like I care.” I wanted to.

Monday, December 18, 2006

'How to Read a Novel: A User's Guide': Criticism for Beginners

So much for the death of the book. People have been predicting the demise of the hardback for over a century now — in his novel “The Time Machine” (1895), H. G. Wells imagined whole libraries turned to dust — but if book production is anything to go by, Wells was a worrywart. In the 1600s the total number of books available to a literate Englishman was about 2,000. Now, more than 2,000 are published a week, with 10,000 new novels every year. Given a 40-hour reading week, a 46-week working year and three hours per novel, you would need 163 lifetimes to read them all, John Sutherland calculates in his new book, “How to Read a Novel,” which aims to be a user-friendly guide to negotiating this morass. “Done well, a good reading is as creditable as a 10-scoring high dive,” he writes. “It is, I would maintain, almost as difficult to read a novel well as to write one well.” A dubious but nevertheless terrifying proposition, calling to mind the girl at the cocktail party in Woody Allen’s “Manhattan” who declares to the room, “I finally had an orgasm, and my doctor said it was the wrong kind.” Is Sutherland serious? We can be scored on this stuff? By whom?

Sutherland is a well-known academic, critic and scholar. He was also chairman of the 2005 Man Booker judging panel that caused an uproar by giving the award to John Banville’s novel “The Sea” — “an icy and overcontrolled exercise in coterie aestheticism,” according to a writer in The Independent, as well as “a travesty of a result from a travesty of a judging process.” It’s hard not to read this book as a riposte of sorts, although anyone expecting the low-down on the judging process, or even an impassioned defense of Banville, will be disappointed. Sutherland recounts the affair with a tone of lofty perplexity: “How can a novel, examined so dutifully and on so many fronts, be judged at the same time as utterly bad and outstandingly good?” Later, he runs over Banville, taking him to task for misreading a squash game scene in a review of Ian McEwan’s novel “Saturday.” “Banville does not understand the game of squash,” he writes, indisputably.

Would that the novel were so easily dispatched. What we have here is a biopsy of the form, from intertextuality (“it does give the reader a pleasing sense of ascendancy”) to CIP and ISBN numbers (“Do you know what those acronyms stand for? Do you care?”). This is not a bad idea, but Sutherland’s book seems a little unsure of itself, or its readership, and frequently courts the obvious. I liked his point about the first sentences of “Anna Karenina” (“All happy families are alike. All unhappy families are unhappy in their own way”) and “Pride and Prejudice” (“It is a truth universally acknowledged ...”) being neither universally acknowledged nor even true for the novels they begin. But he also points out, in his chapter on titles (“titles play an important role”), that “The Odessa File” seems a title “reminiscent less of fiction than of an MI5 top-secret report.” Er, isn’t that the point? And is it right to praise the opening sentence of “1984” (“It was a bright cold day in April, and the clocks were striking 13”) as an example of narrative suspense, on the grounds that it “will not yet divulge” the fact that Winston Smith is, in the coming pages, doomed to be captured and brainwashed by Big Brother? Is Sutherland really praising the first sentence of “1984” for not giving away the entire plot? Don’t most first sentences achieve that modest aim?

None of this would matter if Sutherland didn’t set such store by intelligence as the high bar we all have to aim for. “The ability to read a novel intelligently, I would maintain, is the mark of a mature personal culture,” he writes. The word recurs: “A clever engagement with a novel is ... one of the more noble functions of human intelligence.” And again: the key to choosing what to read is “intelligent browsing.” Does anyone go near the word “intelligent” without an armed escort these days? Until properly defined, it’s a word of use only to those in the business of spreading fear; and indeed, Sutherland’s book is curiously fretful and anxious, rising to a ringing endorsement of an actual novel only in its final pages: Sutherland loves Thackeray’s “Vanity Fair” — good for him! — but winces at an offhand comment by Alain de Botton that the novel is “the most overrated ever.” “Perhaps he has a superior critical sensibility which, more correctly than mine, judges Thackeray inferior,” Sutherland worries. “I hope not.” A curiously sweaty entreaty with which to end his book: please God, let me be the superior one!

Anyone interested in the way people really read novels ought to turn to Nick Hornby’s “Stuff I’ve Been Reading” columns for The Believer magazine: they’re a real-time, on-the-ground account of one man’s monthly battle to square the number of books he buys with the number of books he actually reads, while fighting off the competing demands of TV, kids and soccer. Cultural anxiety is a good subject for a book; but Sutherland is, perhaps, too much its creature.

Thursday, December 14, 2006

The Lost World of British Communism: Life and soul of the party


It's fashionable now to talk about British communism in the tone one might use for the Inquisition: something dreadful, but long dead. Even former communists often pass it off as a shameful indiscretion, only excused by youth and naivety.


But Raphael Samuel, one of Britain's most interesting historians, though he left the party (it was always just "the party" to those close to it) in 1956, never regretted, or felt the need to apologise for, the years he spent trying to convert his schoolfellows to communism. In the 1980s he was amused, but not ashamed, by the young man he had been, who "attempted to 'clarify' an aunt and uncle who were 'confused' about Yugoslavia and had come to doubt the Moscow trials".


These essays have been collected to mark the 10th anniversary of Samuel's death on December 9 1996, when he was 62 and at the height of his powers. (The publisher's promise on the press release that "The author is available for interview" will be hard to keep.) They describe the Communist party of the 1940s from the vantage point of the 1980s. The pressure to adopt the "how could I have been so blind?" tone was nothing like as strong then as it is now. In 1985 you could still say "I used to be a communist" or even "I'm still a communist" in almost any company. So these essays are as illuminating about the decade in which they were written as about the decade they describe. They also have interesting things to say about what, in the 1980s, one was still allowed to call "the labour movement", of which the Communist party saw itself as a key part.


Even when they have broken away, the children of communists, as Samuel was, still share ways of thinking that the rest of the world does not understand, like former Catholics. Earlier this year I found myself chairing a debate about British communism and - as it turned out - mediating between two of the panellists, Bea Campbell and David Aaronovitch, writers, former communists, and the children of communists. I did this clumsily (and made things rather worse) because the sudden conflagration caught me by surprise. I did not share their past, and did not anticipate, or even quite understand, their hostility to each other.


Samuel describes, with wonderful anecdotes and supporting detail, a top-down organisation in which ultimate authority stemmed from the Comintern - the Communist International, firmly controlled by Stalin - and in which policies and strategies filtered down through the national executive to the rank and file.

He describes the rigidity of party discipline, quoting a worried letter sent in 1926 from the party secretary in St Pancras to the London District asking for advice: "... Mrs Kingston, although she has passed party training, and is therefore a full member of the party, does not accept the materialist conception of history, and she believes that communism is founded on idealism and not on materialism ... She is trying to form a group of people who think the same." The secretary had tried to reason with Mrs Kingston, but without success.
Communists were ambassadors for the party. So they were always soberly dressed and, according to Samuel, always sober, too. I wonder if he knew that the party leader, Harry Pollitt, loved his whisky, which was brought to him in a teacup whenever the abstemious Communist MP Willie Gallacher was present; or that the first chairman, Arthur MacManus, died young of alcoholism. MacManus had the honour of being buried in the walls of the Kremlin, the scene of many evenings when he out-drank even the hard-drinking Russians.

Fast-forward to the 1980s. Samuel's insights into the state of left-wing politics under Thatcher are some of the best things in this book, all the better for having been written at the time, not glimpsed through the prism of the new millennium.


The Communist party was by then divided into two warring camps, the old class warriors whom he remembered from his childhood, and the young men and women of Marxism Today. His sympathies were mostly with the old guard. He looked back on the ideologues of his childhood, not with the newly fashionable scorn, but with respect and affection. It was Marxism Today, he says, that reproduced the party's worst characteristics - its intolerance and ideological rigidity. "For the first time in its history the party is staging something approaching a full-scale purge. It is an odd way to celebrate the advent of pluralism," he writes.

The old guard talked of "correct politics". The Young Turks talked of "political correctness" (this was before every boring right-wing columnist tried to make himself interesting by saying he was "non-PC"). When correct politics clashed with political correctness, the explosion blew away the party of Samuel's youth. But you can't know the times without knowing the party, and Samuel makes a fascinating guide to it.



  • Francis Beckett's Stalin's British Victims is published by Sutton.