NY Times: Sunday Book Review

October 4, 2015

Sherry Turkle’s ‘Reclaiming Conversation’

Sherry Turkle is a singular voice in the discourse about technology. She’s a skeptic who was once a believer, a clinical psychologist among the industry shills and the literary hand-wringers, an empiricist among the cherry-picking anecdotalists, a moderate among the extremists, a realist among the fantasists, a humanist but not a Luddite: a grown-up. She holds an endowed chair at M.I.T. and is on close collegial terms with the roboticists and affective-computing engineers who work there. Unlike Jaron Lanier, who bears the stodgy weight of being a Microsoft guy, or Evgeny Morozov, whose perspective is Belarusian, Turkle is a trusted and respected insider. As such, she serves as a kind of conscience for the tech world.

Turkle’s previous book, “Alone Together,” was a damning report on human relationships in the digital age. By observing people’s interactions with robots, and by interviewing them about their computers and phones, she charted the ways in which new technologies render older values obsolete. When we replace human caregivers with robots, or talking with texting, we begin by arguing that the replacements are “better than nothing” but end up considering them “better than anything” — cleaner, less risky, less demanding. Paralleling this shift is a growing preference for the virtual over the real. Robots don’t care about people, but Turkle’s subjects were shockingly quick to settle for the feeling of being cared for and, similarly, to prefer the sense of community that social media deliver, because it comes without the hazards and commitments of a real-world community. In her interviews, again and again, Turkle observed a deep disappointment with human beings, who are flawed and forgetful, needy and unpredictable, in ways that machines are wired not to be.

Her new book, “Reclaiming Conversation,” extends her critique, with less emphasis on robots and more on the dissatisfaction with technology reported by her recent interview subjects. She takes their dissatisfaction as a hopeful sign, and her book is straightforwardly a call to arms: Our rapturous submission to digital technology has led to an atrophying of human capacities like empathy and self-reflection, and the time has come to reassert ourselves, behave like adults and put technology in its place. As in “Alone Together,” Turkle’s argument derives its power from the breadth of her research and the acuity of her psychological insight. The people she interviews have adopted new technologies in pursuit of greater control, only to feel controlled by them. The likably idealized selves that they’ve created with social media leave their real selves all the more isolated. They communicate incessantly but are afraid of face-to-face conversations; they worry, often nostalgically, that they’re missing out on something fundamental.

Conversation is Turkle’s organizing principle because so much of what constitutes humanity is threatened when we replace it with electronic communication. Conversation presupposes solitude, for example, because it’s in solitude that we learn to think for ourselves and develop a stable sense of self, which is essential for taking other people as they are. (If we’re unable to be separated from our smartphones, Turkle says, we consume other people “in bits and pieces; it is as though we use them as spare parts to support our fragile selves.”) Through the conversational attention of parents, children acquire a sense of enduring connectedness and a habit of talking about their feelings, rather than simply acting on them. (Turkle believes that regular family conversations help “inoculate” children against bullying.) When you speak to people in person, you’re forced to recognize their full human reality, which is where empathy begins. (A recent study shows a steep decline in empathy, as measured by standard psychological tests, among college students of the smartphone generation.) And conversation carries the risk of boredom, the condition that smartphones have taught us most to fear, which is also the condition in which patience and imagination are developed.

Turkle examines every aspect of conversation — with the self in solitude, with family and friends, with teachers and romantic partners, with colleagues and clients, with the larger polity — and reports on the electronic erosion of each. Facebook, Tinder, MOOCs, compulsive texting, the tyranny of office email, and shallow online social activism all come in for paddling. But the most moving and representative section of the book concerns the demise of family conversation. According to Turkle’s young interviewees, the vicious circle works like this: “Parents give their children phones. Children can’t get their parents’ attention away from their phones, so children take refuge in their own devices. Then, parents use their children’s absorption with phones as permission to have their own phones out as much as they wish.” For Turkle, the onus lies squarely on the parents: “The most realistic way to disrupt this circle is to have parents step up to their responsibilities as mentors.” She acknowledges that this can be difficult; that parents feel afraid of falling behind their children technologically; that conversation with young children takes patience and practice; that it’s easier to demonstrate parental love by snapping lots of pictures and posting them to Facebook. But, unlike in “Alone Together,” where Turkle was content to diagnose, the tone of “Reclaiming Conversation” is therapeutic and hortatory. She calls on parents to understand what’s at stake in family conversations — “the development of trust and self-esteem,” “the capacity for empathy, friendship and intimacy” — and to recognize their own vulnerability to the enchantments of tech. “Accept your vulnerability,” she says. “Remove the temptation.”

“Reclaiming Conversation” is best appreciated as a sophisticated self-help book. It makes a compelling case that children develop better, students learn better and employees perform better when their mentors set good examples and carve out spaces for face-to-face interactions. Less compelling is Turkle’s call for collective action. She believes that we can and must design technology “that demands that we use it with greater intention.” She writes approvingly of a smartphone interface that “instead of encouraging us to stay connected as long as possible, would encourage us to disengage.” But an interface like this would threaten almost every business model in Silicon Valley, where enormous market capitalizations are predicated on keeping consumers riveted to their devices. Turkle hopes that consumer demand, which has forced the food industry to create healthier products, might eventually force the tech industry to do the same. But the analogy is imperfect. Food companies make money by selling something essential, not by placing targeted advertising in a pork chop or by mining the data that a person provides while eating it. The analogy is also politically unsettling. Since platforms that discourage engagement are less profitable, they would have to charge a premium that only affluent, well-educated consumers of the sort that shop at Whole Foods are likely to pay.

Although “Reclaiming Conversation” touches on the politics of privacy and labor-saving robots, Turkle shies from the more radical implications of her findings. When she notes that Steve Jobs forbade tablets and smartphones at the dinner table and encouraged his family to talk about books and history, or when she cites Mozart, Kafka and Picasso on the value of undistracted solitude, she’s describing the habits of highly effective people. And, indeed, the family that is doing well enough to buy and read her new book may learn to limit its exposure to technology and do even better. But what of the great mass of people too anxious or lonely to resist the lure of tech, too poor or overworked to escape the vicious circles? Matthew Crawford, in “The World Beyond Your Head,” contrasts the world of a “peon” airport lounge — saturated in advertising, filled with mesmerizing screens — with the quiet, ad-free world of a business lounge: “To engage in playful, inventive thinking, and possibly create wealth for oneself during those idle hours spent at an airport, requires silence. But other people’s minds, over in the peon lounge (or at the bus stop), can be treated as a resource — a standing reserve of purchasing power.” Our digital technologies aren’t politically neutral. The young person who cannot or will not be alone, converse with family, go out with friends, attend a lecture or perform a job without monitoring her smartphone is an emblem of our economy’s leechlike attachment to our very bodies. Digital technology is capitalism in hyperdrive, injecting its logic of consumption and promotion, of monetization and efficiency, into every waking minute.

It’s tempting to correlate the rise of “digital democracy” with steeply rising levels of income inequality; to see more than just an irony. But maybe the erosion of humane values is a price that most people are willing to pay for the “costless” convenience of Google, the comforts of Facebook and the reliable company of iPhones. The appeal of “Reclaiming Conversation” lies in its evocation of a time, not so long ago, when conversation and privacy and nuanced debate weren’t boutique luxuries. It’s not Turkle’s fault that her book can be read as a handbook for the privileged. She’s addressing a middle class in which she herself grew up, invoking a depth of human potential that used to be widespread. But the middle, as we know, is disappearing.


Book Review: Niall Ferguson’s ‘Kissinger. Volume I. 1923-1968: The Idealist’

October 1, 2015

NY Times Sunday Book Review

By Andrew Roberts

It is very rare for an official biography to be also a revisionist biography, but this one is. Usually it’s the official life that the revisionists attempt to dissect and refute, but such is the historical reputation of Henry Kissinger, and the avalanche of books and treatises already written about him, that Niall Ferguson’s official biography is in part an effort to revise the revisionists. Though not without trenchant criticisms, “Kissinger. Volume I. 1923-1968: The Idealist” — which takes its subject up to the age of 45, about to begin his first stint of full-time government service — constitutes the most comprehensive defense of Kissinger’s outlooks and actions since his own three-volume, 3,900-page autobiography, published between 1979 and 1999.

Unlike the revisionists, Ferguson has had access to every part of Kissinger’s vast archive at the Library of Congress, which weighs several tons and comprises 8,380 documents covering 37,645 pages on the digitized database alone. These include a heartfelt essay on “The Eternal Jew” written by the 22-year-old German-born Sergeant Kissinger after witnessing the liberation of a Nazi concentration camp; some loving but uncompromising letters to his parents about his separation from their Orthodox faith; a jejune and somewhat cringe-making teenage note to a would-be girlfriend; and the minutes he took as secretary of a Jewish youth organization to which he belonged as the Nazis were seizing power in his homeland. Although this book is long at 986 pages, and Kissinger has only just joined the Nixon administration as national security adviser when it ends, the sheer quality of the material unearthed justifies the length and detail.

Ferguson gives the full story of the Kissinger family’s experience under the Third Reich before they emigrated in 1938, and Ferguson has identified at least 23 close family members who perished in the Holocaust. (Of the 1,990 Jews who lived in their hometown, Fürth, in 1933, fewer than 40 were left by the end of the war.) The first chapters covering the Kissingers’ life in the late 1930s and early 1940s in the Washington Heights neighborhood of New York recapture the Jewish immigrant experience superbly and put into perspective the fact that Henry (born Heinz) became the first foreign-born United States citizen to serve as secretary of state.

Whereas Kissinger has regularly underplayed his bravery during World War II, Ferguson shows that he saw action during the Battle of the Bulge, where he came under severe shelling. “His very presence” in the Meuse town of Marche “was hazardous in the extreme,” Ferguson writes, as German 88s, mortar shells and a V-1 rocket pulverized “the narrow streets of the town center where the divisional HQ was based.” After V-E Day, Kissinger became an extremely effective Nazi hunter with the Counter-Intelligence Corps.

The subtitle of the book will surprise many for whom Kissinger’s name is almost synonymous with modern realpolitik and who are familiar with the revisionist accounts that equate him with Machiavelli, Bismarck and other such thinkers and statesmen normally thought far from idealists. Yet Ferguson’s investigation of Kissinger’s intellectual roots, especially through the influence of his Army mentor Fritz Kraemer and his Harvard supervisor William Yandell Elliott, shows Kissinger was indeed an idealist in the Kantian sense, rather than in its modern American political version. Kissinger’s unpublished senior thesis, “The Meaning of History,” was an investigation into Immanuel Kant’s philosophy of history, especially in contrast to the views of Arnold Toynbee and Oswald Spengler, although Ferguson slightly dismisses it as “an exercise in academic exhibitionism.”

Henry A. Kissinger, President Nixon’s chief foreign policy advisor, and Jill St. John, Hollywood actress whom he frequently dates, are dinner companions at the Now Grove in the Ambassador hotel in Los Angeles, Tuesday, Oct. 21, 1970. Kissinger, a bachelor, has escorted Miss St. John to a variety of functions in Washington as well as in the Hollywood and Los Angeles area. (AP Photo)

He dated actresses and other beautiful women, who found him attractive and seductive because he exuded power: a small but power-packed personality. He is pictured with the actress Jill St. John, whom he frequently dated.

In his thesis, Kissinger argued that “freedom is . . . an inner experience of life as a process of deciding meaningful alternatives” and that “whatever one’s conception about the necessity of events, at the moment of their performance their inevitability could offer no guide to action.” He also said, “However we may explain actions in retrospect, their accomplishment occurred with the inner conviction of choice.” The importance of choice led Kissinger to a belief in democracy. “Kissinger was never a Machiavellian,” Ferguson argues, but neither was he an idealist of the Woodrow Wilson variety. “It was an inherently moral act,” Ferguson says of Kissinger’s outlook, “to make a choice between lesser and greater evils.”

What brought Kissinger to huge public prominence while still only an assistant professor was his radical prescription for how to deal with the perceived (though in fact chimerical) relative weakness of the United States vis-à-vis the Soviet Union at the time of the successful launch of the Sputnik space satellite in October 1957. As Ferguson puts it, “Sputnik launched Kissinger into a new orbit.” Kissinger had only months earlier published his widely reviewed and highly controversial best seller “Nuclear Weapons and Foreign Policy,” which argued that the threat of a limited nuclear war was a more effective deterrent to Soviet incursions in the third world than the Eisenhower administration’s strategy of mutually assured destruction. And as Kissinger wrote in Foreign Affairs magazine, “The best opportunity to compensate for our inferiority in manpower” is “to use our superiority in technology to best advantage” (although he did rule out using any bomb of more than 500 kilotons in a tactical situation). For Ferguson, Kissinger’s argument “fails to convince,” but it won Kissinger interviews on “Face the Nation” and with The New York Herald Tribune that — once his accent and acerbic wit came to be appreciated by the American public — put him on the trajectory to intellectual rock star status that he never lost.

Partly because he described himself as an independent, Kissinger could be called upon by both political parties for advice. After failing to make an impact as a consultant to the Kennedy administration — he didn’t like the men or the methods, and they didn’t see him fitting the Camelot image — he went to work for Gov. Nelson Rockefeller of New York. Ferguson is clearly fascinated by what he calls the “turbulent friendship” between the aristocrat and the immigrant, and is at pains to point out that “Henry Kissinger has often been portrayed as very ruthless and calculating in his pursuit of power. But in committing himself again and again to Rockefeller, he failed to see that he was backing a man who would never be president.” Kissinger’s loyalty was based on affection and genuine admiration, rather than mere miscalculation.

Ferguson’s access to the diaries Kissinger kept before, during and after his visits to Vietnam in 1965 and 1966 allows him to argue, totally convincingly, that on his missions for the Johnson administration, Kissinger realized very early on that the United States had little or no hope of winning the war and therefore needed to enter into direct negotiations with Hanoi sooner rather than later, albeit from a position of strength. This book contains the first full account of the abortive initiative to start talks with Hanoi in 1967; as Ferguson puts it, “to an extent never previously recognized by scholars,” Kissinger attempted “to broker some kind of peace agreement with the North Vietnamese, using a variety of indirect channels of communication to Hanoi that passed through not only Paris but also Moscow.”

Yet it is in Ferguson’s comprehensive demolition of the revisionist accounts of the 1968 election by Seymour Hersh, Christopher Hitchens and others that this book will be seen as controversial. For he totally rejects the conspiracy theory that blames Kissinger for leaking details of the Paris peace negotiations to the Nixon camp, details that enabled Nixon, it was said, to persuade the South Vietnamese that they would get better treatment if he and not Hubert Humphrey were in the White House. Ferguson goes into this theory in great detail, disproving it on several grounds, but especially for its lack of even the most basic actual or circumstantial evidence. (It turns out that one of the reasons Kissinger was in Paris in 1967 was that he was secretly going to the Sorbonne to woo the only great love of his life, Nancy Maginnes, whom he subsequently married.)

Of course it will be in the second volume that Ferguson will come to grips with the revisionists’ attacks on Kissinger’s actions involving places like Chile, Argentina, Cyprus, East Timor and Bangladesh. The book’s introduction strongly implies that he will be acquitting Kissinger of the monstrous charge of war criminality that the revisionists have made over the years.

Yet this is no hagiography. As well as being highly critical of Kissinger’s theory of limited nuclear war, Ferguson describes a letter of his as a “solipsistic screed”; says of one of Kissinger’s books that it “remained, at root, the work of a committee”; and states that Kissinger was “even more demanding to his own subordinates” than Rockefeller was to him: “He learned to rant and rage.” The criticisms — and there are many more waspish ones — absolve Ferguson from the charge of whitewashing Kissinger and make his praise all the more credible.

This is an admiring portrait rather than a particularly affectionate one. Ferguson acknowledges in his preface all of the “conversing with him, supping with him, even traveling with him” that he did over the many years he spent researching and writing this book. But if Kissinger’s official biographer cannot be accused of falling for his subject’s justifiably famed charm, he certainly gives the reader enough evidence to conclude that Henry Kissinger is one of the greatest Americans in the history of the Republic, someone who has been repulsively traduced over several decades and who deserved to have a defense of this comprehensiveness published years ago.

Part of Kissinger’s charm of course derives from his highly developed sense of humor, which is given full rein here. “Nobody will ever win the battle of the sexes,” he once joked. “There’s just too much fraternizing with the enemy.” When someone came up to him at a reception and said, “Dr. Kissinger, I want to thank you for saving the world,” he replied, “You’re welcome.” All of this was delivered in the trademark voice that the journalist Oriana Fallaci described as like “that obsessive, hammering sound of rain falling on a roof.”

Niall Ferguson already has many important, scholarly and controversial books to his credit. But if the second volume of “Kissinger” is anywhere near as comprehensive, well written and riveting as the first, this will be his masterpiece.

Andrew Roberts is the Lehrman Institute distinguished fellow at the New-York Historical Society.

A version of this review appears in print on October 4, 2015, on page BR12 of the Sunday Book Review with the headline: Kissinger the Idealist. 


Allan Bloom and the Conservative Mind

September 25, 2015

Note: To my friends with an intellectual bent and inquisitive, critical minds: I have finished re-reading Allan Bloom’s The Closing of the American Mind (published in 1987).

With a fresh mind, and since I am teaching Political Philosophy to a handful of graduate students at The Techo Sen School, University of Cambodia, I have found the late Professor Bloom’s treatment of the subject illuminating and still relevant today.

The need for critical thinking and reasoned discourse has never been more urgent, given the issues raised by Pope Francis of the Holy See during his celebrated address to the US Congress. For far too long, we have been producing automatons, people devoid of well-grounded ethical and moral values. Human beings have become self-centered and greedy. In his book, Allan Bloom makes a strong case for a return to classical education so that we can save democracy and the human soul from self-destruction. So enter the Humanities, political philosophy, Socrates, Plato, Aristotle, the Enlightenment and their heirs.–Din Merican.

Who Closed the American Mind?


Allan Bloom was brilliant, but wrong about Burke and multiculturalism

by Patrick J. Deneen

illustration by Michael Hogue

One crisp morning 26 years ago I was walking across the campus of the University of Chicago, where I had just enrolled as a first-year Ph.D. candidate in the renowned Committee on Social Thought. While I had not yet met him, I had heard much about Allan Bloom, a legendary professor, teacher, and lecturer. I had read his translation of Plato’s Republic as an undergraduate and had some notion that I would write my eventual dissertation under his direction.

As I crossed one of the campus quads, I saw a man sitting on a bench, swaddled under a heavy overcoat and his head topped by a fedora. A photographer was arranging his equipment across from him, while he bemusedly awaited some kind of publicity shoot. While I realized only a short time later that the man I had seen was Allan Bloom, it was a year later—a quarter-century ago—that I realized that I had witnessed the photo session that led to the headshot inside the hardcover jacket of Bloom’s blockbuster book The Closing of the American Mind. By that time, I had left the University of Chicago, disillusioned by the program and put off by Bloom’s circle of students. But I loved the book and credit it, at least in part, for my eventual return to the academy and a career as a professor of political philosophy.

I still assign the book with some regularity, especially in a freshman seminar on education that I’ve taught over the last half-decade. As the years have passed, I’ve noticed how the book has aged—many of its cultural references are long dated, while contemporary hot-button issues like gay marriage and religious liberty are altogether absent from Bloom’s confident pronouncements on our likely future. Still, the book continues to excite new readers—today’s students find it engaging, even if, unlike their elders, they don’t get especially upset by it and almost unanimously have never heard of it before. And with every re-reading I invariably find something new that I hadn’t noticed before, a testimony to the expansiveness of Bloom’s fertile mind.

While I continue to learn much from Bloom, over the years I have arrived at three main judgments about the book’s relevance, its prescience, and its failings. First, Bloom was right to be concerned about the specter of relativism—though perhaps even he didn’t realize how bad it would get, particularly when one considers the reaction to his book compared to its likely reception were it published today. Second, his alarm over the threat of “multiculturalism” was misplaced and constituted a bad misreading of the zeitgeist, in which he mistook the left’s tactical use of identity politics for the rise of a new kind of communalist and even traditionalist tribalism. And, lastly, most of his readers—even today—remain incorrect in considering him to be a representative of “conservatism,” a label that he eschewed and a worldview he rejected. Indeed, Bloom’s argument was one of the early articulations of “neoconservatism”—a puzzling locution used to describe a position that is, in fact, today more correctly captured by its critics on the left as “neo-liberalism.”

What should most astonish any reader of Bloom’s Closing after 25 years is the fact that this erudite treatise about the crisis of higher education not only sat atop the bestseller list for many weeks but was at the center of an intense, lengthy, and ferocious debate during the late 1980s over education, youth, culture, and politics. In many ways, it became the most visible and weightiest salvo in what came to be known as “the culture wars,” and people of a certain generation still hold strong opinions about Bloom and his remarkable, unlikely bestseller.

Today there are many books about the crisis of higher education—while the nature of the crisis may change, higher education never seems to be out of the woods—but none before or since Bloom’s book achieved its prominence or made its author as rich and famous as a rock star. It was a book that many people bought but few read, at least not beyond a few titillating passages condemning rock-and-roll and feminism. Yet it was a book about which almost everyone with some engagement in higher education held an opinion—indeed, it was obligatory to have considered views on Bloom’s book, whether one had read it or not.

Bloom’s book was at the center of a debate—one that had been percolating well before its publication in 1987—over the nature and content of a university education. That debate intensified with the growing numbers of “diverse” populations seeking recognition on college campuses—concomitant with the rise of departments of Women’s Studies, African-American Studies, and a host of other “Studies” studies—leading to demands that the curriculum increasingly reflect contributions by non-male, non-white, non-European and even non-dead authors.

The Closing of the American Mind spawned hundreds, perhaps even thousands of responses—most of them critiques—including an article entitled “The Philosopher Despot” in Harper’s by political theorist Benjamin Barber, and the inevitably titled The Opening of the American Mind by Lawrence Levine. Partly spurred by the firestorm initiated by Bloom’s book, perennial presidential candidate Jesse Jackson led a march through the campus of Stanford University shouting through a bullhorn, “Hey hey, ho ho, Western Civ has got to go!” Passions for campus reform ran high, and an avalanche of words, articles, denunciations, and ad hominem attacks greeted Bloom’s defense of the Western canon.

Yet the nuances of Bloom’s qualified defense of the Western canon were rarely appreciated by critics or supporters alike. While Bloom was often lumped together with E.D. Hirsch—whose Cultural Literacy was published the same year and rose to number two on the New York Times bestseller list, just behind Closing—Bloom’s argument was fundamentally different and far more philosophically challenging than Hirsch’s more mundane, if nevertheless accurate, point that educated people increasingly did not have knowledge about their own culture. Hirsch’s book spoke to anxiety about the loss of a shared literary and cultural inheritance, which today has been largely supplanted by references to a few popular television shows and sports televised on ESPN.

Bloom made an altogether different argument: American youth were increasingly raised to believe that nothing was True, that every belief was merely the expression of an opinion or preference. Americans were raised to be “cultural relativists,” with a default attitude of non-judgmentalism. Not only all other traditions but even one’s own (whatever that might be) were simply views that happened to be held by some people and could not be judged inferior or superior to any other. He bemoaned particularly the decline of household and community religious upbringing in which the worldviews of children were shaped by a comprehensive vision of the good and the true. In one arresting passage, he waxed nostalgic for the days when people cared: “It was not necessarily the best of times in America when Catholics and Protestants were suspicious of and hated one another; but at least they were taking their beliefs seriously…”

He lamented the decline of such true belief not because he personally held any religious or cultural tradition to be true—while Bloom was raised as a Jew, he was at least a skeptic, if not a committed atheist—but because he believed that such inherited belief was the source from which a deeper and more profound philosophic longing arose. It wasn’t “cultural literacy” he wanted, but rather the possibility of that liberating excitement among college-age youth that can come from realizing that one’s own inherited tradition might not be true. From that harrowing of belief can come the ultimate philosophic quest—the effort to replace mere prejudice with the quest for knowledge of the True.

Near the beginning of Closing, Bloom relates one telling story of a debate with a psychology professor during his time teaching at Cornell. Bloom’s adversary claimed, “it was his function to get rid of prejudices in his students.” Bloom compared that function to the activity of an older sibling who informs the kids that there is no Santa Claus—disillusionment and disappointment. Rather than inspiring students to replace “prejudice” with a curiosity for Truth, the mere shattering of illusion would simply leave students “passive, disconsolate, indifferent, and subject to authorities like himself.”

Bloom relates that “I found myself responding to the professor of psychology that I personally tried to teach my students prejudices, since nowadays—with the general success of his method—they had learned to doubt beliefs even before they believed in anything … One has to have the experience of really believing before one can have the thrill of liberation.” Bloom’s preferred original title—before being overruled by Simon and Schuster—was Souls Without Longing. He was above all concerned that students, in being deprived of the experience of living in their own version of Plato’s cave, would never know or experience the opportunity of philosophic ascent.

This core of Bloom’s analysis seems to be not only correct, but, if possible, he may have underestimated its extent. Consider the intense response to Bloom’s book as evidence against his thesis. The overwhelming response by academia and the intelligentsia to his work suggested anything but “indifference” among many who might describe themselves as cultural relativists. Extraordinary debates took place over what books and authors should and should not appear in the “canon,” and extensive efforts were undertaken to shape new curricula in light of new demands of “multiculturalism.” The opponents of Bloom’s book evinced a deep concern for the formation of students, if their concern for what and whom they read was any indication.

In retrospect, however, we can discern that opponents to Bloom’s book were not the first generation of “souls without longing,” but the last generation raised within households, traditions, and communities of the sort that Bloom described, and the last who were educated in the older belief that a curriculum guided the course of a human life. The ferocity of their reaction to Bloom was not simply born of a defense of “multiculturalism” (though they thought that to be the case) but a belief that only a curriculum of the right authors and books properly shapes the lives of their students. Even in their disagreement with Bloom, they shared a key premise: the books we ask our students to read will shape their souls.

Today we live in a different age, one that so worried Bloom—an age of indifference. Institutions of higher learning have almost completely abandoned even a residual belief that there are some books and authors that an educated person should encounter. A rousing defense of a curriculum in which female, African-American, Latino, and other authors should be represented has given way to a nearly thoroughgoing indifference to the content of our students’ curricula. Academia is committed to teaching “critical thinking” and willing to allow nearly any avenue in the training of that amorphous activity, but eschews any belief that the content of what is taught will or ought to influence how a person lives.

Thus, not only is academia indifferent to whether our students become virtuous human beings (to use a word seldom to be found on today’s campuses), but it holds itself to be unconnected to their vices—thus there remains no self-examination over higher education’s role in producing the kinds of graduates who helped turn Wall Street into a high-stakes casino and our nation’s budget into a giant credit card. Today, in the name of choice, non-judgmentalism, and toleration, institutions prefer to offer the greatest possible expanse of options, in the implicit belief that every 18- to 22-year-old can responsibly fashion his or her own character unaided.

Bloom was so correct about the predictable rise of a society defined by indifference that one is entitled to conclude that were Closing published today, it would barely cause a ripple. This is not because most of academia would be inclined to agree with his arguments any more than they did in 1987. Rather, it is simply the case that hardly anyone in academe any longer thinks that curricula are worth fighting over. Jesse Jackson once thought it at least important to oppose Western Civilization in the name of an alternative; today, it would be thought untoward and unworkable to propose any shared curriculum.

Those who run institutions of higher learning tell themselves that this is because they respect the choices of their young adult charges; however, their silence is born precisely of the indifference predicted by Bloom. Today’s academic leaders don’t believe the content of those choices has any fundamental influence on the souls of our students, most likely because it would be unfashionable to believe that they have souls. As long as everyone is tolerant of everyone else’s choices, no one can get hurt. What is today called “tolerance,” Bloom rightly understood to be more deeply a form of indifference, the extreme absence of care, leading to a society composed not only of “souls without longing” but humans treated as utilitarian bodies that are increasingly incapable of love.

If this core argument of Bloom’s seems prescient, a second major argument not only seems to me incorrect but in fact is contradicted by this first argument. It was because of his criticisms about the rise of “multiculturalism” that Bloom came to be readily identified with the right-leaning culture-warriors like William Bennett and Dinesh D’Souza and was so vilified on the academic left. Yet Bloom’s first argument implicitly makes a qualified praise of “multiculturalism,” at least as the necessary launching pad for the philosophic quest. In his praise of the belief structures that once inspired some students to disillusionment, he was singing the praises of a society composed of various cultural traditions that exercised a strong influence over the beliefs and worldviews of that culture’s youth.

Such qualified praise led him to wax nostalgic about an age when Catholics and Protestants cared enough to hate one another. But at his most alarmist—and, frankly, either least perceptive or most pandering—Bloom portrays then-regnant calls for “multiculturalism” as a betrayal of the norms of liberal democracy and as the introduction of dangerous tribalism into the university, as well as the body politic. At times, Bloom painted a portrait in which the once-ascendant claims of American individual rights, enshrined in the Declaration of Independence, were about to be displaced by the incipient warfare of identity tribalism and groupthink.

At his best, Bloom sees through the sham of yesterday’s “multiculturalism” and today’s push for “diversity”—little of which had to do with enthusiasm for real cultural diversity, but which was then and remains today a way for individuals in under-represented groups to advance entitlement programs within America’s elite institutions. Those individuals, while claiming special benefits that should accrue to members in a particular group, had no great devotion to any particular “culture” outside the broader American anti-culture of liberalism itself. Indeed, the “cultures” in question were never really cultures at all, if by a culture we mean an identifiable group of people who share a generational, geographical, and distinctive set of customs aimed at shaping the worldview and practices of successive generations.

By this measure, women, blacks, Hispanics, and so on were people who might once have belonged to a variety of particular cultures, albeit not specifically as women or blacks or Hispanics. These new categorical groupings came to be based on claims of victimhood rather than any actual shared culture; many cultures have been persecuted, but it does not follow that everyone who has been mistreated constitutes a culture. While in passing Bloom acknowledged the thinness of such claims to cultural status, too often he was willing to take seriously professions of “multiculturalism” and to lament the decline of the American project of universalist natural rights.

The stronger case would have been to expose the claims of multiculturalism as cynical expressions from members of groups that did not, in fact, share a culture, while showing that such self-righteous claims, more often than not, were merely a thin veneer masking a lust for status, wealth and power. If the past quarter century has revealed anything, it has consistently shown that those who initially participated in calls for multiculturalism have turned out to be among the voices most hostile to actual cultures, particularly ones seeking to maintain coherent religious and moral traditions.

Bloom was prone to obtuseness about this fact because, at base, Bloom himself was not an admirer or supporter of the multiplicity of cultures. Indeed, he was suspicious and even hostile to the claims of culture upon the shaping of human character and belief—including religious belief. He was not a conservative in the Burkean sense; that is, someone apt to respect the inheritances of tradition and custom as a repository of past wisdom and experience. Rather, he was at his core a liberal: someone who believes that the only benefit of our cultural formation was that it constituted a “cave” from which ambitious and rebellious youth could be encouraged to pursue a life of philosophy.

Reflection about Bloom’s distaste for particular cultures suggests that the differences between Bloom and his apparent nemesis, the Cornell professor of psychology, are rather minimal. Both wanted to disabuse the youth of their “prejudices” in the name of openness: the psychology professor in the name of nihilistic openness, and Bloom for the encouragement of philosophical inquiry, open to the possibility of Truth as well as the possibility of nihilism.

In fact, Bloom’s critique of the “multicultural” left is identical to and drawn from the critique of the “multicultural” right advanced by his teacher, Leo Strauss. In his seminal work Natural Right and History, Strauss identified Burke’s criticisms of the French Revolution as one of the lamentable responses to the “Crisis of Modern Natural Right,” a crisis that arose as a reaction against the social contractarianism of “modern natural right.” Burke’s argument against the revolutionary impulses of social contractarianism constituted a form of conservative “historicism”—that is, in Strauss’s view, the rejection of claims of natural right in favor of a preference for the vagaries of History. While today’s Straussians concentrate their criticisms largely on left historicism (i.e., progressivism), Strauss was just as willing to focus his criticisms on right historicism, that is, the traditionalism of Burke and his progeny.

Ironically, because the left in the 1980s adopted the language (if not the substance) of multiculturalism, Bloom was able to turn those Straussian critiques of Burke against those on the left—though of course they were no Burkeans, even if they used some Burkean language. For this reason, Bloom was assumed by almost everyone to be a “conservative,” a label that he not only explicitly rejected, but a worldview that he philosophically and personally abhorred.

Bloom’s argument became a major touchstone in the development of “neoconservatism,” a label that became associated with many fellow students of Strauss but which, ironically, explicitly rested on rejection of the claims of culture, tradition, and custom—the main impulses of Burkean conservatism. Bloom continuously invoked the natural-rights teachings of the Declaration and Constitution as necessary correctives to the purported dangers of left multiculturalism: rather than endorsing the supposed inheritance of various cultures, he commended the universalistic claims of liberal democracy, which ought to trump any identification with particular culture and creed. The citizen who emerged from the State of Nature, shorn of any specific cultural, religious, or ancestral limitation, was the political analogue for the philosopher who emerged from the Cave. Not everyone could become a philosopher, Bloom insisted, but everyone could be a liberal citizen, and ought rightly to be liberated from the limitations of place and culture—if for no other reason, to make them more tolerant of the radical philosophers in their midst.

Bloom’s was thus not only an early salvo in the culture wars, but an incipient articulation of the neoconservative impulse toward universalistic expansion. Burke’s willingness to acknowledge the basic legitimacy of most cultures—his “multiculturalism”—led him, in the main, to oppose most forms of imperialism. The rejection of multiculturalism, and the valorization of a monolithic liberal project, has historically inclined toward expansionism and even imperialism, and neoconservatism is only the latest iteration of this tendency. While many of the claims about Strauss’s influence on the Iraq invasion and the neoconservative insistence upon spreading democracy throughout the world were confused, there was in fact a direct lineage from Bloom’s arguments against the multicultural left to the rise of the neoliberal or neoconservative imperialistic impulse. Bloom explicitly rejected the cautiousness and prudence endorsed by conservatism as a hindrance to philosophy, and thus rejected it as a political matter as a hindrance to the possibility of perfectibility:

Conservatives want young people to know that this tawdry old world cannot respond to their demands for perfection. … But … man is a being who must take his orientation by his possible perfection. … Utopianism is, as Plato taught us at the outset, the fire with which we must play because it is the only way we can find out what we are.

Bloom here witheringly rejected “realism” as “the easy way out” of real inquiry; yet, in the wake of the Iraq invasion, one of Bloom’s longstanding allies and admirers, John Agresto, lamented the overconfidence of the neoconservatives, and especially their neglect of the reality of culture, in a post-invasion book entitled Mugged by Reality.

Bloom’s book remains a kind of liberation, an intellectually adventurous work written with a kind of boldness and even recklessness rarely to be found in today’s more politically correct and cramped age. But it was, ultimately, more reckless than many of its readers realized at the time—not because it was conservative, but precisely because it rejected the conservative impulses to modesty, prudence, the genius of place, and tradition. It opened an era of “culture wars” in which the only combatant who seemed absent from the field was a true conservatism. Perhaps it is finally time for an opening of the American mind.

Patrick Deneen is David A. Potenziani Memorial Associate Professor of Constitutional Studies at the University of Notre Dame.

Allan Bloom and the Conservative Mind

by Jim Sleeper


CONSERVATIVES in 1987 may still have been basking in Ronald Reagan’s “morning in America,” but nothing prepared their movement, or the academic and publishing worlds, for the wildfire success of Allan Bloom’s “The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students.” Amid a furor recalling that over William F. Buckley Jr.’s “God and Man at Yale” in 1951, Bloom indicted liberal academics for betraying liberal education. His attack sold more than a million copies.

Who on an American campus could ignore Bloom’s accounts of Cornell faculty groveling before black-power student poseurs, or his sketches of politically correct administrator-mandarins and ditzy pomo professors? What dedicated teacher could dismiss his self-described “meditation on the state of our souls, particularly those of the young, and their education”? Some thoughtful liberals found themselves reading “The Closing” under their bedcovers with flashlights, unable either to endorse or repudiate it but sensing that some reckoning was due.

Conservatives championed Bloom then, of course, and they invoke him still. Roger Kimball, the managing editor of the conservative New Criterion, writes in an article, “Retaking the University: A Battle Plan”: “Traditionally, a liberal arts education involved both character formation and learning . . . to produce men and women who (as Allan Bloom put it) had reflected thoughtfully on the question ‘What is man?’ ” Kimball charges that the “adversary culture of the intellectuals” has taken over universities, an accusation echoed across a growing web of conservative campus activists, including Daniel Pipes’s Campus Watch, which tracks the utterances of leftist professors on the Middle East; the Collegiate Network, which trains combative conservative student journalists; the Intercollegiate Studies Institute’s network of conservative campus organizations; and David Horowitz’s Center for the Study of Popular Culture, whose “Academic Bill of Rights” — which would subject professors to student grievances against political discrimination — is now before several state legislatures.

But everyone seems to have missed the elephant in the room: Bloom’s ostensibly conservative meditation in fact anticipated and repudiated almost every political, religious and economic premise of Kimball’s and Horowitz’s movement. Conservatives who reread Bloom today are in for a big, perhaps instructive, surprise.

Far from being a conservative ideologue, Bloom, a University of Chicago professor of political philosophy who died in 1992, was an eccentric interpreter of Enlightenment thought who led an Epicurean, quietly gay life. He had to be prodded to write his best-selling book by his friend Saul Bellow, whose novel “Ravelstein” is a wry tribute to Bloom. Far more than liberal speech codes and diversity regimens, the bêtes noires of the intellectual right, darkened Bloom’s horizons: he mistrusted modernity, capitalism and even democracy so deeply that he believed the university’s culture must be adversarial (or at least subtly subversive) before America’s market society, with its vulgar blandishments, religious enthusiasms and populist incursions.

“The semi-theoretical attacks of right and left on the university and its knowledge, the increased demands made on it by society, the enormous expansion of higher education,” Bloom wrote, “have combined to obscure” the universities’ mission “to maintain the permanent questions front and center” and “to provide a publicly respectable place . . . for scholars and students to be unhindered in their use of reason.”

Some conservatives may insist they are saying exactly that. But Bloom warned that liberal education is threatened as well by “proponents of the free market,” whose promise of social well-being “no longer compels belief,” and by religious belief that, “contrary to containing capitalism’s propensities, as Tocqueville thought it should, is now intended to encourage them.”

Bloom argued that our capitalist economy and liberal-democratic order turn civic virtue to mercenary ends. To cultivate “the use of reason beyond the calculation of self-interest,” he contended, “it is necessary that there be an unpopular institution in our midst that . . . resists our powerful urges and temptations.” That unpopular institution was the university. Surveying with nuanced regret what he saw as the failures of religion and of the Enlightenment (whose rationalism had collapsed into fascism or Communism), he hoped to rescue a classical Greek pedagogical tradition that wove eros and intellect into the love of knowing and the love of natural virtues.

Conservatives who reread Bloom will also discover that the 60’s left reminded him of the right-wing hordes his mentor Leo Strauss had encountered in Europe in the 30’s: “The fact that in Germany the politics were of the right and in the United States of the left should not mislead us. In both places the universities gave way under the pressure of mass movements” whose participants, full of animal spirits and spiritual animus, undertook “the dismantling of the structure of rational inquiry.” Yet Kimball and Horowitz themselves are trying to rouse a mass movement of alumni, the public and legislatures to “take back” the university.

“Many parents are alarmed, rightly so, at the spectacle of their children” coming back from college and jettisoning “every moral, religious, social and political scruple that they had been brought up to believe,” Kimball cries. But Bloom wanted reason to overturn familial and religious commitments, if necessary, to forge deeper attachments to truth and civic-republican virtue. Try to imagine Bloom’s seconding Kimball’s praise for “the rise of conservative talk radio, the popularity of Fox News . . . and the spread of interest in the Internet with its many right-of-center populist Web logs” as “heartening signs” that conservatives are becoming “a widespread counter to the counterculture” of universities.

Similarly, Horowitz’s Academic Bill of Rights would force professors to teach scholarly work opposed to their own. Most already do that, but it’s hard to imagine that Horowitz, or his conservative allies, want Milton Friedmanite free-marketeers to be required to tell their packed economics classes about Daniel Bell’s claim, anticipating Bloom, that our economy had led to “corporate oligopoly, and, in the pursuit of private wants, a hedonism that is destructive of social needs.”

Bloom wanted liberal education to resist both “whatever is most powerful” and the “worship of vulgar success.” True openness, he said, “means closedness to all the charms that make us comfortable with the present.” He disdained professors who strive to become counselors to the king and forget that “the intellectual, who attempts to influence . . . ends up in the power of the would-be influenced.” And he lamented the emergence of new academic departments like mass communications and business management, which “wandered in recently to perform some job that was demanded of the university.” A few years ago, a great university’s government department (not mine) nearly abolished its foreign-language requirement for Ph.D. candidates because “rational choice” whiz kids were touting a great new, universal language — computer English. An eminent conservative scholar and one of his formidable leftist colleagues rolled their eyes empathetically and voted together against the initiative.

Horowitz and other conservative activists know very well that Bloom didn’t reduce what he saw as liberal education’s crisis to a contest of left versus right: “I don’t want the universities to be conservative,” Horowitz himself protested recently to The Chronicle of Higher Education. “I want them to be academic, scholarly.” The magazine reported, however, that his small board of directors included John O’Neill of Swift Boat Veterans for Truth. That can’t be the kind of truth Allan Bloom had in mind.

Jim Sleeper, a lecturer in political science at Yale, is the author of “The Closest of Strangers: Liberalism and the Politics of Race in New York” and “Liberal Racism.”



THE CLOSING OF THE AMERICAN MIND How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students. By Allan Bloom. Foreword by Saul Bellow. 392 pp. New York: Simon & Schuster. $18.95.

ALLAN BLOOM, a professor of philosophy and political science at the University of Chicago, is perhaps best known as a translator and interpreter of Jean-Jacques Rousseau’s “Emile” and Plato’s “Republic,” two classic texts that ponder the relationship between education and society. In “The Closing of the American Mind,” Mr. Bloom has drawn both on his deep acquaintance with philosophical thinking about education and on a long career as a teacher to give us an extraordinary meditation on the fate of liberal education in this country – a meditation, as he puts it in his opening pages, “on the state of our souls.”

Let me say at the outset that “The Closing of the American Mind” is essential reading for anyone concerned with the state of liberal education in this society. Its pathos, erudition and penetrating insight make it an unparalleled reflection on the whole question of what it means to be a student in today’s intellectual and moral climate. But such qualities also make the book difficult to summarize briefly. Mr. Bloom ranges freely over centuries of thinking about freedom, values and the ends of education, moving with ease (to quote one of his more ambitious chapter headings) “From Socrates’ Apology to Heidegger’s Rektoratsrede.” Yet the book’s scope and considerable learning have not made it any less immediate or compelling. In fact, one of the things that distinguishes it is its successful blending of erudition with great particularity. Among the more noteworthy examples of the latter is Mr. Bloom’s harrowing description, near the end of the book, of his experiences at Cornell University in the late 1960’s when students seized buildings at gunpoint, held professors hostage and intimidated a pusillanimous administration into a policy of appeasement.

As his title suggests, Mr. Bloom’s assessment of liberal education is not optimistic. In essence, he argues that over the last 25 years the academy has all but abandoned the intellectual and moral principles that have traditionally informed and given substance to liberal education, becoming prey to the enthusiasms – increasingly politicized – of the moment. While the eruption of violence and political activism in the 60’s marked the high point of those enthusiasms, in Mr. Bloom’s view, the university has yet to recover from the aftereffects of those disruptions. And because the university epitomizes the very spirit of free inquiry, which in turn is at the root of a free society, he concludes that “a crisis in the university, the home of reason, is perhaps the profoundest crisis” for a modern democratic nation.

Mr. Bloom devotes a large part of the book to analyzing the character and intellectual disposition of those students who form his main subject and raison d’être, liberal arts students “who populate the twenty or thirty best universities.” Among much else, he describes the extent to which even such privileged students have in recent years “lost the practice of and the taste for reading,” forsaking the companionship of books for the more accessible but less sustaining pleasures of movies and rock music. He discusses how changes in the family, especially the high incidence of divorce, have impinged upon the character of students, leaving them at once more cynical and less questioning, less critical. And he considers how the revolution in adolescent sexual mores not only wrought radical changes in sexual attitudes and behavior, but also has tended to dampen what Plato described as the “erotic” element in education, the element of mystery and longing that has always been part of the excitement of discovering the world of liberal learning.

In all this, Mr. Bloom paints a sobering if not, alas, entirely unfamiliar picture. Today’s students, he finds, are generally “nice” but passionless; above all, they are self-centered. More or less unthinkingly committed to an ethic of cultural relativism, they are intellectually and morally unambitious, “spiritually detumescent.” The fundamental questions that have traditionally motivated a liberal education – What is the good? What is truth? What should I do? – strike them as hopelessly naive and beside the point.

Mr. Bloom makes it quite clear that he considers “the good old Great Books approach” the “only serious solution” to the crisis in education; and, as he stresses again and again, liberal education consists precisely in “knowing the alternative answers and thinking about them.” At the same time, Mr. Bloom is skeptical about what he describes as “the Great Books cult,” enumerating its deficiencies – from its tendency to encourage a kind of autodidactic amateurism to its penchant for “a certain coarse evangelistic tone” – with greater penetration than many opponents of the approach.

In fact, one of the chief things to appreciate about “The Closing of the American Mind” is that its dominant stance is interrogative, not prescriptive. Everything problematic that the term modernity implies, all the doubts about the meaning of tradition, the legitimacy of inherited values, the point of preserving high culture – all this Mr. Bloom is perfectly cognizant of. He, too, has read Nietzsche, and his discussion betrays none of the naivete that many conservative treatments of such matters display. Nor does he imply that the answer to the problem of liberal education is to return to some simpler, less encumbered past. About changes in the American family, for example, he notes that he is “not arguing here that the old family arrangements were good or that we should or could go back to them. I am only insisting that we not cloud our vision to such an extent that we believe that there are viable substitutes for them just because we want or need them.”

Of course, this book will find many enemies – mostly, I suspect, because of its avowedly traditional vision of what it means to be an educated person. And no doubt many will object that its portrait of liberal education is in many ways a caricature or an exaggeration. Certainly, there are exceptions to the rule of mediocrity and ideological posing that Mr. Bloom anatomizes in these pages; but the question remains whether his general assessment is not in fact accurate.

Indeed, it is difficult not to conclude that “The Closing of the American Mind” is that rarest of documents, a genuinely profound book, born of a long and patient meditation on questions that may be said to determine who we are, both as individuals and as a society. And while Mr. Bloom’s indictment is severe, it is by no means despairing. As he notes in his concluding remarks, despite the fragmentation and disorder in the university today, “The questions are all there. They only need to be addressed continuously and seriously for liberal learning to exist; for it does not consist so much in answers as in the permanent dialogue.” With “The Closing of the American Mind,” Mr. Bloom takes his place as an articulate participant in that dialogue.

NY Times Book Review: Sisters in Law

September 20, 2015

On Sandra Day O’Connor and Ruth Bader Ginsburg – Sisters in Law

by Linda Greenhouse


Two young women, near age-mates, grow up in very different corners of the country, one in near isolation on a vast Southwestern cattle ranch and the other on the crowded streets of Brooklyn. They obtain superb educations, enter into early marriage and motherhood, and set out to make their way in a man’s world. Decades later, we find them, having broken through more than a few glass ceilings, sitting together on the United States Supreme Court.

For anyone interested in the court, women’s history or both, the story of Sandra Day O’Connor and Ruth Bader Ginsburg, their separate routes to the Supreme Court and what they accomplished during the more than 12 years they spent together is irresistible. But “Sisters in Law,” with its ambitious subtitle (“How Sandra Day O’Connor and Ruth Bader Ginsburg Went to the Supreme Court and Changed the World”), raises more questions than it answers. Did Justices O’Connor and Ginsburg really change the world? Or did they make it all the way to the Supreme Court, as the first and second women ever to serve there, because the world had changed?

There is a fascinating book struggling to emerge from the narrative structure Linda Hirshman has imposed on rich material. We glimpse it on those occasions when Hirshman chooses to highlight not the similarities between the two women but their differences. “Sandra Day O’Connor played defense; she would not permit the courts to roll the equality ball backward,” we’re told, while Ruth Ginsburg, for her part, “played offense.” Another way to put it, perhaps, is that while Ginsburg set out to change the world for women through her advocacy and her skill at picking just the right case to bring to the court at the right time, O’Connor had no such ambition. She chose to live largely in the world that continually opened before her, as she turned her social networks to her advantage and found her way into electoral politics. (She became majority leader of the Arizona Senate, the first woman in the country to hold so high a state legislative office.) O’Connor’s gift was the instinct for strategic and indispensable compromise. During her years at the center of the court — the role played since her departure in 2006 by Justice Anthony M. Kennedy — she often deployed concurring opinions “to make the conservative rulings more liberal and liberal opinions more conservative, usually by tying the outcome to the particular facts in the case.”


“O’Connor was by no means a committed strategist for women’s rights,” Hirshman writes. “She was not a robust voice for social change.” Hirshman, a lawyer and a scholar of feminism, whose last book was the well-received “Victory: The Triumphant Gay Revolution,” writes with authority and obvious admiration about Ginsburg (although with an odd fixation on the justice’s physical stature, describing her variously as “tiny,” “minuscule,” “skinny,” “petite,” “small” and, twice, “diminutive”).

But Hirshman struggles noticeably with what to make of O’Connor (“large, blond,” “open-faced, cheerful and energetic”), and with how to fit her into the book’s overall construct. Hirshman properly cites O’Connor’s “tightfisted votes for equality,” her “ungenerous opinions even in cases where she voted for the woman’s side” and her “endless dalliance with allowing ever more intrusive restrictions” on access to abortion. She expresses puzzlement at O’Connor’s support for President Richard Nixon’s nomination of a fellow Arizonan, William H. Rehnquist, to the Supreme Court in 1971. O’Connor’s “passionate advocacy” for Rehnquist — she offered to testify for him at his confirmation hearing but was told that wouldn’t be necessary — “presents the question of how serious a feminist she was.” Really? Maybe O’Connor wasn’t thinking in ideological terms at all, but was simply thrilled that her old friend, with whom she had shared top academic honors at Stanford Law School, had reached the pinnacle of the legal profession.

The book’s title is offered without irony, but while Hirshman is too astute an observer to believe it fully, she is stuck with it nonetheless. This would have been a more coherent and satisfying book had she been willing to portray her subjects as I think she actually does understand them: not as sisters yoked together in a common project, but rather as representatives of the different ways that smart, ambitious women navigated life in mid-20th-century America, when social norms and expectations were changing but old patterns still prevailed.

Ginsburg, rejected for a clerkship by Justice Felix Frankfurter despite recommendations from leading law professors of the era because “I’m not hiring a woman,” eventually committed herself to uprooting the legal system’s built-in assumptions about the appropriate roles for women and men. O’Connor, offered a job as a legal secretary at a big California law firm because “our clients wouldn’t stand for” being represented by a woman, has probably never to this day labeled herself a feminist. With one avenue blocked, she shifted course and made her way in private practice and government service.

But by the choices she made, O’Connor lived feminism as a fact even if she didn’t embrace it as a cause, as Joan Biskupic documented in her sure-footed 2005 biography, “Sandra Day O’Connor: How the First Woman on the Supreme Court Became Its Most Influential Justice.” Upon taking her seat on the court in September 1981 (three years almost to the day after “First Monday in October,” a comedy that played the notion of a female Supreme Court justice for laughs, opened on Broadway), O’Connor became the ultimate symbol of women’s progress. In retiring in January 2006, at the age of 75, to care for her Alzheimer’s-disease-stricken husband, she became a symbol of women in a more traditional role, as caregiver. (While male justices have become widowers while serving on the court — Justice William J. Brennan Jr. and Chief Justice Rehnquist are recent examples — none left the bench to care for their spouses.)

In the book’s final pages, Hirshman suggests what might have been a powerful theme: that there had to be a Sandra Day O’Connor on the Supreme Court bench before there could be a Ruth Bader Ginsburg. “O’Connor had made it easier for her,” Hirshman writes. She even seems to forgive O’Connor the failings she has spent many pages chronicling: “Sounding so conservative and framing her mildly pro-woman decisions time after time as protective of authority — employers, school administrators — she represented the farthest women could hope to go in light of the irresistible conservative resurgence of the late 20th and early 21st centuries.” O’Connor displayed “laser judgment about what the court — and the society — would digest at any particular moment.” Indeed, while Ruth Ginsburg’s voice has become ever more powerful, it is, in the main, the power of the passionate and unanswerable dissent.

“Each one was better off for the other being there,” Hirshman writes. And now there are three.

Linda Greenhouse teaches at Yale Law School. Her new book (with Michael J. Graetz), “The Burger Court and the Rise of the Judicial Right,” will be published next June.

A version of this review appears in print on September 20, 2015, on page BR1 of the Sunday Book Review with the headline: ‘Sisters in Law’.

Review of Catch-Up Industrialization

September 16, 2015


Akira Suehiro, Catch-Up Industrialization: The Trajectory and Prospects of East Asian Economies

Translated by Tom Gill. Singapore and Kyoto: NUS Press and Kyoto University Press, 2008. Pp. xvi, 395; tables, figures, note on names, notes, bibliography, author and subject indices.

Akira Suehiro is a pioneer of the study of modern Southeast Asian political economy. Along with scholars such as Richard Robison, Kevin Hewison and Gary Rodan, Suehiro has distinguished himself by taking Southeast Asian capital seriously. Through relentless data gathering and sensitivity to historical context, he has not only documented the growth of diversified business groups in Thailand but also, in an extensive set of studies such as Capital Accumulation in Thailand 1885-1995 (1), shed light on the shifting political contexts in which this development occurred.

This work has had an important theoretical impact: it provided the basis for other scholars to break out of a narrow dependency-theory framework that assumed that local capitalists in developing countries functioned solely as subordinate representatives of multinational corporations. Instead, Suehiro has argued, Thai capital and its constituent parts, especially banks and diversified business groups, function as key components of a late industrializing economy. His subsequent work has ranged from more straightforward political economy, as in his 2010 study of industrial restructuring policies in Thailand (2), to broader, more comparative work on the institutions of late developers, as represented in the volume under review, Catch-Up Industrialization, translated from Japanese into a very readable English-language version by Tom Gill.

This impressive study is divided into two parts. The first, containing four chapters, presents the broader concepts and theories framing the book’s empirical analysis. One of the most useful features of this part of the book is Suehiro’s comprehensive, critical and readable overview of approaches to late development, ranging from List and Gerschenkron, to cultural/Confucian approaches, to Akamatsu’s “flying geese” model, to Michael Porter’s national competitiveness framework, to Schumpeter’s approach to innovation, to the “Washington Consensus” and its statist critics, such as Alice Amsden. Along with the treatment of these approaches are discussions of Japanese scholarship that may be less known to Western readers.


This overview leads to the presentation of the book’s core organizing concept, what Suehiro calls “social capability” for industrialization (page 8), a concept that highlights the roles and interactions of important public and private agents in national development and bears resemblance to formulations such as “systems of innovation.”(3) The concept not only helps to shed light on the Asian Financial Crisis, discussed in Chapter Four, but, most critically, also underpins Suehiro’s ability to do in the nine chapters comprising the second part of the book what few analysts even attempt – namely, to explore development from bottom to top: from the factory floor and labor markets, to systems of education and training, to strategies of multinationals and domestic family-owned groups, to global value chains, to the ideologies and politics of developmentalism.

A book with the breadth and depth of this study defies easy summary, but several specific strengths merit special note. First, Catch-Up Industrialization does more to integrate diverse theories, such as human capital theory and Schumpeterian approaches to innovation, with a wide range of empirical evidence than almost any book I can think of. Second, the book is more comparative than most. Suehiro seeks to account for variation in development in East Asia by combining in-depth analysis of Thailand, the country to which he has devoted much of his career, with theoretically and empirically informed analysis of other national experiences, including Japan’s. Third, Suehiro’s treatment of a central concern of the book – innovation – avoids one-size-fits-all approaches to the topic. It recognizes the diverse challenges of different stages, and emphasizes the incremental nature of technology absorption and innovation in late developers. Fourth, the book stresses that success in efforts toward innovation requires systemic coordination among key actors and functions; one factor that distinguishes Thailand from its higher-performing counterparts among the East Asian newly industrialized countries (NICs) relates to the integration of trade policy with industrial policy. Fifth, however, what this book offers is far from a state-centric argument. Suehiro avoids glorifying the state by recognizing both costs and benefits of state initiatives, by tracing the shift of states from conductors to facilitators, and by stressing the crucial roles of local firms and labor.

Another strength is that the book’s analysis of import substitution and export promotion strategies highlights the importance of linkages, including in the case of agriculture. Suehiro is especially thoughtful on Thailand’s agricultural diversification and the resultant linkages, as these contrast with the trajectories of the East Asian NICs. His even-handed treatment recognizes the country’s significant achievements as a newly agro-industrializing country (NAIC). But it also notes the risks in such a strategy. These include not only environmental problems but also, and most critically for Suehiro’s analysis of Thailand’s weaknesses, the fact that an agriculture-based industrialization strategy is in some ways self-limiting. Because it “does not require a high level of domestic technology formation,” the strategy more easily attracts capable rivals and “does not upgrade a country’s industrial structure” (pages 138-139).

Of course, labor—as class and human capital—is fundamental to innovation, and Catch-Up Industrialization devotes significant space to the structure of labor markets, to personnel management, to the politics of labor movements, and to diverse approaches to education and training. These pages document important shifts in East Asian work forces, including not only growing female participation but also a pattern that has only increased in significance since the book’s publication – namely, the expansion of casualization, usually in the form of short-term contract workers. Suehiro’s discussion of human resource development recognizes plain old exploitation, but he usefully emphasizes the factory floor frictions that arise from rapid shifts to new labor systems as part of the move to export promotion (page 278).

The book’s discussion of labor politics highlights the fact that worker participation in post-war Japan and the NICs occurred “more in the context of shop-floor management than in politics, notably through collective contributions to movements to improve productivity and quality control” (page 281), that such productivity-related engagement has not occurred in Thailand, and that the 1997 Asian Financial Crisis provoked a move to ever-more flexible labor markets and weakened labor representation across the region, especially in Thailand. Only in Singapore, Suehiro notes, does labor, through the National Trades Union Congress, seem to constitute a partner in productivity growth, albeit a politically subordinate one.

Not surprisingly, this book’s broad scope and ambitions result in some tensions and gaps. An important substantive tension involves the question of Thai capital. Even as Suehiro highlights the potential of family-owned business groups as indispensable components of Thailand’s catch-up process, he recognizes the weaknesses in these firms’ “organizational capability” (page 245) to absorb and develop technology. One is left with the question, what happened to Thai capital? Here the analysis could have benefited from a more explicit and systematic political economy treatment of the negative complementarities between political dynamics, such as coalitional instability and populism, and corporate strategies that stress risk reduction, through diversification rather than the development of technological competence.(4)

A more extensive political analysis could also have helped account for the ever-deteriorating position of labor both as political force and as partner in productivity growth. Here it would have been useful to draw out the impact of the growing casualization and informality noted briefly in the book. Addressing both of these issues would have allowed Suehiro perhaps to anticipate the dangers of the middle-income trap.

But a book cannot do everything, and this one does much more than most not only to address the core challenges facing late developers, but also, in its concluding chapter, to address goals such as job creation, income distribution, and broader quality of life. Such goals have become ever more challenging for countries such as Thailand, as they attempt to reconcile engagement in today’s global economy with sustainable and inclusive development.

Rick Doner is Goodrich C. White Professor of Political Science at Emory University.


1. Akira Suehiro, Capital Accumulation in Thailand 1885-1995 (Tokyo: Center for East Asian Cultural Studies, 1989).

2. Akira Suehiro, “Industrial Restructuring Policies in Thailand: Japanese or American Approach”, pp. 129-173 in Patarapong Intarakumnerd and Yveline Lecler, Sustainability of Thailand’s Competitiveness: The Policy Challenges (Singapore: ISEAS, 2010).

3. Richard Nelson, National Innovation Systems: A Comparative Analysis (New York: Oxford University Press, 1993).

4. Ben Schneider, Hierarchical Capitalism in Latin America: Business, Labor, and the Challenges of Equitable Development (New York: Cambridge University Press, 2013).


Book Review–Ike and Apprentice Dick

September 12, 2015

History: Dwight Eisenhower and Apprentice Richard Nixon



Dwight David Eisenhower and Richard Milhous Nixon

Dr. Martin Luther King Jr., someone who unquestionably understood charisma, considered Vice President Richard Nixon “one of the most magnetic personalities” he had ever encountered. “When you are close to Nixon,” King observed in 1958, “he almost disarms you with his apparent sincerity.” But King also worried that there might be a hidden duality to Nixon, or worse, a facade. If the Vice President was actually insincere, King warned, he could be “the most dangerous man in America.”

Nixon’s Vice-Presidential years are arguably the least well-known of his long political career. It has been over 20 years since Stephen Ambrose wrote the first and until now only major book to focus on Nixon’s Vice Presidency. Much has since been released about the Eisenhower administration, and Ambrose’s own research methods have been called into question. But the reason Nixon’s activities between 1952 and 1961 are comparatively little understood also relates to a problem inherent in studying vice presidencies. Big decisions emanate from the White House, not the Vice President’s office (though Dick Cheney may have broken the mold). Furthermore, the most influential Vice Presidents know to keep their advice confidential.

With the publication of “The President and the Apprentice,” Irwin F. Gellman hopes to fill that void. He is a prodigious researcher, who made his name with fine books on Franklin Roosevelt’s Cuba policy and on Sumner Welles. “The Contender,” his first book on Richard Nixon, covered the congressional years, and made the case that other historians had missed the Nixon behind the redbaiting.

In this long-awaited second volume, Gellman continues trying to set the record straight. He sees far less animosity in the peculiar political marriage between Nixon and Dwight Eisenhower than did Jeffrey Frank in his elegant and indispensable “Ike and Dick.” Gellman agrees with most historians that Eisenhower was prepared to drop Nixon from the ticket in 1952 over allegations about a secret fund set up by Southern Californian businessmen. Gellman, who has found the notes Eisenhower made while watching Nixon give the so-called Checkers speech, concludes that the General gained new respect for his running mate. Persuaded that Nixon was being honest, and impressed by his savvy and political courage, Eisenhower started to groom him for the Presidency.

Although Nixon is clearly the “apprentice” of the title, what Gellman describes is more like a symbiotic relationship. Young enough to be Eisenhower’s son, Nixon traveled around the world for the President, serving as his eyes and ears. Presidential cynicism played a role in these assignments. Eisenhower exploited Nixon’s unassailable anti-Communist credentials to defend his policies abroad. At home, Eisenhower used Nixon to rally the Republicans’ restive right-wing base, occasionally wincing when Nixon verged on charging Democrats with treason but never ordering him to curtail his Reds! Reds! Reds! roadshows.

In a fascinating chapter on Nixon’s health, Gellman breaks new ground in understanding the man. Nixon’s trusted doctor Arnold Hutschnecker turns out to have been a Dr. Feelgood. Starting in 1952, Nixon sought help from Hutschnecker for a series of stress-induced ailments, and the doctor prescribed a medicine-cabinetful of barbiturates and sleep aids (Seconal and Doriden), tranquilizers (Equanil) and “uppers” (Dexamyl), a potentially addictive, mood-altering cocktail that Nixon apparently took throughout the 1950s and possibly thereafter. We can now reconcile assertions by Nixon’s defenders that he drank little with evidence of strange late-night calls, slurred words and incoherence. As Gellman writes, “At the height of the Cold War, both the president and the vice president could easily have been simultaneously incapacitated, leaving no one responsible for governing.”

Like many Nixon scholars, Gellman believes that there were two Nixons. His private Nixon was a thoughtful pragmatist. The demagogy was political theater. “Nixon,” Gellman writes, “the inflexible anti-Communist in public, was far more flexible in private.” Unfortunately, instead of reflecting on the consequences of Nixon’s cynical use of anti-Communist rhetoric for the country, Gellman focuses on the cost to Nixon’s reputation. Had historians and the news media been allowed to sit in on Eisenhower’s national security meetings, he argues, they would have seen the real, non-ideological Nixon. Nixon’s crowning foreign policy achievement, the opening to China a decade later, would not then have so shocked Nixon watchers. “The roots of Nixon’s thinking about East Asia,” he asserts, “go back to his vice presidency.”

Gellman’s case for Nixon’s foreign policy pragmatism this early on is not persuasive. There is nothing in the book to suggest that Nixon was inclined to think a two-China policy possible. Nixon returned from a 1953 meeting with the Nationalist Chinese leader Chiang Kai-shek singing his praises, despite the fact that the delusional Chiang was lobbying for support of a 600,000-man army to invade the mainland and topple Mao. More important, Gellman tends to play down the scattered but unmistakable evidence that Eisenhower and Nixon disagreed on how cold the Cold War should be. Eisenhower, for example, wanted to expand East-West trade as a way of forcing the Soviets to be better players in the game of nations; Nixon thought this a bad idea. Nixon favored American armed intervention to help the French win their war in Indochina in 1954. Eisenhower wisely disagreed. In sum, when Eisenhower deviated from hard-line Cold War policies, at least in his first term, Nixon was uncomfortable.

It is on the explosive issue of race that pragmatism may be the best explanation for Nixon’s Vice Presidency. Nixon was Eisenhower’s personal representative to the civil rights community, and “The President and the Apprentice” provides a thorough accounting of his activities. Gellman rightly points out that the Eisenhower administration’s record on civil rights was as significant as the Truman administration’s. And Nixon was comfortable among African-Americans to an extent not shared by Eisenhower or Truman. African-American leaders like King took notice.

Gellman is convinced that Nixon was a sincere advocate of civil rights. “Fighting for racial justice,” Nixon wrote privately in 1958, “is for me a moral as well as a legal obligation.” As a result, Gellman sees Nixon as unfairly tarred with racism. “During my 20 years of Nixon research,” Gellman says, “I have not found him uttering any racial slurs.” He then cites another scholar, Luke Nichter, to demonstrate that even on the infamous tapes, where Nixon revels in using every other dirty word, the N-word does not escape his lips.

People of good faith can debate whether in fact they hear that word on the often muddy recordings, but racism is not exclusively the use of an epithet. In two chilling conversations with Daniel Patrick Moynihan in October and December 1971, Nixon discussed the implications for federal social policy of “science” allegedly showing that the Negro race was genetically inferior. Nixon, at least as President, believed that race largely determined I.Q.

Although Gellman’s research is extensive and his work on Nixon’s well-being is essential reading, this book is like a feast that leaves one hungry. A bit too quick to distance himself from the most single-minded of Nixon’s critics, Gellman provides an equally simplistic theory for what lay behind the actions of a publicly loyal Vice President. His Nixon is a little bland: loyal, eager and, though politically cynical, deeply misunderstood. As Vice President, Nixon clearly did not have the power to be “the most dangerous man in America.” That power would come later.

Timothy Naftali, clinical associate professor of history and public service at New York ­University, is the Founding Director of the Federal Richard Nixon Presidential Library and Museum.

