Book Review: Thinking Without a Banister

April 13, 2018

The Philosopher in Dark Times

Essays in Understanding, 1953-1975
By Hannah Arendt
Edited by Jerome Kohn
569 pp. Schocken Books. $40.

What is the relationship between thinking, acting and historical consciousness? How do we preserve a spirited intellectual autonomy that yet includes enough sense of the past to contextualize and resist those power-grabbers who would bamboozle the public with their own fun house versions of truth? Hannah Arendt, the philosopher and political theorist, was always acutely concerned with questions of how to make thought and knowledge matter in the struggle against injustice, never more so than in the last two decades of her life, when the rich medley of the material collected in “Thinking Without a Banister” was created. “What really makes it possible for a totalitarian or any other kind of dictatorship to rule is that the people are not informed,” she remarked in a 1973 interview. “If everyone always lies to you, the consequence is not that you believe the lies, but that no one believes anything at all anymore — and rightly so, because lies, by their very nature, have to be changed, to be ‘re-lied,’ so to speak.” A lying government pursuing shifting goals has to ceaselessly rewrite its own history, leaving people not only dispossessed of their ability to act, “but also of their capacity to think and to judge,” she declared. “And with such a people you can then do what you please.”




She’d seen this process firsthand. Born in Germany in 1906, a Jew by birth and an iconoclast by temperament, she fled her native country after Hitler became chancellor in 1933, first for Czechoslovakia, then Switzerland, then Paris, where she was living in 1937 when the Nazis officially revoked her citizenship, leaving her stateless. Some of her most potent work reflects on the consequences of eliminating people’s national identity. Deprivation of citizenship should be classified as a crime against humanity, Arendt argued, because most legal protections are now conferred through functioning state governments. “Some of the worst recognized crimes in this category have … not incidentally, been preceded by mass expatriations,” she wrote, adding that the state’s ability to sentence someone to death was minor compared with its right to denaturalization, since the second could put the subject entirely beyond the pale of the law. Such passages make for particularly chilling reading at a moment when America has begun rescinding the temporary protected status of thousands of longtime residents, threatening to deport them to their countries of origin, some of which labor under severe economic disadvantages and sociopolitical strains, where their rights and safety cannot be assured.

A year after the fall of France, in the spring of 1941, Arendt emigrated to the United States. Through her prolific essays, she began building a reputation as a penetrating thinker with an urbane and unceremonious style that she would attribute to her zest for “pearl diving” in history. Tradition having been shattered by the calamitous events of the 20th century, she saw her task as plucking the precious bits from time’s waves and subjecting them to her critical thinking, without pretending they could be melded back into any grand, systemic whole. She warned her audience that if they attempted to practice her “technique of dismantling,” they had to be “careful not to destroy the ‘rich and strange,’ the ‘coral’ and the ‘pearls,’ which can probably be saved only as fragments.”

In New York, Arendt’s intellectual acuity and conversational punch swiftly translated into social cachet. After meeting her at a dinner party in the mid-1940s, the literary critic Alfred Kazin was smitten: “Darkly handsome, bountifully interested in everything, this 40-year-old German refugee with a strong accent and such intelligence — thinking positively cascades out of her in waves,” he wrote in his diary.



Arendt’s sheer delight in intellectual speculation counterpoints her intense ethical commitment to thinking as a form of political engagement.


Though she would only fully embrace the principle of amor mundi, love of the world, after contending philosophically with the cataclysm of World War II, the insatiable curiosity was there early on. “I believe it is very likely that men, if they ever should lose their ability to wonder and thus cease to ask unanswerable questions, also will lose the faculty of asking the answerable questions upon which every civilization is founded,” she declared in one address. Arendt’s sheer delight in intellectual speculation counterpoints her intense ethical commitment to thinking as a form of political engagement.


The relationship was sometimes uneasy and often controversial, most famously in the case of her account of Adolf Eichmann’s trial in Jerusalem, in which she coined the term “the banality of evil.” Watching Eichmann testify in his glass booth, Arendt became convinced that he was, above all, an inarticulate buffoon whose wicked deeds resulted from his participation in a bureaucratic structure that dissipated the sense of personal responsibility, and deadened the capacity for cognition. Gershom Scholem, the pioneering scholar of kabbalah, was one of many public intellectuals who felt that Arendt had lost track of the human reality of the Holocaust amid the scintillating twists of her argument. She had failed to reckon with the raw pleasure that playing God over others could afford, and so had overemphasized the role of systemically enforced thoughtlessness in preparing individuals to execute enormous crimes. Recent historical scholarship suggests that Arendt did, indeed, underestimate Eichmann’s ideological passion for National Socialism: Much of his clownish bumbling in Jerusalem may have been a conscious, self-exculpating performance. But her core insight into how even mediocrities can be institutionally benumbed and conscripted into heinous projects remains fertile.




Some of the work anthologized in this volume, edited by Jerome Kohn, comprises Arendt’s responses to current events, like her analysis of the televised 1960 national conventions, in which Kennedy and Nixon were the principal rivals, offering a rather surprising defense of the onscreen experience as a revealing format for viewing those “imponderables of character and personality which make us decide, not whether we agree or disagree with somebody, but whether we can trust him.” Other essays provide deep conceptual etymologies of historical events, key figures and schools of thought. These include her profoundly enlightening study of how Karl Marx fits into the long Western political tradition and her detailed analysis of the challenge that the 1956 Hungarian revolution posed to the Russian military and propagandistic juggernaut. The most dynamic pieces here are Arendt’s interviews, in which the sweep and depth of her ruminations are layered with the caustic wit and engagé appeal of her voice. For all Arendt’s opposition to totalitarianism — and her willingness to implicate Marx in the development of certain totalitarian movements — she remained unabashedly enamored of Marx’s proposition that “the philosophers have only interpreted the world. … The point, however, is to change it.” She relished his determination to wrest higher thought from the supine realm of the Greek symposium and thrust it into the ring of political activism, challenging, as she wrote, “the philosophers’ resignation to do no more than find a place for themselves in the world, instead of changing the world and making it ‘philosophical.’” For Arendt, thinking that helped advance the cause of human freedom entailed a form of relentlessly critical examination that imperiled “all creeds, convictions and opinions.” There could be no dangerous thoughts simply because thinking itself constituted so dangerous an enterprise.

Almost every essay in this book contains “pearls” of Arendt’s tonically subversive thinking, and many of her observations push readers to think harder about the language in which political activity is conducted. Reflecting on the numerous allusions to “reason of state” that crept into White House discourse after Watergate, she notes how the term became synonymous with national security. “National security now covers everything,” she commented, including “all kinds of crime. For instance, ‘the president has a right’ is now read in the light of ‘the king can do no wrong.’” This is no longer a matter of justifying particular crimes, she warns, but rather concerns “a style of politics which in itself is criminal.” The indictment chimes with her taxonomy of the tyrant in an essay titled “The Great Tradition”: “He pretends to be able to act completely alone; he isolates men from each other by sowing fear and mistrust between them, thereby destroying equality together with man’s capacity to act; and he cannot permit anybody to distinguish himself, and therefore starts his rule with the establishment of uniformity, which is the perversion of equality.”


Such observations should give pause to those who would prop up a tyrant for personal ends, and must redouble the opposition’s will to depose that ruler before the public’s capacity for thought and action alike is confounded.

NY Times Book Review: Steven Pinker’s Latest Book, Enlightenment Now

March 4, 2018


The Case for Reason, Science, Humanism, and Progress
By Steven Pinker
556 pp. Viking. $35.

Optimism is not generally thought cool, and it is often thought foolish. The optimistic philosopher John Stuart Mill wrote in 1828, “I have observed that not the man who hopes when others despair, but the man who despairs when others hope, is admired by a large class of persons as a sage.” In the previous century, Voltaire’s “Candide” had attacked what its author called “optimism”: the Leibnizian idea that all must be for the best in this best of all possible worlds. After suffering through one disaster after another, Candide decides that optimism is merely “a mania for insisting that all is well when things are going badly.”

Yet one might argue (and Steven Pinker does) that the philosophy Voltaire satirizes here is not optimism at all. If you think this world is already as good as it gets, then you just have to accept it. A true optimist would say that, although human life will never be perfect, crucial aspects of it can improve if we work at it, for example by refining building standards and seismological predictions so that fewer people die in earthquakes. It’s not “best,” but it is surely better.

This optimist’s revenge on “Candide” is one of the passing pleasures in “Enlightenment Now,” Pinker’s follow-up to his 2011 book “The Better Angels of Our Nature.” The earlier work assembled banks of data in support of his argument that human life is becoming, not worse as many seem to feel, but globally safer, healthier, longer, less violent, more prosperous, better educated, more tolerant and more fulfilling. His new book makes the same case with updated statistics, and adds two extra elements. First, it takes into account the recent rise of authoritarian populism, especially in the form of Donald Trump — a development that has led some to feel more despairing than ever. Second, it raises the polemical level with a rousing defense of the four big ideas named in the subtitle: progress, reason, science and humanism — the last being defined not mainly in terms of non-theism (though Pinker argues for that, too), but as “the goal of maximizing human flourishing — life, health, happiness, freedom, knowledge, love, richness of experience.” Who could be against any of that? Yet humanism has been seen in some quarters as unfashionable, or unachievable, or both. Pinker wants us to take another look.

Much of the book is taken up with evidence-based philosophizing, with charts showing a worldwide increase in life expectancy, a decline in life-shattering diseases, ever better education and access to information, greater recognition of female equality and L.G.B.T. rights, and so on — even down to data showing that Americans today are 37 times less likely to be killed by lightning than in 1900, thanks to better weather forecasting, electrical engineering and safety awareness. Improvements in health have bettered the human condition enormously, and Pinker tells us that his favorite sentence in the whole English language comes from Wikipedia: “Smallpox was an infectious disease caused by either of two virus variants, Variola major and Variola minor.” The word “was” is what he likes.



He later adds that he could have ended every chapter by saying, “But all this progress is threatened if Donald Trump gets his way.” Trumpism risks knocking the world backward in almost every department of life, especially by trying to undo the international structures that have made progress possible: peace and trade agreements, health care, climate change accords and the general understanding that nuclear weapons should never be used. All this is now in question. Pinker is particularly sharp on the dangers of ignoring or overriding the systems that make nuclear war unlikely.


This book will attract some hammering itself: It contains something to upset almost everyone. When not attacking the populist right, Pinker lays into leftist intellectuals. He is especially scathing about newspaper editorialists who, in 2016, fell over themselves in their haste to proclaim the death of Enlightenment values and the advent of “post-truth.” His (rather too broadly painted) targets include humanities professors, postmodernists, the politically correct and anyone who has something nice to say about Friedrich Nietzsche. “Progressive” thinkers seem to consider progress a bad thing, he claims; they reject as crass or naïve “the notion that we should apply our collective reason to enhance flourishing and reduce suffering.”

In fact, there may already be signs of a change in mood, with chirps of optimism being heard from varied directions. The musician David Byrne has just launched a web project entitled “Reasons to Be Cheerful,” celebrating positive initiatives in the realms of culture, science, transportation, civic engagement and so on. Quartz, a business journalism site, ended 2017 with a list of 99 cheerful links to the year’s good news: snow leopards being taken off the endangered species list; a province in Pakistan planting a billion trees over the last two years as a response to the 2015 floods; a dramatic fall in sufferers from the hideous Guinea worm (from 3.5 million in 1986 to just 30 in 2017); and a slow but steady increase in women holding parliamentary seats worldwide, from 12 percent in 1997 to 23 percent now.


Bertrand Russell once pointed out that maintaining a sense of hope can be hard work. In the closing pages of his autobiography, with its account of his many activist years, he wrote: “To preserve hope in our world makes calls upon our intelligence and our energy. In those who despair it is frequently the energy that is lacking.” Steven Pinker’s book is full of vigor and vim, and it sets out to inspire a similar energy in its readers.

He cites one study of “negativity bias” that says a critic who pans a book “is perceived as more competent than a critic who praises it.” I will just have to take that risk: “Enlightenment Now” strikes me as an excellent book, lucidly written, timely, rich in data and eloquent in its championing of a rational humanism that is — it turns out — really quite cool.

On Becoming A Philosopher

March 3, 2018


by A.C. Grayling



“Socrates liked to tease his interlocutors by saying that the only thing he knew was that he knew nothing. There is a deep insight in this, for the one thing that is more dangerous than true ignorance is the illusion of knowledge and understanding. Such illusion abounds, and one of the first tasks of philosophy – as wonderfully demonstrated by Socrates in Plato’s “Meno” – is to explore our claims to know things about ourselves and the world, and to expose them if they are false or muddled.” – Philosopher and teacher A.C. Grayling

When asked my profession, I say that I teach philosophy. Sometimes, with equal accuracy, I say that I study philosophy. The form of words is carefully chosen; a certain temerity attaches to the claim to be a philosopher – “I am a philosopher” does not sound as straightforwardly descriptive as “I am a barrister/soldier/carpenter,” for it seems to claim too much. It is almost an honorific, which third parties might apply to someone only if he or she merited it. And such a one need not necessarily be – indeed, may well not be – an academic teacher of the subject.

When I reply in the way described, I see further questions kindle in the interrogator’s eye. “What do philosophers do in the mornings when they get up?” they ask themselves, privately. Everyone knows what a barrister or carpenter does. The teaching part in “teaching philosophy” is obvious enough; but the philosophy part? Do salaried philosophers arrange themselves into Rodinesque poses, and think – all day long?

But the question they actually ask is, “How did you get into that line of work?” The answer is simple. Sometimes people choose their occupations, and sometimes they are chosen by them. People used to describe the latter as having a vocation, a notion borrowed from the idea of a summons to the religious life, and applied to medicine and teaching as well as to the life of the mind. No doubt there are people who make a conscious decision to devote themselves to philosophy rather than, say, tree surgery; but usually it is not an option. Like the impulse to write, paint, or make music, it is a kind of urgency, for it feels far too significant and interesting to take second place to anything else.

The world is, however, a pragmatic place, and the dreams and desires people have – to be professional sportsmen, or prima ballerinas, or best-selling authors – tend to remain such unless the will and the opportunity are available to help onward. Vocation provides the will; in the case of philosophy, opportunity takes the form of an invitation, and a granting of license to take seriously the improbable path of writing and thinking as an entire way of life. In my case, as with many others who have followed the same path, the invitation came from Socrates.

When Socrates returned to Athens from his military service at Potidaea, one of the first things he did was to find out what had been happening in philosophy while he was away, and whether any of the current crop of Athenian youths was distinguished for beauty, wisdom, or both. So Plato tells us at the beginning of his dialogue “Charmides”, named for the handsome youth who was then the centre of fashionable attention in Athens. Always interested in boys like Charmides, Socrates engaged him in conversation to find out whether he had the special attribute which is even greater than physical beauty – namely, a noble soul.

Socrates’ conversation with Charmides was the trigger that made me a lifelong student of philosophy. I read that dialogue at the age of twelve in English translation – happily for me, it is one of Plato’s early works, all of which are simple and accessible; and it immediately prompted me to read others. There was nothing especially precocious about this, for all children begin as philosophers, endlessly voicing their wonder at the world by asking “wh–” questions – why, what, which – until the irritation of parents, and the schoolroom’s authority on the subject of Facts, put an end to their desire to ask them. I was filled with interest and curiosity, puzzlement and speculation, and wanted nothing more than to ask such questions and to seek answers to them forever. My good luck was to have Socrates show that one could do exactly that, as a thing not merely acceptable, but noble, to devote one’s life to. I was smitten by the nature and subject of the enquiries he undertook, which seemed to me the most important there could be. And I found his forensic method exhilarating – and often amusing, as when he exposes the intellectual chicanery of a pair of Sophists in the “Euthydemus,” and illustrates the right way to search for understanding. Presented with such an example, and with such fascinating and important questions, it struck me that there is no vocation to rival philosophy.

These juvenile interests were more or less successfully hidden from contemporaries in the usual way – under a mask of cricket, rugby, and kissing girls in the back row of the cinema – because being a swot was then as always a serious crime; but although all these disguises were agreeable in their own right, especially the last (the charms of Charmides notwithstanding; but they anyway expanded my view of what human flourishing includes), they could not erase what had taken hold underneath – a state of dazzlement before the power and beauty of ideas, and of being fascinated both by the past and the products of man’s imagination. It was a fever that took hold early, and never afterwards abated.

My youthful discovery of philosophy occurred in propitious circumstances, in the sense that I grew up in a remote region of the world, the parts of central and east Africa described by Laurens van der Post in his “Venture into the Interior.” This was before television services reached those high dusty savannahs and stupendous rift valleys, and therefore members of the expatriate English community there, of which my family was part, were much thrown on their own devices, with reading as the chief alternative to golf, bridge and adultery. In the pounding heat of the African tropics all life is shifted back towards dawn and on past evening, leaving the middle of the day empty. School began at seven and ended at noon. Afternoons, before the thunderstorms broke – one could set the clocks by them – were utterly silent. Almost everyone and everything fell asleep. Reading, and solitude of the kind that fills itself with contemplations and reveries, were my chief resources then, and became habitual.

With parents and siblings I lived the usual expatriate life of those distant regions before Harold Macmillan’s “winds of change.” It was a life of Edwardian-style magnificence, made easy by servants in crisp white uniforms, who stood at attention behind our wicker chairs when we took our ease on the terrace, or beside the swimming pool or tennis court, in our landscaped garden aflame with frangipani and canna lilies. Maturing reflection on this exploitative style of life, together with the realisation that Plato’s politics are extremely disagreeable (today he would be a sort of utopian Fascist, and perhaps even worse), gave my political views their permanent list to port.

My mother always yearned for London, and clucked her tongue in dismay, as she read the tissue-paper airmail edition of the Times, over the shows and concerts being missed there. I agreed with her, in prospective fashion. But a good feature of this artificial exile was the local public library. It stood on the slope of a hill, on whose summit, thrillingly for me, lay the skeletal remains of a burned-out single-seater monoplane. In the wreckage of this aircraft I flew innumerable sorties above imagined fields of Kent, winning the Battle of Britain over again. But I did this only in the intervals of reading under a sun-filled window in the empty library, eccentric (as I now see) in its stock of books, but a paradise to me. I had the good fortune to meet Homer and Dante there, Plato and Shakespeare, Fielding and Jane Austen, Ovid and Milton, Dryden and Keats; and I met Montaigne on its shelves, Addison, Rousseau, Dr. Johnson, Charles Lamb and William Hazlitt – and Hume, Mill, Marx and Russell. From that early date I learned the value of the essay, and fell in love with philosophy and history, and conceived a desire to know as much as could be known – and to understand it too. Because of the miscellaneous and catholic nature of these passions, the books in the strange little library gave me a lucky education, teaching me much that filled me then and fills me still with pleasure and delight.

One aspect of this was the invitation to inhabit, in thought, the worlds of the past, not least classical antiquity. In ancient Greece the appreciation of beauty, the respect paid to reason and the life of reason, the freedom of thought and feeling, the absence of mysticism and false sentimentality, the humanism, pluralism and sanity of outlook, which is so distinctive of the cultivated classical mind, is a model for people who see, as the Greeks did, that the aim of life is to live nobly and richly in spirit. In Plato this ideal is encapsulated as “sophrosyne,” a word for which no single English expression gives an adequate rendering, although standardly translated as “temperance,” “self-restraint” or “wisdom.” In his most famous and widely-read dialogue, the “Republic,” Plato defines it as “the agreement of the passions that Reason should rule.” If to this were added the thought – reflecting the better part of modern sensitivity – that the passions are nevertheless important, something like an ideal conception of human flourishing results.


Plato and Aristotle

When not in Athens I was in ancient Rome. For the Romans in their republican period something more Spartan than Athenian was admired, its virtues (“vir” is Latin for “man”) being the supposedly manly ones of courage, endurance and loyalty. There is a contrast here between civic and warrior values, but it is obvious enough that whereas one would wish the former to prevail, there are times when the latter are required, both for a society and for its individual members. For a society such values are important in times of danger, such as wartime; and for individuals they are important at moments of crisis, such as grief and pain. The models offered by Rome were Horatius – who defended the bridge against Tarquin the Proud and Lars Porsena – and Mucius Scaevola, who plunged his hand into the flames to show that he would never betray Rome. Unsurprisingly, the dominating ethical outlook of educated Romans was Stoicism, the philosophy which taught fortitude, self-command, and courageous acceptance of whatever lies beyond one’s control. The expressions “stoical” and “philosophical,” to mean “accepting” or “resigned,” derive from this tradition.

One Saturday afternoon when I was fourteen I bought – for sixpence, at a fete run by the Nyasaland Rotary Club – a battered copy of G. H. Lewes’s “Biographical History of Philosophy”, which begins (as does the official history of philosophy) with Thales, and ends with Auguste Comte, who was Lewes’s contemporary. Lewes was George Eliot’s consort, a gifted intellectual journalist, whose biography of Goethe is still the best available, and whose history of philosophy is lucid, accurate and absorbing. I could not put it down on first reading, and in all must have read it a dozen times before I had my fill. It superinduced order on the random reading that had preceded it, and settled my vocation.

When I returned to England as a teenager it was to a place intensely familiar and luminous because whenever in my reading I was not either in the ancient world or somewhere else in history, I was there – and especially in London. Everywhere one goes in London, even on ordinary daily business, one encounters its past and its literature – retracing Henry James’s first journeys through the crowded streets of what was in his day the largest and most astonishing city in the world, seeing Dickens’s Thames slide between its oily banks, and Thackeray’s Becky tripping down Park Lane smiling to herself. In this spirit my imagination heard the roar from Bankside, where pennants fluttered above the Bear-garden and the theatres, and saw crowds milling under the jewelled lanterns of Vauxhall Gardens, where fashion and impropriety mingled. Deptford on the map seemed to me a horrifying name, because Marlowe was stabbed there. On the steps of St Paul’s I thought of Leigh Hunt’s description of the old cathedral, before the fire, when it was an open highway through which people rode their horses, in whose aisles and side-chapels prostitutes solicited and merchants met to broker stocks, and where friends called to one another above the sound of matins being said or vespers sung. London is richly overlaid by all that has happened in it and been written about it. There is a character in Proust who is made to play in the Champs Elysees as a boy, and hated it; he later wished he had been able to read about it first, so that he could relish its ghosts and meanings. Luckily for me I came prepared just so for London.

It seemed entirely appropriate to me later, as an undergraduate visiting London at every opportunity, to spend afternoons in the National Gallery and evenings in the theatre (every night if it could be afforded – and even when not) because that is what my companions – my friends on the printed page under the sunlit window in Africa, such as Hazlitt, Pater, and Wilde – intimated was the natural way of relishing life.

But it was not just the relish that mattered, for everything offered by art, theatre and books seemed to me rich grist for the philosophical mill, prompting questions, suggesting answers for debate and evaluation, throwing light on unexpected angles and surprising corners of the perennial problems of life and mind. An education as a philosopher involves studying the writings of the great dead, which enables one to advance to engagement with the technical and often abstruse debates of contemporary philosophy. But philosophical education requires more than this too, for in order to do justice to the question of how these debates relate to the world of lived experience – of how gnosis connects with praxis – a wide interest in history, culture and science becomes essential. The reason is well put by Miguel de Unamuno. “If a philosopher is not a man,” he wrote, “he is anything but a philosopher; he is above all a pedant, and a pedant is a caricature of a man.”



At Oxford I had the good fortune to be taught by A. J. Ayer, a gifted and lively teacher, and P. F. Strawson, one of the century’s leading philosophical minds. There were other accomplished philosophers there whose lectures and classes I attended, but I benefited most from personal intercourse with these two. And when in my own turn I became a lecturer in philosophy, first at St Anne’s College, Oxford and then at Birkbeck College, London, I appreciated the force of the saying “docendo disco” – by teaching I learn – for the task of helping others grasp the point in philosophical debates has the salutary consequence of clarifying them for oneself.

Socrates liked to tease his interlocutors by saying that the only thing he knew was that he knew nothing. There is a deep insight in this, for the one thing that is more dangerous than true ignorance is the illusion of knowledge and understanding. Such illusion abounds, and one of the first tasks of philosophy – as wonderfully demonstrated by Socrates in Plato’s “Meno” – is to explore our claims to know things about ourselves and the world, and to expose them if they are false or muddled. It does so by beginning with the questions we ask, to ensure that we understand what we are asking; and even when answers remain elusive, we at least grasp what it is that we do not know. This in itself is a huge gain. One of the most valuable things philosophy has given me is an appreciation of this fact.

BOOK REVIEW: The Science of Values: The Moral Landscape by Sam Harris

February 4, 2018


Note: My friend and former diplomat, Dato Hamzah Majeed, introduced me to the refreshing writings of Sam Harris, and I am hooked. Reading Harris has led me to other writers such as Carl Sagan, Stephen Hawking, Christopher Hitchens, Richard Dawkins, Bernard Lewis, Neil deGrasse Tyson, Tenzin Gyatso (His Holiness the Dalai Lama), Ayaan Hirsi Ali and others. At present I am reading Sam’s The End of Faith: Religion, Terror and the Future of Reason.–Din Merican

Reviewed by James W. Diller and Andrew E. Nuzzolilli



In The Moral Landscape, Sam Harris (2010) proposes that science can be used to identify values, which he defines as “facts that can be scientifically understood: regarding positive and negative social emotions, retributive impulses, the effects of specific laws and social institutions on human relationships, the neurophysiology of happiness and suffering, etc.” (pp. 1–2). Harris argues that scientific principles are appropriately applied in this domain because “human well-being entirely depends on events in the world and on states of the human brain. Consequently, there must be scientific truths known about it” (p. 3). Although readers of this journal would have few problems with the assertion that behavior (here, reports of well-being and correlated responses) changes as a function of environmental events, the role of the neurophysiological correlates of these responses has been a point of debate within the conceptual literature of behavior analysis (e.g., Elcoro, 2008; Reese, 1996; Schaal, 2003).


The Moral Landscape represents an important contribution to a scientific discussion of morality. It explicates the determinants of moral behavior for a popular audience, placing causality in the external environment and in the organism’s correlated neurological states. The contemporary science of behavior analysis has contributed and will continue to contribute to this discussion, beginning with Skinner’s seminal works Beyond Freedom and Dignity (1971) and Walden Two (1976). Neither book is explicitly a treatise on morality, but both are attempts to introduce behavioral science to a broader audience. The behavior-analytic approach (which is largely compatible with Harris’s efforts in The Moral Landscape) supports the superiority of a scientific approach to life, including questions of morality. Skinner (1976), for example, highlighted the importance of the experimenting culture to identify practices that were effective (cf. Baum, 2005). Tacit within behavior analysis is the expectation that a scientific worldview can and will improve the quality of life. Consistent with this view, Harris suggests that the currently accepted determinants of morality (e.g., religion, faith) are not what society ought to espouse. Instead, he proposes that scientific inquiry into morality as its own subject would enhance global levels of well-being. From a behavioral perspective, the study of morality is necessarily the study of behavior, including the contexts in which it occurs and the environmental events of which it is a function. Analysis in this framework may allow the successful identification of the variables that control moral behavior and, ultimately, the development of cultural practices to increase its occurrence.


The Moral Landscape is a recent contribution to a collection of books (e.g., Dawkins, 2006; Harris, 2005; Hitchens, 2007; Sagan, 2006) that subject the claims of religion to the same standard of empirical rigor that other epistemologies (e.g., science) must abide by. Dawkins (2006), for example, criticizes the appeal to supernatural gods as explanatory agents and takes issue with the privileged place of religion within societal discourse. Harris echoes and expands on these concerns in The Moral Landscape.

Collectively, these authors take issue with the notion of nonoverlapping magisteria (NOMA; Gould, 1999), the assertion that science and religion are both valid systems of knowledge and that neither discipline can inform the other. Behavior analysts dispute the notion that scientific behavior and religious behavior stand on equal epistemic footing (see Galuska, 2003, for suggestions about successful navigation of NOMA by behavior analysts). Skinner (1987) commented, “Science, not religion, has taught me my most useful values, among them intellectual honesty. It is better to go without answers than to accept those that merely resolve puzzlement” (p. 12). Although religion may be effective at inducing behavioral change among its followers, it continues to have unintended effects that, to borrow Harris’s analogy, reach the depths of the moral landscape. Hitchens (2007) makes the subtitular claim that “religion poisons everything,” supporting his thesis with examples of demonstrably negative outcomes of religious practice: religion, he argues, leads to poorer states of human health and impedes social progress. As an alternative, he proposes a rational, scientific view of the world, which Harris applies to the study of morality.

Because they are members of a relatively small discipline, it may be beneficial for behavior analysts to align themselves with and support the authors of these works, garnering attention from the controversial popular-media coverage that writers such as Dawkins and Harris regularly elicit. Perhaps controversial exposure is better than no exposure at all, especially when behavior analysis can enable the development of the hypothetical secular society that Harris, Dawkins, Hitchens, and Sagan call for. Indeed, behavior analysis may be the only discipline that can identify and establish reinforcers to motivate prosocial, so-called moral, human behavior in the absence of organized religion.

It is noteworthy that no psychologist has tackled the problem of secular values alongside these authors, in spite of the claims about human nature that religion presents, claims that contradict and diminish the value of our discipline. Indeed, much of the rich “prescientific” vocabulary that inhibits psychology from becoming a natural science is either religious or metaphysical in nature (Schlinger, 2004). It is imperative for the validation of the field of psychology, and of behavior analysis by association, to be part of this modern empiricist movement championed by Harris.

Harris’s argument unfolds in an introduction and five subsequent chapters. In the introduction, he defines his title concept of the moral landscape as a hypothetical space representing human well-being, encompassing all human experiences. This space contains the well-being of members of all cultures and groups of individuals on the planet. The peaks of this landscape are the heights of prosperity, and the valleys represent the depths of human suffering. The goal of plotting the cartography of this landscape is to maximize “the well-being of conscious creatures” (i.e., humans) which “must translate at some point into facts about brains and their interactions with the world at large” (p. 11). For Harris, the brain is the locus of interest. We believe that it is possible to recast the argument into one about whole organisms—with correlated neurological states, perhaps—interacting with their environment to determine behavior. This scientific approach to human behavior, with a goal of improving the welfare of living organisms, is consistent with the application of behavior analysis to bring about societal change (e.g., Baer, Wolf, & Risley, 1968; Skinner, 1971, 1978).

In the subsequent chapters of his book, Harris makes the case for applying scientific thinking to determine human values. Chapter 1 outlines the knowable nature of moral truths, suggesting that they are subject to scientific (rather than religious) inquiry. In Chapter 2, Harris tackles the topics of good and evil, suggesting that these terms may be outmoded; instead, the goal of both religion and science should be to determine ways to maximize human well-being. In the third chapter, Harris explores the neurological correlates of belief, tracing the complex sets of behavior back to brain activity. In the fourth chapter, he examines the role of religious faith in contemporary society, suggesting that a scientific approach may lead to an increase in overall well-being. The final chapter outlines a plan for future work, disentangling science and philosophy, and offering an optimistic picture about the use of science to improve the human condition. In sum, Harris presents a cogent argument for the application of scientific principles to identify moral principles and values. In what follows, we describe his arguments and some intersections with the behavioral approach to this topic.

Defining Morality

The crux of Harris’s argument is that the well-being of conscious creatures should be the paramount consideration when determining whether an action is morally correct or incorrect. Harris uses the term conscious creature extensively in formulating his science of morality. Although he does not provide an explicit definition of consciousness, his use seems to be at odds with the behavioral approach to this construct. For Harris, consciousness seems to be a property of the brain, discoverable by explorations in neuroscience. In contrast, Skinner (1945) suggested that, when defining psychological terms, it is useful to identify the conditions under which those terms are used, and the history of the verbal community that produces that usage. Consistent with this analysis, Schlinger (2008) proposed that consciousness is best understood with a focus on the behaviors that are associated with the use of the word consciousness (e.g., self-talk, private behavior), rather than the study of the reified thing itself.

Consciousness, defined as a set of verbal behavior, is a prerequisite for a discussion of morality. Verbal behavior is required for us to evaluate our own subjective well-being in relation to the well-being of others, allowing us to identify the relative “goodness” or “badness” of each; such an analysis is necessarily dependent on verbal behavior. Indeed, in other media (cf. The Richard Dawkins Foundation, 2011), Harris has suggested that a universe of rocks could not define a science of morality, because consciousness (i.e., a verbal repertoire about one’s own behavior) is required to discuss subjective experience.

When providing a definition for the well-being that should be promoted, Harris likens this concept to physical health, noting,

Indeed the difference between a healthy person and a dead one is about as clear and consequential a distinction as we ever make in science. The difference between the heights of human fulfillment and the depths of human misery are no less clear even if new frontiers await us in both directions. (p. 12)

With this definition, it is possible to cast a wide net and capture a multitude of human behaviors and conditions. Harris suggests that, much like physical health, well-being eludes concise definition. Although the use of well-being is not precisely operationalized, he does define morality as “the principles of behavior that allow people to flourish” (p. 19). This phrasing is likely the closest to an operational definition of morality that is possible without undertaking the scientific analysis that Harris proposes in which fundamental principles to increase moral behavior could be discovered. The use of flourishing human life as a criterion for morality may be consistent with Skinner’s (1945) approach to evaluating terms as a function of the conditions in which they occur: Morality, for Harris, may be evident only when well-being is enhanced. The next step of the analysis would be to systematically identify the conditions that give rise to that flourishing human life, exploring the antecedents (e.g., having basic needs met, education, leisure time) and the consequences thereof. Behavioral technologies such as functional analysis (e.g., Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994) may provide the tools required to successfully carry out this work.

A major premise of Harris’s work is that there is variability in the degree of “goodness” that individuals experience in life, and this variability can be accounted for by brain states and events in the external environment. If one accepts the distinction between “the good life” and “the bad life” and the idea that there are lawful patterns and factors that contribute to each of these outcomes (i.e., a deterministic framework), it allows the development of a scientific view of morality. This scientific view, according to Harris, stands as an alternative to traditional religious perspectives. Harris writes, “There is simply no question that how we speak about human values—and how we study or fail to study the relevant phenomena at the level of the brain—will profoundly influence our collective future” (p. 25). This theme can be found in the writings of Skinner (1971), who suggested that the scientific approach to the world’s practical problems can allow the development of solutions to those problems. Although Harris’s argument is framed in the language of neuroscience instead of Skinner’s behavioral perspective, a similarly pragmatic approach shows through.

The introduction of Harris’s book is wholly devoted to the qualification of values as scientific facts: verifiable statements about organisms and the environment around them. This argument, that utterances reflect or are symbolic of environmental events, should be familiar to readers acquainted with Skinner’s conceptualization of a verbal community. If we accept that utterances about “moral behavior,” “morality,” or “ethics” are not importantly different from other verbal behavior, then they too can become a topic for scientific inquiry. Such an analysis could evaluate the conditions under which this verbal behavior is emitted and the consequences thereof. With this understanding of the contingencies of reinforcement that promote and maintain these responses, it would be possible to shape the moral behavior of individuals or groups.

Harris posits that “science can, in principle, help us understand what we should do and should want—and, therefore, what other people should do and should want in order to live the best lives possible” (p. 28). This is congruent with Skinner’s acceptance of the value judgment (i.e., “is” or “ought” statements) as a tool to reveal the oftentimes subtle contingencies that control social behavior. Harris describes well-being as the conceptual basis for morality and values, stating, “there must be a science of morality … because the well-being of conscious creatures depends upon how the universe is, altogether” (p. 28). Bringing morality into the natural world makes it amenable to scientific study, and Harris’s book complements the work that behavior analysts have done with respect to questions of morality.

The behavior-analytic approach to values and morals has its origin with Skinner, who suggested that things that individuals call good are reinforcing, and that “any list of values is a list of reinforcers” (1953, p. 35). When describing Skinner’s approach, Ruiz and Roche (2007) commented that “it is important to provide translations of value statements in functional terms in order to reveal the relevant contingencies of reinforcement” (p. 4). Thus, as with the discussion of consciousness above, the conditions under which particular behaviors are morally correct or incorrect must be considered. This functional approach may expand on Harris’s proposed science of values and make it more acceptable to a behavioral audience.

Distinguishing Between Philosophical Positions on Morality

A large portion of Harris’s book differentiates between the religious notions of values and morality and the scientific principles thereof. Harris suggests that religious concerns about morality are related to human well-being. In Chapter 1, he describes an agenda of finding scientific truth about questions of morality. To deal with the relative unpopularity of his approach (Harris reports that more people in contemporary American society believe that morality should stem from religious rather than scientific inquiry), he asserts that consensus and truth are not the same thing: “One person can be right, and everyone else can be wrong. Consensus is a guide to discovering what is going on in the world, but that is all that it is. Its presence or absence in no way constrains what may or may not be true” (p. 31). Harris reports that 57% of Americans believe that preventing homosexual marriage is a moral imperative (p. 53), a clear example of a common belief that impairs the progression of well-being.

With respect to differences between perspectives of different groups, Harris writes, “those who do not share our scientific goals have no influence on scientific discourse whatsoever; but, for some reason, people who do not share our moral goals render us incapable of even speaking about moral truth” (p. 34). Here, Harris is suggesting that religious beliefs, which may be incorrect according to other epistemological systems (e.g., science), prevent other systems from declaring them to be incorrect. However, religious belief systems do comment on the “truth” of empirical inquiries, a double standard with which Harris takes issue, and a concern that is expressed by other authors such as Dawkins (2006). The ability of science to comment on affairs related to religion and morality has the potential for the further advancement of human well-being via the development of new ideas and technologies. Without a scientific response to these issues, progress seems less likely.

To determine the merits of given philosophical systems, one can adopt a relativistic position. Relativism is the belief that points of view have no absolute truth. This tradition is largely a by-product of scientific skepticism, and can be just as harmful to a science of morality as any religious doctrine. By Harris’s account, moral relativism is endemic throughout the scientific community. This is problematic for the development of the theoretical moral landscape because historically science has “had no opinion” on moral issues, which Harris ascribes to a fear of retribution by religious groups, political agendas, or intellectual laziness; he objects to the continuance of this harmful tradition. Harris suggests that relativism is accepted as an absolute position and is not subject to a contextual analysis (that relativism itself should require). He points out that this absolute acceptance of a relativistic worldview is fundamentally contradictory to the principle of relativism itself. If we are to believe that the practices in question (examples that Harris highlights include female genital mutilation and subjugation of women) are correct in the relevant cultural and historical time period, this belief must also be cast as relative and changeable, which it generally is not. In addition, Harris suggests that relativistic positions may lead to misguided beliefs about how to improve human well-being.

Perhaps at odds with Harris’s analysis, Skinner suggested that there are multiple sets of values that may emerge across cultural settings: “Each culture has its own set of goods, and what is good in one culture may not be good in another” (1971, p. 122). The reinforcers (i.e., values) identified across cultures necessarily vary as a function of the different physical and cultural environments in which the moral systems develop. For Skinner, the criterion by which to evaluate the goodness of a cultural practice is the degree to which it promotes survival of the society. Thus, although there are potentially many different ways for a culture to survive, there may be some that maximize the level of well-being of the individuals and the group. Skinner’s position is pragmatic, but has garnered criticism from within the behavior-analytic community (e.g., Ruiz & Roche, 2007). Critiques of the cultural survivability criterion emphasize the impossibility of determining which cultural practices will, in fact, enhance survivability without definite knowledge of the future. Ruiz and Roche (2007) called “for behavior analysts to consider seriously where we as a community stand on relativism and to discuss openly and thoroughly the criteria we will use in adopting ethical principles” (p. 11). Harris’s position of rejecting moral relativism in favor of universal principles to promote well-being may help to inform the behavior-analytic discourse.


After establishing that our beliefs can, indeed, be incorrect or somehow inconsistent with reality, Harris qualifies his argument. Citing research conducted in his own laboratory on the neuroscience of belief, he posits that there is no difference between what we deem to be “knowledge,” “belief,” and “truth,” and these utterances can be attributed to functionally equivalent neurological correlates. Indeed, the brain’s endogenous reward systems reinforce beliefs and utterances that we deem “true” with positive emotional valence. He writes,

When we believe a proposition to be true, it is as though we have taken it in hand as part of our extended self, we are saying in effect, “This is mine. I can use this. This fits my view of the world.” (p. 121)

This evidence from neuroscience supports the notion that values, knowledge, belief, and truth belong to the same class of verbal behavior, but may not necessarily share discriminative stimuli (Skinner, 1945). Taking this research to its logical conclusion, one can suggest that an individual’s learning history would dictate which beliefs, truths, or bits of knowledge could fit into a person’s worldview. By Harris’s account, we dislike information that contradicts our worldviews as much as we dislike being lied to. With this bias established, it is easier to see precisely how maladaptive or harmful beliefs can be propagated.

Organism-Environment Interactions

In Chapter 2, Harris suggests that an understanding of the human brain and its states will allow an understanding of forces that improve society (e.g., prosocial behavior). He writes,

As we better understand the brain, we will increasingly understand all of the forces … that allow friends and strangers to collaborate successfully on the common projects of civilization. Understanding ourselves in this way and using the knowledge to improve human life, will be among the most important challenges to science in the decades to come. (pp. 55–56)

Cooperation is one of the mechanisms through which values may come about, and Harris contends that “there may be nothing more important than human cooperation” (p. 55). Conceptualizing the failures of cooperation as the everyday grievances of theft, deception, and violence, it is plain to see how failing to cooperate can be an impediment to human well-being and moral development.

Harris emphasizes the role of consequences in the formation of values, suggesting that

all questions of value depend upon the possibility of experiencing such value. Without potential consequences at the level of experience—happiness, suffering, joy, despair etc.—all talk of value is empty … even within religion, therefore, consequences and conscious states remain the foundation of all values. (p. 62)

In this quote, Harris suggests the power of consequences to effect change in behavior. In so doing, he takes morality out of the context of neurological events and places it into an environmental framework. Although he does acknowledge the behavior–environment interaction as a cause of moral responding, a behavior-analytic approach would go further, emphasizing the power of consequences to increase or decrease (i.e., reinforce or punish) the likelihood that moral behavior will occur. It is the consequences of behavior that make it more or less likely to occur in a selectionist framework (cf. Glenn & Madden, 1995), and those same consequences seem to work similarly at the neurological level (e.g., Stein, Xue, & Belluzzi, 1994). Thus, it is the interaction between the environment and the organism that leads to the development of any behavior, including moral responses and those associated with varying degrees of well-being.

Taking this environment-based approach, Harris presents contemporary research from neuroscience throughout his book. After describing the neurological precursors and correlates of behavior, he dismisses the notion of free will, citing additional biological data to suggest that it is the brain—and not an agent of free will—that is responsible for behavior. He makes a familiar argument for a deterministic framework, in which the historical and contemporary environments (including neurological states) are responsible for behavior. After dismantling free will, Harris describes ramifications for the justice system. With this knowledge, we can no longer hold people accountable for their actions because they are determined by historical and contemporary events. This view negates a justice system based on punishment or retribution. Consistent with a behavioral position (e.g., Chiesa, 2003), Harris suggests that, with increased knowledge about the brain (for which we may be able to substitute behavior without losing any meaning, because the brain necessarily belongs to a complete organism), reforms of the justice system may be necessary. A reformed justice system would be more compassionate based on its more accurate understanding of causes of behavior (i.e., the environment and biological states). Harris takes this position to an extreme, proposing that it may even be immoral to fail to consider environmental and biological factors within the context of the justice system. Here, there is a fundamental compatibility between the approach that Harris is advocating and a behavioral worldview.

Indeed, an understanding of the proximate and ultimate causes that precede any event is essential to making logically coherent arguments, not just from the perspective of the justice system but in understanding the behavior of all organisms. The ultimate cause of many reprehensible human behaviors lies in the distant evolutionary past. Proximate causes can be shaped over the course of single lifetimes and may covary with environmental stimuli (Mayr, 1961; Skinner, 1981). For the purposes of Harris’s argument, we will agree that the nervous system and the brain are proximate causes of behavior, but these were influenced by both the evolutionary history of the species and the learning environment of the individual (cf. Schlinger & Poling, 1998, pp. 39–41). A more thorough discussion of ultimate cause may be a better locus from which to develop the science of moral behavior that he calls for.

A Program for Changing Moral Behavior

If Harris’s claims that morality is knowable through scientific processes are true (and, based on his arguments in the book, we, at least, are convinced), behavior analysis ought to be at the forefront of the emerging science of morality. As a discipline, behavior analysis is uniquely positioned to deal with matters that span the continuum of well-being and suffering (i.e., the peaks and valleys of Harris’s moral landscape). Behavior analysis has a history of developing and using demonstrably effective behavior-change procedures. Because Harris’s neurological correlates of well-being are isomorphic with human behavior, the methods of experimental and applied behavior analysis could be used to support the type of work that Harris proposes.

In the first chapter of the book, Harris outlines three primary directions that work in the science of morality can take: (a) Explain why people engage in particular behavior “in the name of morality”; (b) “determine which patterns of thought and behavior we should follow in the name of ‘morality’”; and (c) convince people “who are committed to silly and harmful patterns of thought and behavior in the name of ‘morality’ to break these commitments and to live better lives” (p. 49). Behavior analysts have the conceptual framework and behavior-change techniques to potentially make meaningful contributions to each of these goals.


In The Moral Landscape, Harris begins to develop a science of morality that he believes could be used to maximize the well-being of humans. Although his approach to morality is largely grounded in neuroscience (rather than the study of the whole organism), he does present an environment-based approach to morality to a wide readership, continuing in the tradition of other recent works that have espoused secular worldviews (e.g., Dawkins, 2006; Hitchens, 2007). Harris’s approach to the development of the science of morality is largely consistent with the behavioral approach; for him, as for behavior analysts, morality is behavior, and that behavior is subject to environmental (and biological) manipulation.

As part of his work, Harris describes the need for methods of producing change in moral behavior; the discipline of behavior analysis is ideally suited to contribute to this mission. The application of behavioral techniques to socially significant problems has been a hallmark of behavior analysis ever since Baer et al. (1968) laid the foundations of applied behavior analysis. Behavior change has been demonstrated from the level of the individual to the level of society, and these same principles could be applied to moral behavior, as described by Harris, to promote universal well-being. No other discipline matches behavior analysis in its scientific understanding of behavior or its tools to modify it. If Harris is correct that science should take an active role in determining human values, behavior analysts must be a part of that conversation.


We thank Mirari Elcoro for her thoughtful comments on a previous version of this manuscript.


  • Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1, 91–97.
  • Baum, W. M. (2005). Understanding behaviorism: Behavior, culture, and evolution (2nd ed.). Malden, MA: Blackwell.
  • Chiesa, M. (2003). Implications of determinism: Personal responsibility and the value of science. In K. A. Lattal & P. N. Chase (Eds.), Behavior theory and philosophy (pp. 243–258). New York, NY: Kluwer Academic/Plenum.
  • Dawkins, R. (2006). The God delusion. New York, NY: Houghton Mifflin.
  • Elcoro, M. (2008). Including physiological data in a science of behavior: A critical analysis. Brazilian Journal of Behavioral and Cognitive Therapy, 10, 253–261.
  • Galuska, C. M. (2003). Advancing behaviorism in a Judeo-Christian culture. In K. A. Lattal & P. N. Chase (Eds.), Behavior theory and philosophy (pp. 259–274). New York, NY: Kluwer Academic/Plenum.
  • Glenn, S. S., & Madden, G. J. (1995). Units of interaction, evolution, and replication: Organic and behavioral parallels. The Behavior Analyst, 18, 237–251.
  • Gould, S. J. (1999). Rocks of ages: Science and religion in the fullness of life. New York, NY: Ballantine.
  • Harris, S. (2005). The end of faith: Religion, terror, and the future of reason. New York, NY: Norton.
  • Harris, S. (2010). The moral landscape: How science can determine human values. New York, NY: Free Press.
  • Hitchens, C. (2007). God is not great: How religion poisons everything. New York, NY: Twelve Books, Hachette Book Group.
  • Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1994). Toward a functional analysis of self-injury. Journal of Applied Behavior Analysis, 27, 197–209. (Reprinted from Analysis and Intervention in Developmental Disabilities, 2, 3–20, 1982)
  • Mayr, E. (1961). Cause and effect in biology. Science, 134, 1501–1506.
  • Reese, H. W. (1996). How is physiology relevant to behavior analysis? The Behavior Analyst, 19, 61–70.
  • The Richard Dawkins Foundation. (2011). Who says science has nothing to say about religion? [DVD].
  • Ruiz, M. R., & Roche, B. (2007). Values and the scientific culture of behavior analysis. The Behavior Analyst, 30, 1–16.
  • Sagan, C. (2006). The varieties of scientific experience: A personal view of the search for God. New York, NY: Penguin.
  • Schaal, D. W. (2003). Explanatory reductionism in behavior analysis. In K. A. Lattal & P. N. Chase (Eds.), Behavior theory and philosophy (pp. 83–102). New York, NY: Kluwer Academic/Plenum.
  • Schlinger, H. D. (2004). Why psychology hasn’t kept its promises. The Journal of Mind and Behavior, 25(2), 123–144.
  • Schlinger, H. D. (2008). Consciousness is nothing but a word. Skeptic, 13, 58–63.
  • Schlinger, H. D., & Poling, A. (1998). Introduction to scientific psychology. New York, NY: Plenum.
  • Skinner, B. F. (1945). The operational analysis of psychological terms. Psychological Review, 52, 268–277.
  • Skinner, B. F. (1953). Science and human behavior. New York, NY: Macmillan.
  • Skinner, B. F. (1971). Beyond freedom and dignity. New York, NY: Knopf.
  • Skinner, B. F. (1976). Walden Two. New York, NY: Macmillan.
  • Skinner, B. F. (1978). Reflections on behaviorism and society. New York, NY: Prentice Hall.
  • Skinner, B. F. (1981). Selection by consequences. Science, 213, 501–504.
  • Skinner, B. F. (1987). What religion means to me. Free Inquiry, 7, 12–13.
  • Stein, L., Xue, B. G., & Belluzzi, J. D. (1994). In vitro reinforcement of hippocampal bursting: A search for Skinner’s atoms of behavior. Journal of the Experimental Analysis of Behavior, 61, 155–168.

The United States of America Is Decadent and Depraved

December 23, 2017

The problem isn’t Donald Trump – it’s the Donald Trump in all of us.

In The History of the Decline and Fall of the Roman Empire, Edward Gibbon luridly evokes the Rome of 408 A.D., when the armies of the Goths prepared to descend upon the city. The marks of imperial decadence appeared not only in grotesque displays of public opulence and waste, but also in the collapse of faith in reason and science. The people of Rome, Gibbon writes, fell prey to “a puerile superstition” promoted by astrologers and to soothsayers who claimed “to read in the entrails of victims the signs of future greatness and prosperity.”

Would a latter-day Gibbon describe today’s America as “decadent”? I recently heard a prominent, and pro-American, French thinker (who was speaking off the record) say just that. He was moved to use the word after watching endless news accounts of U.S. President Donald Trump’s tweets alternate with endless revelations of sexual harassment. I flinched, perhaps because a Frenchman accusing Americans of decadence seems contrary to the order of nature. And the reaction to Harvey Weinstein et al. is scarcely a sign of hysterical puritanism, as I suppose he was implying.

And yet, the shoe fit. The sensation of creeping rot evoked by that word seems terribly apt.

Perhaps in a democracy the distinctive feature of decadence is not debauchery but terminal self-absorption — the loss of the capacity for collective action, the belief in common purpose, even the acceptance of a common form of reasoning. We listen to necromancers who prophesy great things while they lead us into disaster. We sneer at the idea of a “public” and hold our fellow citizens in contempt. We think anyone who doesn’t pursue self-interest is a fool.

We cannot blame everything on Donald Trump, much though we might want to. In the decadent stage of the Roman Empire, or of Louis XVI’s France, or the dying days of the Habsburg Empire so brilliantly captured in Robert Musil’s The Man Without Qualities, decadence seeped downward from the rulers to the ruled. But in a democracy, the process operates reciprocally. A decadent elite licenses degraded behavior, and a debased public chooses its worst leaders. Then our Nero panders to our worst attributes — and we reward him for doing so.

“Decadence,” in short, describes a cultural, moral, and spiritual disorder — the Donald Trump in us. It is the right, of course, that first introduced the language of civilizational decay to American political discourse. A quarter of a century ago, Patrick Buchanan bellowed at the Republican National Convention that the two parties were fighting “a religious war … for the soul of America.” Former Speaker Newt Gingrich (R-Ga.) accused the Democrats of practicing “multicultural nihilistic hedonism,” of despising the values of ordinary Americans, of corruption, and of illegitimacy. That all-accusing voice became the voice of the Republican Party. Today it is not the nihilistic hedonism of imperial Rome that threatens American civilization but the furies unleashed by Gingrich and his kin.

The 2016 Republican primary was a bidding war in which the relatively calm voices — Jeb Bush and Marco Rubio — dropped out in the early rounds, while the consummately nasty Ted Cruz duked it out with the consummately cynical Donald Trump. A year’s worth of Trump’s cynicism, selfishness, and rage has only stoked the appetite of his supporters. The nation dodged a bullet last week when a colossal effort pushed Democratic nominee Doug Jones over the top in Alabama’s Senate special election. Nevertheless, the church-going folk of Alabama were perfectly prepared to choose a racist and a pedophile over a Democrat. Republican nominee Roy Moore almost became a senator by orchestrating a hatred of the other that was practically dehumanizing.

American voters disagreed with past Presidents. 2017 will soon be behind us. So the question is: Will President Donald Trump be more presidential, and can he behave like a global leader? He cannot be exclusively America First. –Din Merican


Trump functions as the impudent id of this culture of mass contempt.

Of course he has legitimized the language of xenophobia and racial hatred, but he has also legitimized the language of selfishness. During the campaign, Trump barely even made the effort that Mitt Romney did in 2012 to explain his money-making career in terms of public good. He boasted about the gimmicks he had deployed to avoid paying taxes. Yes, he had piled up debt and walked away from the wreckage he had made in Atlantic City. But it was a great deal for him! At the Democratic convention, then-Vice President Joe Biden recalled that the most terrifying words he heard growing up were, “You’re fired.” Biden may have thought he had struck a crushing blow. Then Americans elected the man who had uttered those words with demonic glee. Voters saw cruelty and naked self-aggrandizement as signs of steely determination.

Perhaps we can measure democratic decadence by the diminishing relevance of the word “we.” It is, after all, a premise of democratic politics that, while majorities choose, they do so in the name of collective good. Half a century ago, at the height of the civil rights era and Lyndon B. Johnson’s Great Society, democratic majorities even agreed to spend large sums not on themselves but on excluded minorities. The commitment sounds almost chivalric today. Do any of our leaders have the temerity even to suggest that a tax policy that might hurt one class — at least, one politically potent class — nevertheless benefits the nation?

There is, in fact, no purer example of the politics of decadence than the tax legislation that the President will soon sign. Of course the law favors the rich; Republican supply-side doctrine argues that tax cuts to the investor class promote economic growth. What distinguishes the current round of cuts from those of either Ronald Reagan or George W. Bush is, first, the way in which they blatantly benefit the president himself through the abolition of the alternative minimum tax and the special treatment of real estate income under new “pass-through” rules. We Americans are so numb by now that we hardly even take note of the mockery this implies of the public servant’s dedication to public good.

Second, and no less extraordinary, is the way the tax cuts have been targeted to help Republican voters and hurt Democrats, above all through the abolition or sharp reduction of the deductibility of state and local taxes. I certainly didn’t vote for Ronald Reagan, but I cannot imagine him using tax policy to reward supporters and punish opponents. He would have thought that grossly unpatriotic. The new tax cuts constitute the economic equivalent of gerrymandering. All parties play that game, it’s true; yet today’s Republicans have carried electoral gerrymandering to such an extreme as to jeopardize the constitutionally protected principle of “one man, one vote.” Inside much of the party, no stigma attaches to the conscious disenfranchisement of Democratic voters. Democrats are not “us.”

Finally, the tax cut is an exercise in willful blindness. The same no doubt could be said for the 1981 Reagan tax cuts, which predictably led to unprecedented deficits when Republicans as well as Democrats balked at making offsetting budget cuts. Yet at the time a whole band of officials in the White House and the Congress clamored, in some cases desperately, for such reductions. They accepted a realm of objective reality that existed separately from their own wishes. But in 2017, when the Congressional Budget Office and other neutral arbiters concluded that the tax cuts would not begin to pay for themselves, the White House and congressional leaders simply dismissed the forecasts as too gloomy.

Here is something genuinely new about our era: We lack not only a sense of shared citizenry or collective good, but even a shared body of fact or a collective mode of reasoning toward the truth. A thing that we wish to be true is true; if we wish it not to be true, it isn’t. Global warming is a hoax. Barack Obama was born in Africa. Neutral predictions of the effects of tax cuts on the budget must be wrong, because the effects they foresee are bad ones.

It is, of course, our president who finds in smoking entrails the proof of future greatness and prosperity. The reduction of all disagreeable facts and narratives to “fake news” will stand as one of Donald Trump’s most lasting contributions to American culture, far outliving his own tenure. He has, in effect, pressed gerrymandering into the cognitive realm. Your story fights my story; if I can enlist more people on the side of my story, I own the truth. And yet Trump is as much symptom as cause of our national disorder. The Washington Post recently reported that officials at the Centers for Disease Control and Prevention were ordered not to use words like “science-based,” apparently now regarded as disablingly left-leaning. But further reporting in the New York Times appears to show that the order came not from White House flunkies but from officials worried that Congress would reject funding proposals marred by the offensive terms. One of our two national political parties — and its supporters — now regards “science” as a fighting word. Where is our Robert Musil, our pitiless satirist and moralist, when we need him (or her)?

A democratic society becomes decadent when its politics, which is to say its fundamental means of adjudication, becomes morally and intellectually corrupt. But the loss of all regard for common ground is hardly limited to the political right, or for that matter to politics. We need only think of the ever-unfolding narrative of Harvey Weinstein, which has introduced us not only to one monstrous individual but also to a whole world of well-educated, well-paid, highly regarded professionals who made a very comfortable living protecting that monster. “When you quickly settle, there is no need to get into all the facts,” as one of his lawyers delicately advised.

This is, of course, what lawyers do, just as accountants are paid to help companies move their profits into tax-free havens. What is new and distinctive, however, is the lack of apology or embarrassment, the sheer blitheness of the contempt for the public good. When Teddy Roosevelt called the monopolists of his day “malefactors of great wealth,” the epithet stung — and stuck. Now the bankers and brokers and private equity barons who helped drive the nation’s economy into a ditch in 2008 react with outrage when they’re singled out for blame. Being a “wealth creator” means never having to say you’re sorry. Enough voters accept this proposition that Donald Trump paid no political price for unapologetic greed.

The worship of the marketplace, and thus the elevation of selfishness to a public virtue, is a doctrine that we associate with the libertarian right. But it has coursed through the culture as a self-justifying ideology for rich people of all political persuasions — perhaps also for people who merely dream of becoming rich.

Decadence is usually understood as an irreversible condition — the last stage before collapse. The court of Muhammad Shah, last of the Mughals to control the entirety of their empire, lost itself in music and dance while the Persian army rode toward the Red Fort. But as American decadence is distinctive, perhaps America’s fate may be, too. Even if it is written in the stars that China will supplant the United States as the world’s greatest power, other empires, Britain being the most obvious example and the one democracy among them, have surrendered the role of global hegemon without sliding into terminal decadence.

Can the United States emulate the stoic example of the country it once surpassed? I wonder.

The British have the gift of ironic realism. When the time came to exit the stage, they shuffled off with a slightly embarrassed shrug. That, of course, is not the American way. When the stage manager beckons us into the wings we look for someone to hit — each other, or immigrants or Muslims or any other kind of not-us. Finding the reality of our situation inadmissible, like the deluded courtiers of the Shah of Iran, we slide into a malignant fantasy. 

But precisely because we are a democracy, because the values and the mental habits that define us move upward from the people as well as downward from their leaders, that process need not be inexorable. The prospect of sending Roy Moore to the Senate forced a good many conservative Republicans into what may have been painful acts of self-reflection. The revelations of widespread sexual abuse offer an opportunity for a cleansing moment of self-recognition — at least if we stop short of the hysterical overreaction that seems to govern almost everything in our lives.

Our political elite will continue to gratify our worst impulses so long as we continue to be governed by them. The only way back is to reclaim the common ground — political, moral, and even cognitive — that Donald Trump has lit on fire. Losing to China is hardly the worst thing that could happen to us. Losing ourselves is.

*James Traub is a contributing editor at Foreign Policy, a fellow at the Center on International Cooperation, and author of the book “John Quincy Adams: Militant Spirit.”

The Protocols of Donald J. Trump

November 1, 2017

Democracy depends both on the right to free speech and the right to know. We may have no alternative but to strike a new balance between the two.–Robert Skidelsky


There has always been a thriving market for fake information, forgeries, hoaxes, and conspiracy theories. The difference today is that purveyors of lies, like US President Donald Trump, no longer have to be able to hoodwink more or less reputable news outlets.

LONDON – It is an odd quirk in the history of logic that the blameless Cretans should have given their name to the famous “liar paradox.” The Cretan Epimenides is supposed to have said: “All Cretans are liars.” If Epimenides was lying, he was telling the truth – and thus was lying.

Something similar can be said of US President Donald Trump: Even when he’s telling the truth, many assume he is lying – and thus being true to himself. His trolling is notorious. For years, he claimed, with no evidence other than unnamed sources that he called “extremely credible,” that Barack Obama’s birth certificate was fraudulent. During the Republican primary, he linked his opponent Senator Ted Cruz’s father to John F. Kennedy’s assassination. He has promoted the quack idea that vaccines cause autism, and has masterfully deployed the suggestio falsi – for example, his insinuation that climate change is a Chinese hoax designed to cripple the American economy.

There has always been a thriving market for fake information, forgeries, hoaxes, and conspiracy theories. “History is a distillation of rumor,” wrote Thomas Carlyle in the nineteenth century. Sellers of fakery manufacture information for money or for political profit; there are always eager buyers among the credulous, prurient, or vindictive. And gossip is always entertaining.

Modern history provides us with some famous examples. The Zinoviev letter, a forgery implicating Britain’s Labour Party in Kremlin-led Communist sedition, was published by the Daily Mail four days before the United Kingdom’s general election in 1924, dashing Labour’s chances.

Perhaps the most famous such forgery was The Protocols of the Elders of Zion. Possibly manufactured for money, The Protocols purported to be evidence of a Jewish plan for world domination. Its key passage reads: “[…] we shall so wear down the Goyim that they will be compelled to offer us an international power that by its position will enable us without any violence gradually to absorb all the State forces of the world and to form a Super-Government.” Circulated by the Czarist secret police in the early 1900s to justify the regime’s anti-Jewish pogroms, it became the foundation of the anti-Semitic literature of the first half of the twentieth century, with horrendous consequences.

So what is new? The attention being paid to fake information today arises from the hugely expanded speed with which digitally manufactured information travels around the world. In the past, one had to be able to hoodwink more or less reputable news outlets to plant fake stories. Now misinformation can go viral through social media, like a modern Black Death.

The important question is how this will affect democracy. Will the unprecedented ease of access to information liberate people from thought control, or will it strengthen it to such an extent that democracy simply drowns in a sea of manipulation?

Optimists and pessimists both have good arguments. “Knowledge is power,” say the optimists. It seems to follow that the more information made available, the more knowledgeable voters will be, and therefore the more able to hold leaders to account.

But information, the pessimists point out, is not knowledge. Information has to be structured to become knowledge. Institutions like schools, universities, newspapers, and political parties have been our traditional structuring devices. But digital technology is institutionally naked. It provides no structuring mechanism, and therefore no control on the spread of knowledge-free opinion.

In Donald Trump, Europe’s populist leaders think they have found a champion.

Social media have undoubtedly played a part in the rise of the populist politics that thrives in such an environment. Left-wing populists, like Jeremy Corbyn in the UK, Bernie Sanders in the US, and Jean-Luc Mélenchon in France, received a significant boost from social media’s ability to bypass traditional news outlets. Right-wing populists, like Trump, Marine Le Pen in France, and Geert Wilders in the Netherlands, benefited in exactly the same way. Both sides accuse long-established media outlets of “faking” news.

Perhaps the market for news will eventually find its own equilibrium between truth and falsehood. A fraction of the population will always be willing buyers of fake news; but the majority will learn to distinguish between trustworthy and unreliable sources.

But if the spread of misinformation is thought of as a virus, there is no natural equilibrium to be had, short of catastrophe. So it has to be checked by inoculation.

Few trust politicians, who often have a vested interest in false information, to do the job. One answer is independent agencies along the lines of the consumer watchdog Which? A number of websites are now devoted to fact-checking and debunking putative news stories. One of the best known was launched in 1994 as a project to check the accuracy of urban legends. Facebook is now attempting to flag fake news stories by noting that they have been “disputed by third-party fact checkers.”

Worthy though these attempts are, they suffer from an inherent weakness: they still place responsibility on readers to check whether a news story is true. But we are all liable to seek information that confirms our beliefs and ignore information that challenges them. Facts will not be checked by those whose beliefs depend on not checking them.

There are no easy answers. Obviously, education in critical thinking, and especially in social sciences such as economics, is necessary. But will that be sufficient to counter the massive increase in the ability to spread fake information?

Democracy depends both on the right to free speech and the right to know. We may have no alternative but to strike a new balance between the two.


*Lord Skidelsky, Professor Emeritus of Political Economy at Warwick University and a fellow of the British Academy in history and economics, is a member of the British House of Lords. The author of a three-volume biography of John Maynard Keynes, he began his political career in the Labour Party, became the Conservative Party’s spokesman for Treasury affairs in the House of Lords, and was eventually forced out of the Conservative Party for his opposition to NATO’s intervention in Kosovo in 1999.