May 15, 2016

NY Times Books of The Times

Review: Siddhartha Mukherjee’s ‘The Gene,’ a Molecular Pursuit of the Self

by Jennifer Senior

 

Dr. Siddhartha Mukherjee

Thank heavens Gregor Mendel was a lousy priest. Had he shown even the faintest aptitude for oratory or ministering to the poor, he might never have determined the basic laws of heredity. But bumbling he was, and he made a rotten university student to boot; his failures drove him straight to his room, where he bred mice in secret. The experiment scandalized his superiors.

“A monk coaxing mice to mate to understand heredity was a little too risqué, even for the Augustinians,” writes Siddhartha Mukherjee in “The Gene: An Intimate History.” So Mendel switched — auspiciously, historically — to pea plants. The abbot in charge, writes the author, acquiesced this time, “giving peas a chance.”

Love Dr. Mukherjee, love his puns. They’re everywhere. I warn you now.

It is Dr. Mukherjee’s curse — or blessing, assuming he’s a glass-half-full sort of fellow — to have to follow in his own mammoth footsteps. “The Emperor of All Maladies: A Biography of Cancer,” his dazzling 2010 debut, won the Pulitzer and almost every other species of literary award; it became a three-part series on PBS; Time magazine deemed it one of the 100 most influential books written in the English language since 1923.

In his acknowledgments to “The Gene,” Dr. Mukherjee, a researcher and cancer specialist, confesses that he once feared his first book would also be his last — that “‘Emperor’ had sapped all my stories, confiscated my passports and placed a lien on my future as a writer.” The solution, he eventually realized, was to tell the story of the gene. It is his debut’s natural prequel, a tale of “normalcy before it tips into malignancy.”

By the time “The Gene” is over, Dr. Mukherjee has covered Mendel and his peas, Darwin and his finches. He’s taken us on the quest of Watson, Crick and their many unsung compatriots to determine the stuff and structure of DNA. We learn about how genes were sequenced, cloned and variously altered, and about the race to map our complete set of DNA, or genome, which turns out to contain a stunning amount of filler material with no determined function.

Many of the same qualities that made “The Emperor of All Maladies” so pleasurable are in full bloom in “The Gene.” The book is compassionate, tautly synthesized, packed with unfamiliar details about familiar people. (Francis Galton, the father of eugenics, used to rank the beauty of women on the street by “using pinpricks on a card hidden in his pocket.” Ick.)

But there are also crucial differences. Cancer is the troll that scratches and thumps beneath the floorboards of our consciousness, if it hasn’t already beaten its way into the room. The subject immediately commands our attention; it’s almost impossible to deny, and not to hear, the emotional clang of its appeal. In Dr. Mukherjee’s skilled hands, the story of this frightening disease became a page-turner. He explained its history, politics and cunning biological underpinnings; he traced the evolving and often gruesome logic underlying cancer treatment.

And in the middle of it all, agonizing over treatment protocols and watching his patients struggle with tremendous existential and physical pain, was the author himself.

The psychological stakes are far lower in reading about the history of genetics. “The Gene” is more pedagogical than dramatic; as often as not, the stars of this story are molecules, not humans. Dr. Mukherjee still has a poignant personal connection to the material — mental illness has wrapped itself around his family tree like a stubborn vine, claiming two uncles and a cousin on his father’s side — but this book does not aim for the gut. It aims for the mind.

So what does this mean? That there are many excursions deep into the marshes of biochemistry and cellular biology. Bring your waders. It gets dense in there. Dr. Mukherjee can write with great clarity about difficult genetic concepts — he’s especially handy with metaphors — but the science gets increasingly complex, and it lasts for many pages. Even when the going is easy, readers should be prepared for parentheticals like this: “i.e., ACT CCT GGG → ACU CCU GGG.”
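For readers outside the field, that parenthetical denotes transcription: a DNA sequence is rewritten as RNA by swapping thymine (T) for uracil (U). A minimal sketch in Python (the function is mine, for illustration, not the book’s):

    # Transcription, as in the review's example: DNA -> RNA by replacing
    # thymine (T) with uracil (U).
    def transcribe(dna: str) -> str:
        """Return the RNA rendering of a DNA base sequence."""
        return dna.replace("T", "U")

    print(transcribe("ACT CCT GGG"))  # prints: ACU CCU GGG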

Dr. Mukherjee’s explanations are sometimes so thorough they invite as many questions as they answer — from the most elementary (why is something that contains so many bases called deoxyribonucleic acid?) to the more esoteric (if, as he says in a Homeric footnote on Page 360, the Y chromosome is so unstable it might eventually disappear, will we still reproduce?).

I do not mean to suggest that Dr. Mukherjee has neglected to attend to big questions or ideas in this work; they just get lesser billing than I’d have liked. But any book about the history of something as elemental and miraculous as the gene is bound, at least indirectly, to tell the story of innovation itself. “The Gene” is filled with scientists who dreamed in breathtakingly lateral leaps.

Erwin Schrödinger in particular was one visionary cat: In 1944, he hazarded a guess about the molecular nature of the gene and decided it had to be a strand of code scribbled along the chromosome — which pretty much sums up the essence of DNA.

With each and every genetic discovery, a host of questions arose, both ethical and philosophical. What are the implications of cloning, of creating genetic hybrids, of gene editing? Is there any value in knowing about the existence of a slumbering, potentially lethal genetic mutation in your cells if nothing can be done about it? (Personally, I wish he’d dedicated 50 pages to this question — it’d have offered a potentially moving story line and a form of emotional engagement I badly craved.)

Does the genome have anything to tell us about race, sexual identity, gender? Do these three-billion-plus base pairs connect, in any way, to what we think of as “a self”?

Dr. Mukherjee answers these questions cautiously and compassionately, if at times too cursorily for my satisfaction. He notes, repeatedly, that for all we know about the genome, there is so very much we don’t — it is a recipe, not a blueprint, as Richard Dawkins likes to say. Yes, sometimes one gene controls one specific trait; but often, dozens of genes do, and in ways we do not understand (or cannot even fully identify), and they interact mysteriously with the environment all along the way.

But as research continues apace, we must entertain the sci-fi prospect of one day customizing ourselves and our children. For now, we’re burdened with more and more moral decisions to make as genetic tests become increasingly refined.

“If the history of the last century taught us the dangers of empowering governments to determine genetic ‘fitness,’” Dr. Mukherjee writes — referring to Nazism, eugenics, every genocidal experiment involving social engineering — “then the question that confronts our current era is what happens when the power devolves to the individual.”

But we are not apps. Dr. Mukherjee knows this, struggles with it. Is optimization really the point of life? “Illness might progressively vanish,” he writes, “but so might identity.”

A version of this review appears in print on May 9, 2016, on page C1 of the New York edition with the headline: In Molecular Pursuit of the Genetic Code.



March 20, 2016

Bank Negara’s Independence at Risk

by John Berthelsen

http://www.asiasentinel.com

The projected April retirement of Zeti Akhtar Aziz, the 68-year-old governor of Malaysia’s central bank, Bank Negara, has kicked off intense speculation about who will follow her into the job. Over recent months, as the investigation into the personal finances of Prime Minister Najib Razak has droned on, Zeti is almost the sole person in authority who has not knuckled under and let the investigation die.

It appears unlikely, given the political situation, that her successor will be Muhammad Ibrahim, the Deputy Governor and ranking professional in the bank. Passing him over could be a mistake: since the 1980s, Bank Negara has suffered crisis after crisis as politicians, particularly former Prime Minister Mahathir Mohamad, have attempted to direct monetary policy.

In October, Bank Negara, which is charged with regulating the country’s financial institutions, credit system and monetary policy, issued a statement saying it had requested a criminal investigation into the affairs of the scandal-plagued 1Malaysia Development Bhd investment fund, even though Najib’s hand-picked Attorney General, Mohamad Apandi Ali, to whom it had forwarded the case, said there was no reason for prosecution. Apandi Ali ultimately turned down the request.

Bank Negara, however, responded with a statement contradicting the Attorney General’s office, saying 1MDB had secured permits for investment abroad based on inaccurate or incomplete disclosure of information, breaching banking regulations. It added that it had revoked three permits granted to 1MDB for investments abroad totaling US$1.83 billion (RM7.53 billion) and ordered the state fund to repatriate the funds to Malaysia.

Sources in Kuala Lumpur said Zeti, one of the world’s most respected central bankers, faced the danger of blackmail by forces aligned with Najib over concerns that the government might prosecute her husband, Tawfik Ayman, because of secret overseas accounts. Rosmah Mansor, the Prime Minister’s wife, was reportedly involved in a campaign to drive Zeti from her position.

The Wall Street Journal last week printed a story saying the fawning Irwan Serigar Abdullah, the Secretary General of the Ministry of Finance, was favored for the job. Serigar is considered close to Najib, and his appointment would be tantamount to handing control of the now-independent institution to the prime minister. That report has been denied, however, and other candidates have been mentioned, including Dr. Awang Adek Hussain, the current Malaysian Ambassador to the United States, and Abdul Wahid Omar, the former Chief Executive Officer of Malayan Banking Bhd. who is now Minister in the Prime Minister’s Department in charge of Economic Planning. Both are considered to be clean.

The bottom line is that Najib, in the fight of his life over a plethora of international and domestic scandals, will want someone he can control. Thus the job probably won’t go to Muhammad, the bank’s deputy governor, who is assumed to be too independent for Najib’s tastes. He is a career banker who has risen through the ranks and who is not going to compromise his integrity.

NOT TUN DR. MAHATHIR

In fact, the central bank’s independence has largely been engineered by Zeti, who is widely respected and credited with pushing reforms and sound policies and with defending the bank’s autonomy. Prior to her appointment, former Prime Minister Mahathir Mohamad used the central bank as his personal punching bag, causing it to lose billions of dollars.

In 1985, following the so-called Plaza Accord of finance ministers in New York, which pushed the value of the US dollar down sharply, Bank Negara’s dollar reserves plunged when the bank was wrong-footed. The then-governor, Jaffar Hussein, began trading speculatively in an effort to make up the losses, apparently at Mahathir’s behest. The bank became a major player in the foreign exchange market, so much so that the Federal Reserve requested that it rein in its activities. At one time, the central bank’s exposure was rumored to be in the region of RM270 billion (US$66.65 billion at current exchange rates) – three times the country’s gross domestic product and more than five times its foreign reserves at the time.

In 1992 and 1993, Mahathir became convinced he could make billions of ringgit by taking advantage of a British recession, rising unemployment and a decision by the British government to float the pound sterling free of the European Exchange Rate Mechanism.

Mahathir ordered Bank Negara to buy vast amounts of pounds sterling on the theory that the British currency would appreciate once it floated. However, in what has been described as the greatest currency trade ever made, the financier and currency wizard George Soros’s Quantum hedge fund established short positions, borrowing in pounds and investing in Deutschemark-denominated assets as well as using options and futures positions.
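For readers unfamiliar with the mechanics, a short currency trade works like this: borrow the currency you expect to fall, sell it, and buy it back more cheaply after the drop. A worked sketch with purely illustrative figures (these are hypothetical rates and sizes, not Quantum’s actual 1992 positions):

    # Illustrative arithmetic only: hypothetical sizes and rates, not
    # Quantum's actual 1992 trades. A currency short borrows pounds,
    # sells them for Deutschemarks, and repays the loan after the pound falls.
    borrowed_gbp = 1_000_000_000           # borrow GBP 1 billion
    dm_per_gbp_before = 2.78               # hypothetical rate before the collapse
    dm_per_gbp_after = 2.50                # hypothetical rate after the collapse

    dm_from_sale = borrowed_gbp * dm_per_gbp_before    # sell pounds while high
    gbp_bought_back = dm_from_sale / dm_per_gbp_after  # rebuy pounds while low
    profit_gbp = gbp_bought_back - borrowed_gbp        # repay the loan, keep the rest

    print(f"Profit: GBP {profit_gbp:,.0f}")            # Profit: GBP 112,000,000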

In all, Soros’s positions alone accounted for a gargantuan US$10 billion. Many other investors, sensing Quantum was in for the kill, soon followed, putting strenuous downward pressure on the pound. The collapse was inevitable. Quantum walked away with US$1 billion in a single day, earning Mahathir’s eternal enmity and winning Soros the title “the man who broke the Bank of England.”

Mahathir and Bank Negara, on the other hand, walked away with a US$4 billion loss, followed by another US$2.2 billion loss in 1993, the total equivalent of RM15.5 billion. Although the disastrous trades destroyed the entire capital base of Bank Negara, then-Finance Minister Anwar Ibrahim, after first denying the trades had taken place, repeatedly reassured parliament that the losses were only “paper losses.”

Eventually, the Finance Ministry had to recapitalize the central bank, almost unheard of for any government anywhere. It is reliably estimated that Bank Negara lost as much as US$30 billion in this and other disastrous currency trades, costing the head of the central bank and his currency trader deputy their jobs.

If anything, that ought to be an argument for a professional from the ranks of the central bank to take over the reins when Zeti steps down. But politics, especially the politics of preserving Najib’s job, has taken precedence in Malaysia. It would be foolish to bet against a candidate aligned with the Prime Minister.

 



March 19, 2016

The Rise and Decline of Islam

by Kassim Ahmad

Revised and expanded on March 19, 2016

The Quran in Surah Ali-‘Imran (3) states that “The only religion approved by God is Islam.” The Arabic word ‘deen’ essentially means ‘way of life’ rather than the restricted, ritualistic sense of the English word ‘religion’.

Former political activist Kassim Ahmad speaks to journalists outside the Kuala Lumpur High Court in Kuala Lumpur on January 6, 2015. The Malaysian Insider/Najjua Zulkefli

This religion of strict monotheism was taught by all prophet-messengers from Adam until its completion and perfection by Muhammad, the last of the prophet-messengers. But, as is the wont of human beings, corruption and deterioration set in, completing their work after about 300 years (10 generations) and changing the original teachings. Thus, the monotheism of Prophet Moses became polytheism in Judaism, that of Prophet Jesus polytheism in Christianity, and that of Muhammad polytheism in Sunnism. Sunnism is polytheistic in that it has elevated Muhammad to a second god, against his will. [1]

Sunnism is sectarian “Islam”, worshiping two gods. [2] Two gods are one too many; that is polytheism. Fortunately for mankind, the last of God’s scriptures, the Quran, is divinely protected so that all mankind can always refer to it as its guide.

This divine protection lies internally in the scripture, in a mathematically awesome and impossible-to-imitate structure called Code 19. This Code is stated in the Quran in Surah Al-Muddaththir (74), verses 30-31.

The verses go, “Over it is nineteen. We appointed angels to be guardians of Hell, and we assign their number (19) (1) to disturb the disbelievers, (2) to convince the Christians and the Jews (that this is a divine scripture), (3) to strengthen the faith of the faithful, (4) to remove all traces of doubt from the hearts of Christians, Jews, as well as believers, and (5) to expose those who harbor doubt in their hearts. The disbelievers will say, ‘What does God mean by this allegory?’ God thus sends astray whomever He wills, and guides whomever He wills. None knows the soldiers of your Lord except He. This is a reminder for the people.”

Islam rose with the mission of Prophet Muhammad in the Arabian Peninsula in the early seventh century; within a short time of only sixty years, it shot up to be the Number One power in the then world, beating the two superpowers of the age, the Byzantine Empire and the Persian Empire.

Historian Philip K. Hitti, in his book, History of the Arabs (1970), states, “If someone in the first third of the seventh Christian century had the audacity to prophesy that within a decade some unheralded, unforeseen power from the hitherto barbarous and little-known land of Arabia was to make its appearance, hurl itself against the only two world powers of the age, fall heir to the one — the Sasanid — and strip the other — the Byzantine — of its fairest provinces, he would undoubtedly have been declared a lunatic. Yet that was exactly what happened.

After the death of the Prophet, sterile Arabia seems to have been converted as if by magic into a nursery of heroes the like of whom, both in number and quality, is hard to find anywhere. The military campaigns of Khalid ibn-al-Walid and ‘Amr ibn-al-‘As which ensued in al-Iraq, Persia, Syria and Egypt remain among the most brilliantly executed in the history of warfare and bear favourable comparison with those of Napoleon, Hannibal or Alexander.” (p. 142)

A Western philosophical historian, Robert Briffault, in his epoch-making book, The Making of Humanity (1919), after denouncing a conspiracy of silence by most Western historians on the contributions of Muslim science to modern Europe, summarized the contribution of Muslim science to civilization thus: “The debt of our science to that of the Arabs does not consist in startling discoveries or revolutionary theories. Science owes a great deal more to Arab culture; it owes its existence. The ancient world was, as we saw, pre-scientific. The astronomy and mathematics of the Greeks were a foreign importation never thoroughly acclimatised in Greek culture.

The Greeks systematized, generalized and theorized, but the patient ways of investigation, the accumulation of positive knowledge, the minute methods of science, detailed and prolonged observation, experimental inquiry, were altogether alien to the Greek temperament. … What we call science arose in Europe as a result of a new spirit of inquiry, of new methods of investigation, of the method of experiment, observation, measurement, of the development of mathematics in a form unknown to the Greeks. That spirit and those methods were introduced into the European world by the Arabs.” (p. 191)

Muslim civilization lasted eight centuries. In that time, Baghdad became the capital of the world, and Europe sat as a student at the feet of Baghdad. When the rot set in, Europe took over the banner of civilization, and what is known as the European Renaissance began. Will Western leadership last forever? Only time can tell. But basing ourselves on its truncated epistemology, we can say that it cannot last forever, at most another two or three decades.

One of two things will happen. Either Europe and the United States will adopt the true revolutionary doctrine of Islam, which I characterize as “revolutionary, life-affirming, and death-defying”, or the Muslims themselves will be reborn with that true spirit of the Quran, borne out in the life of Prophet Muhammad and the early republican-democratic Caliphates.

In the meantime, Muslim leaders must answer the question of why the Muslim way of life, guaranteed by God, has collapsed, and how they can rebuild it. To answer this all-important question, they must re-study the Quran with a scientific methodology. I can suggest a few signposts.

First, at a certain point in time, Muslim science froze and deteriorated, owing to the wrong teachings of certain so-called masters. These were made into masters by a new priesthood class, adopted in imitation of medieval Hinduism and Christianity. In Islam there is no priesthood class.

Second, at a certain point in time, an attitude of fatalism developed in Islam due to a new theology preached in accordance with hadith teachings. Hadiths are essentially fabrications falsely ascribed to the great name of Prophet Muhammad.

Third, that new theology also preached salvation in the Afterlife, in a nondescript Theologians’ Nirvana in imitation of Buddhism. This led to Muslim apathy, a life spent waiting for death. At this point, roughly from the fourteenth century onwards, this false Islam died, along with the false Muslims.

Fourth and last, to rebuild, the Muslims must re-study the Quran (which is their and mankind’s book of guidance) and the examples of their great leaders in the republican-democratic period, to find correct answers to their current plight.

I have summarized the teachings of the Quran as “revolutionary, life-affirming and death-defying”. We must seek salvation in this life by raising our souls to a higher level. It is this raising of our souls to a higher level that is necessary for the coming Second Muslim Civilization, which must come.

[1] See Quran (16: 51), which states: “Do not worship two gods. There is only one God.” Further, Surah 63, verse 1, invalidates the second syahadah, which is uttered by hypocrites.

[2] God has proclaimed: “Do not worship two gods; there is only one God. Therefore you shall reverence Me.” (Quran, 16: 51).



January 5, 2016

NY Times Sunday Review | Opinion


Cultivate the Art of Serendipity

In 2008, an inventor named Steve Hollinger lobbed a digital camera across his studio toward a pile of pillows. “I wasn’t trying to make an invention,” he said. “I was just playing.” As his camera flew, it recorded what most of us would call a bad photo. But when Mr. Hollinger peered at that blurry image, he saw new possibilities. Soon, he was building a throwable video camera in the shape of a baseball, equipped with gyroscopes and sensors. The Squito (as he named it) could be rolled into a crawlspace or thrown across a river — providing a record of the world from all kinds of “nonhuman” perspectives. Today, Mr. Hollinger holds six patents related to throwable cameras.

A surprising number of the conveniences of modern life were invented when someone stumbled upon a discovery or capitalized on an accident: the microwave oven, safety glass, smoke detectors, artificial sweeteners, X-ray imaging. Many blockbuster drugs of the 20th century emerged because a lab worker picked up on the “wrong” information.

While researching breakthroughs like these, I began to wonder whether we can train ourselves to become more serendipitous. How do we cultivate the art of finding what we’re not seeking?

For decades, a University of Missouri information scientist named Sanda Erdelez has been asking that question. Growing up in Croatia, she developed a passion for losing herself in piles of books and yellowed manuscripts, hoping to be surprised. Dr. Erdelez told me that Croatian has no word to capture the thrill of the unexpected discovery, so she was delighted when — after moving to the United States on a Fulbright scholarship in the 1980s — she learned the English word “serendipity.”

Today we think of serendipity as something like dumb luck. But its original meaning was very different.

In 1754, a belle-lettrist named Horace Walpole retreated to a desk in his gaudy castle in Twickenham, in southwest London, and penned a letter. Walpole had been entranced by a Persian fairy tale about three princes from the Isle of Serendip who possess superpowers of observation. In his letter, Walpole suggested that this old tale contained a crucial idea about human genius: “As their highnesses travelled, they were always making discoveries, by accident and sagacity, of things which they were not in quest of.” And he proposed a new word — “serendipity” — to describe this princely talent for detective work. At its birth, serendipity meant a skill rather than a random stroke of good fortune.

Dr. Erdelez agrees with that definition. She sees serendipity as something people do. In the mid-1990s, she began a study of about 100 people to find out how they created their own serendipity, or failed to do so.

Her qualitative data — from surveys and interviews — showed that the subjects fell into three distinct groups. Some she called “non-encounterers”; they saw through a tight focus, a kind of chink hole, and they tended to stick to their to-do lists when searching for information rather than wandering off into the margins. Other people were “occasional encounterers,” who stumbled into moments of serendipity now and then. Most interesting were the “super-encounterers,” who reported that happy surprises popped up wherever they looked. The super-encounterers loved to spend an afternoon hunting through, say, a Victorian journal on cattle breeding, in part, because they counted on finding treasures in the oddest places. In fact, they were so addicted to prospecting that they would find information for friends and colleagues.

You become a super-encounterer, according to Dr. Erdelez, in part because you believe that you are one — it helps to assume that you possess special powers of perception, like an invisible set of antennas, that will lead you to clues.

A few months ago, I was having a drink in Cambridge, Mass., with a friend, a talented journalist who was piecing together a portrait of a secretive Wall Street wizard. “But I haven’t found the real story yet; I’m still gathering string,” my friend told me, invoking an old newsroom term to describe the first stage of reporting, when you’re looking for something that you can’t yet name. Later that night, as I walked home from the bar, I realized “gathering string” is just another way of talking about super-encountering. After all, “string” is the stuff that accumulates in a journalist’s pocket. It’s the note you jot down in your car after the interview, the knickknack you notice on someone’s shelf, or the anomaly that jumps out at you in Appendix B of an otherwise boring research study.

As I navigated the brick sidewalk, passing under the pinkish glow of a streetlight, I thought about how string was probably hiding all around me. A major story might lurk behind the Harvard zoology museum ahead or in the plane soaring above. String is everywhere for the taking, if you have the talent to take it.

In the 1960s, Gay Talese, then a young reporter, declared that “New York is a city of things unnoticed” and delegated himself to be the one who noticed. Thus, he transformed the Isle of Manhattan into the Isle of Serendip: He traced the perambulations of feral cats, cataloged shoeshine purveyors, tracked down statistics related to the bathrooms at Yankee Stadium and discovered a colony of ants at the top of the Empire State Building. He published his findings in a little book titled “New York: A Serendipiter’s Journey.”

The term “serendipiter” breathed new life into Walpole’s word, turning serendipity into a protagonist and a practitioner. After all, those ants at the top of the Empire State Building didn’t find themselves; Mr. Talese had to notice them, which was no easy matter. Similarly, Dr. Erdelez came up with the term super-encounterer to give us a way to talk about the people rather than just the discoveries. Without such words, we tend to become dazzled by the happy accident itself, to think of it as something that exists independent of an observer.

We can slip into a twisted logic in which we half-believe the penicillin picked Alexander Fleming to be its emissary, or that the moons of Jupiter wanted to be seen by Galileo. But discoveries are products of the human mind.

As people dredge the unknown, they are engaging in a highly creative act. What an inventor “finds” is always an expression of him- or herself. Martin Chalfie, who won a Nobel Prize for his work connected with green fluorescent protein — the stuff that makes jellyfish glow green — told me that he and several other Nobel Prize winners benefited from a chain of accidents and chance encounters on the way to their revelations. Some scientists even embrace a kind of “free jazz” method, he said, improvising as they go along: “I’ve heard of people getting good results after accidentally dropping their experimental preparations on the floor, picking them up, and working on them nonetheless,” he added.

So how many big ideas emerge from spills, crashes, failed experiments and blind stabs? One survey of patent holders (the PatVal study of European inventors, published in 2005) found that an incredible 50 percent of patents resulted from what could be described as a serendipitous process. Thousands of survey respondents reported that their idea evolved when they were working on an unrelated project — and often when they weren’t even trying to invent anything. This is why we need to know far more about the habits that transform a mistake into a breakthrough.

In the late 1980s, Dr. John Eng, an endocrinologist, became curious about certain animal poisons that damaged the pancreas, so he ordered lizard venom through the mail and began to play around with it. As a result of this curious exercise, he discovered a new compound in the saliva of a Gila monster, and that in turn led to a treatment for diabetes. One of Dr. Eng’s associates (quoted in a 2005 newspaper article) remarked that he was capable of seeing “patterns that others don’t see.”

Is this pattern-finding ability similar to the artistic skill of a painter like Georgia O’Keeffe? Is it related to the string-gathering prowess of Gay Talese? We still know so little about creative observation that it’s impossible to answer such questions.

That’s why we need to develop a new, interdisciplinary field — call it serendipity studies — that can help us create a taxonomy of discoveries in the chemistry lab, the newsroom, the forest, the classroom, the particle accelerator and the hospital. By observing and documenting the many different “species” of super-encounterers, we might begin to understand their minds.

A number of pioneering scholars have already begun this work, but they seem to be doing so in their own silos and without much cross-talk. In a 2005 paper (“Serendipitous Insights Involving Nonhuman Primates”), two experts from the Washington National Primate Research Center in Seattle cataloged the chance encounters that yielded new insights from creatures like the pigtail macaque. Meanwhile, the authors of a paper titled “On the Exploitation of Serendipity in Drug Discovery” puzzled over the reasons the 1950s and ’60s saw a bonanza of breakthroughs in psychiatric medication, and why that run of serendipity ended. And in yet another field of study, a few information scientists are trying to understand the effects of being bombarded on social media sites with countless tantalizing pieces of “string.”

What could these researchers discover if they came together for one big conversation?

Of course, even if we do organize the study of serendipity, it will always be a whimsical undertaking, given that the phenomenon is difficult to define, amazingly variable and hard to capture in data. The clues will no doubt emerge where we least expect them, perhaps in the fungi clinging to the walls of parking garages or the mating habits of bird-watchers. The journey will be maddening, but the potential insights could be profound: One day we might be able to stumble upon new and better ways of getting lost.

A version of this op-ed appears in print on January 3, 2016, on page SR1 of the New York edition with the headline: Cultivating the Art of Serendipity.



November 29, 2015

NY Times Sunday Book Review

‘Brief Candle in the Dark,’ by Richard Dawkins

Some lumbering robot, this Richard Dawkins. “Lumbering robots” was one of the ways in which this scarily brilliant evolutionary biologist described human beings vis-à-vis their genes in “The Selfish Gene,” his first and probably still his most influential book — more than a million copies sold. (His atheist manifesto, “The God Delusion,” has sold more than three million.) We’re essentially a means of physical and, more important, temporal transportation for our genes, he explained. They can live on for eons after we take our own inherited genes and mate with those of that handsome boy behind us in the movie-ticket line who ended up sitting next to us or the ones belonging to that pretty girl whose change we picked up by mistake at the newsstand and with whom we then had an apologetic coffee. And so on down the line. Our lines. Dawkins has also called us “throwaway survival machines” for our genes. But only, I think, to make a biological point.

In all of his work — including this new memoir, “Brief Candle in the Dark: My Life in Science” (a sort of sequel to “An Appetite for Wonder,” about his early life) — Dawkins himself gives the existential lie to the notion that if we are here for any reason, we are here primarily, maybe exclusively, to provide Uber service for our genes and, just a little more altruistically, for the genes of those biologically most closely related to us. Because his genes don’t know anything about him and he knows just about everything about them.

In “Brief Candle in the Dark” — a title that I have to admit made me say, “Oh, please!” — Dawkins gives us a chronologically helter-skelter account of his grown-up research, discoveries, reflections, collaborations and controversies (especially about religion), along with reports on his appearances at various events, debates and conferences. So many events, so many conferences. He has become what Yeats calls himself in “Among School Children,” a “smiling public man.” (Though not always smiling, in Dawkins’s case, especially when it comes to his atheism.)

“Helter-skelter”? The book is “organized” achronologically, with, for example, sections devoted to the author’s academic progress, culminating in his appointment as Oxford’s first Charles Simonyi professor of public understanding of science; a chapter about his publishing history; another about “Debates and Encounters.” “If you don’t like digressive anecdotes,” Dawkins tells us, “you might find you’re reading the wrong book.”

Here is Dawkins describing Jane Brockmann’s experiments with the burrows of the female digger wasp, which he used to demonstrate the principle of evolutionarily stable strategy: “We need ESS theory whenever it happens that the best strategy for an animal depends on which strategy most other animals in the population have adopted.” Here he is three pages later introducing at some admiring length his Oxford University student Alan Grafen, who helped with the math of the digger-wasp-burrow study. A page later, still nominally among the wasp burrows, we find a Monty Python-esque description of the Great Annual Punt Race, in which the Animal Behavior Research Group rows against the Edward Grey Institute of Field Ornithology.

Dawkins’s tributes to teachers, colleagues, students and public figures mingle with fairly extensive reprises on and further thoughts about the scientific research and philosophical positions he has developed in his 12 previous works. (They are all still in print, Dawkins tells us, presumably with a little blush.) There is his tribute to one of his “heroes,” the Nobel Prize-winning biologist Peter Medawar, admired “as much for his writing style as for his science.” And another to David Attenborough, brother of Richard, a “marvelous man.” And to Susan Blackmore, a “briskly intelligent psychologist.” Then there’s Christopher Hitchens, with his “intellect, wit, lightning repartee.” And so on.

These encomiums and credit-givings complement Dawkins’s persistent efforts to leaven his recollections with humor, applying a generally light touch: “An agent was a good thing to have,” and Caroline Dawnay “was a good representative of the genus.” “The snort of a pig-frog . . . may affect another pig-frog as the nightingale affected Keats, or the skylark Shelley.” Together, these mots — bon and otherwise — and Dawkins’s acknowledgments of the talents and the contributions of others to his life and work add up to a kind of self-effacement campaign. The crucial element in “self-effacement” is “self.” Self-effacement is not the same as modesty or humility — it is an effort of will, not a unitary psychological state. Nevertheless, that Dawkins mounts this campaign in “Brief Candle in the Dark” is surprisingly sweet, and admirable. That he loses the battle is in no way shameful. If anyone in modern science deserves to regard his or her own contributions with pride, even with triumph, it is Richard Dawkins.

The sections of “Brief Candle in the Dark” that deal with religion and atheism are middle-aged if not old hat to anyone who knows anything about the public Dawkins, along with Sam Harris, Lawrence Krauss and Christopher Hitchens. But they are still entertaining. The often long passages that involve pure science are sometimes difficult and thus, sadly, require short shrift in a book review. “Natural selection, at each locus independently, favors whichever allele cooperates with the other genes with whom it shares a succession of bodies: And that means it cooperates with the alleles at those other loci, which cooperate in their turn.” But work on them and they become, as you might expect, cogent précis of Dawkins’s life’s work, and vastly illuminating: “Animals are islands in this hyperspace, vastly spaced out from one another as if in some Hyperpolynesia, surrounded by a fringing reef of closely related animals.” “If one identical twin were good at three-dimensional visualization, I would expect that his twin would be too. But I’d be very surprised to find genes for gothic arches, postmodern finials or neoclassical architraves.”

Especially bright is the light thrown in summary on replication and adaptation and connectedness, not only biological but cultural, especially in the concept of the “meme” — a word coined by Dawkins to describe images, phrases, references, pieces of music, that are themselves replicated and then spread virally throughout the world’s cultural consciousness. The meme is at best, I think, a metaphorically baggy analogue to the gene, but it serves the purpose of emphasizing the recursiveness and interrelatedness of our experience of the world.

Sometimes you get the feeling that Dawkins sees — and believes we should see — everything as connected to everything else, everything affecting everything else, everything determining and being determined by everything else. In fact, in “Brief Candle in the Dark,” he recursively recites something pertinent to this point that he wrote in “Unweaving the Rainbow,” about the compatibility of art and science: “The living world can be seen as a network of interlocking fields of replicator power.”

In his marveling at art and music and the accomplishments of his predecessors, in his sense of wonder, unspoiled — in fact amplified — by science, Dawkins proves we’re not in any way reducible to mere lumbering (or any other kinds of) robots for our genes. Even though the price of our ability to learn and marvel is death, and our genes have at least theoretical immortality, they’re really but tiny vehicles for our own wonder.

Daniel Menaker’s most recent book is a memoir, “My Mistake.”

A version of this review appears in print on November 29, 2015, on page BR8 of the Sunday Book Review with the headline: In His Genes.