September 17, 2012
On Perception, Human Mind and Decision Making
by Khairie Hisyam Aliman (http://www.themalaysianinsider.com)
One particularly memorable class I had in university was when a professor talked about marble stones and the metamorphic processes that create them, ending with a short account of when he was in Mecca.
As he laid eyes on the marble flooring near the Kaabah, his mind immediately analysed its properties and before he realised it he had a good idea of its parent rock’s geological qualities and history.
At the time, I was awestruck by the professor’s geological expertise and how it provides an additional lens through which he perceives the world, picking out details that another person could never guess at. Years later I find myself almost understanding what that might feel like — albeit with words and language instead of rocks.
Previously I wrote about sub-editors and how the nature of the work imparts lifelong habits, even after moving on to other jobs. While that is somewhat different from my professor’s knowledge and experience flavouring his perception of his surroundings, I feel both boil down to the same basic thing: our work defines a significant part of who we are. The knowledge and skills that we learn, acquire and master, once hardwired into our brain, inevitably influence how we interact with our world.
Inevitably, these bits and pieces that we keep adding to our great archive as we go through life will shape us as individuals. As we learn and discover new things, the way we perceive things around us evolves to reflect what we know and understand.
When I was in my teens transitioning from comic books to more text-heavy volumes by Raymond E. Feist and Terry Brooks, my perception of the books was rather simple. Both writers tell different stories, and that was all I saw. At the back of my mind I was vaguely aware of another aspect differentiating the authors that I could not seem to vocalise, like a forgotten word at the tip of your tongue that just won’t come out.
It was only when I learned to write professionally and grew aware of the concept of “writing style” that I realised — like a light switched on in a pitch-black room — that the authors structure their sentences differently, finally seeing the nuances that mark their respective voices.
From that point on I began paying attention to how different writers arrange their words, how different it is from how I would write it and what makes their personality shine through the dry ink on paper. Learning that one concept as a writer added an extra lens through which I read, and whenever I read I look through it without conscious effort.
I imagine it is the same with everyone, whatever you do for a living. What we know colours our perspective and, eventually, after accumulating enough knowledge or skill in something, that colouring stays permanently.
Our brain processes what we see and hear and touch based on what it knows, and the more we know in one field of expertise, the more it will be inclined to access that area of its archives first to give definition to what we perceive. It is the reason why an architect will look at a house and immediately ponder its design, whereas a realtor might see the same house and weigh its location and value.
Sometimes it makes me wonder: are those around me seeing things I do not? Perhaps they do. And perhaps I see little things that they miss, too. The thought of something I see clearly still holding mysteries that are in plain view to someone else fascinates me as much as it humbles me.
My professor sees the world through the eyes of a geologist. Whose eyes might you be looking through?
* The views expressed here are the personal opinion of the columnist.
The Science of Irrationality
A Nobelist explains our fondness for not thinking
by Jonah Lehrer
Here’s a simple arithmetic question: “A bat and ball cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?”
The vast majority of people respond quickly and confidently, insisting the ball costs 10 cents. This answer is both incredibly obvious and utterly wrong. (The correct answer is five cents for the ball and $1.05 for the bat.) What’s most impressive is that education doesn’t really help; more than 50% of students at Harvard, Princeton and the Massachusetts Institute of Technology routinely give the incorrect answer.
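The arithmetic behind the puzzle is worth spelling out, since the intuitive answer fails quietly. Letting the ball cost x, the bat costs x + 1.00, and together they must sum to 1.10, giving 2x + 1.00 = 1.10, so x = 0.05. A minimal sketch (not from the original article) checking both answers:

```python
# Bat-and-ball puzzle: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting: (ball + 1.00) + ball = 1.10, so 2 * ball = 0.10.
ball = (1.10 - 1.00) / 2   # 0.05 -> five cents
bat = ball + 1.00          # 1.05

# Both conditions hold for the correct answer:
assert abs(bat + ball - 1.10) < 1e-9
assert abs(bat - ball - 1.00) < 1e-9

# The intuitive answer (ball = 0.10) satisfies the total
# but violates the "one dollar more" condition:
intuitive_ball = 0.10
intuitive_bat = 1.10 - intuitive_ball  # 1.00 -> only 0.90 more than the ball

print(round(ball, 2), round(bat, 2))  # 0.05 1.05
```

The trap is that "costs $1 more" constrains the *difference* between the prices, not the bat's price itself, and the fast intuitive system checks only the total.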
Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this for more than five decades. His disarmingly simple experiments have profoundly changed the way that we think about thinking.
While philosophers, economists and social scientists had assumed for centuries that human beings are rational agents, Mr. Kahneman and his scientific partner, the late Amos Tversky, demonstrated that we’re not nearly as rational as we like to believe.
When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on mental short cuts, which often lead them to make foolish decisions. The short cuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether.
Although Mr. Kahneman is now widely recognized as one of the most influential psychologists of the 20th century, his research was dismissed for years. Mr. Kahneman recounts how one eminent American philosopher, after hearing about the work, quickly turned away, saying, “I am not interested in the psychology of stupidity.”
But the philosopher missed the point. The biases and blind spots identified by Messrs. Kahneman and Tversky aren’t symptoms of stupidity. They’re an essential part of our humanity, the inescapable byproducts of a brain that evolution engineered over millions of years.
In Mr. Kahneman’s important new book, “Thinking, Fast and Slow,” his first work for a popular audience, he outlines the implications of this new model of cognition. What are the most important mental errors that we all make? And can they be overcome?
Consider the overconfidence bias, which drives many of our mistakes in decision-making. The best demonstration of the bias comes from the world of investing. Although many fund managers charge high fees to oversee stock portfolios, they routinely fail a basic test of skill: persistent achievement.
As Mr. Kahneman notes, the year-to-year correlation between the performance of the vast majority of funds is barely above zero, which suggests that most successful managers are banking on luck, not talent.
This shouldn’t be too surprising. The stock market is a case study in randomness, a system so complex that it’s impossible to predict. Nevertheless, professional investors routinely believe that they can see what others can’t. The end result is that they make far too many trades, with costly consequences.
And it’s not just investors who suffer from this mental flaw. The typical entrepreneur believes that he or she has a 60% chance of success, though less than 35% of small businesses survive more than five years. Meanwhile, CEOs who hold more company stock—taken here as a sign of self-confidence—also tend to make more irresponsible decisions, overpaying for acquisitions and engaging in misguided mergers.
Even consumers are hurt by this bias. A recent survey of American homeowners found that they expected, on average, to spend about $18,500 on remodelling their kitchens. The actual average cost? Nearly $39,000.
We like to see ourselves as a Promethean species, uniquely endowed with the gift of reason. But Mr. Kahneman’s simple experiments reveal a very different mind, stuffed full of habits that, in most situations, lead us astray. Though overconfidence may encourage us to take necessary risks—Mr. Kahneman calls it the “engine of capitalism”—it’s generally a dangerous (and expensive) illusion.
What’s even more upsetting is that these habits are virtually impossible to fix. As Mr. Kahneman himself admits, “My intuitive thinking is just as prone to overconfidence, extreme predictions and the planning fallacy as it was before I made a study of these issues.”
Even when we know why we stumble, we still find a way to fall.
— Jonah Lehrer, The Wall Street Journal