Thursday 15 November 2012

"Thinking, fast and slow"

My classmates used to tell me that I got high grades in oral exams because of my confident argumentation. I used to believe it was true. But that was school: it was easy to make assumptions, prove a theory, or apply a specific formula, and consequently to sound certain.
Recently, however, I have noticed that, unfortunately, in some situations I was misled by my own perceptions of reality, my own arguments, or even my own knowledge of a particular subject. Especially now, after joining one of the biggest Danish companies, I have completely changed my view of certainty, having experienced "real-world" uncertainty. Because of business complexity, my perceptions, intuitive or not, can be misleading. It is not that my understanding of, or solution to, a given problem is wrong, but as the world becomes more complex and information more limited, my decisions become more biased and my arguments less convincing.

I am actually becoming more suspicious of my own judgment of a situation. I don't even know whether that is for good or for bad. I recognize that it is hard to avoid making mistakes and easy to misunderstand a situation. But the good news is that this also applies to "powerful" people, who unfortunately cannot see their own limitations. Their perception of a situation can also be just an illusion.
At this stage of my career, "Thinking, Fast and Slow" by Daniel Kahneman, Nobel Laureate in Economics, is the right book to read. He divides our mind into two systems, System 1 and System 2.
System 1 is an automatic system, responsible for our behavior when we rely on feelings, habits, instincts, emotions, illusions and perceptions. Some routine tasks can indeed be performed by System 1. When we talk about System 1, we say things like: "This was a pure System 1 response. She reacted to the threat before she recognized it."
System 2 is a slow system that requires more mental effort to rely on. For example, "mental arithmetic", concentration and self-control are its distinguishing features. So System 1 suggests, and System 2 controls and calculates.
According to the author, System 1 is rapid and often based on our intuitive judgment. At the same time, the author gives "scientific" examples of how the "syndrome" of familiarity creates an illusion of remembering, or how mere repetition can create an illusion of truth. For example, the familiarity of one phrase in a statement suffices to make the whole statement feel familiar, and therefore true. Thus, we tend to rely on System 1.
We also rely on System 1 because System 2 is a LAZY system. Applying "System 1 vs. System 2" to my own situation, it all makes sense. In an uncertain situation, we put too much weight on System 1 when judging a situation or making a decision, which consequently creates even more uncertainty.
I found this book a good reminder to keep our critical sense. I felt it contained some important messages.
To economic forecasters and policy makers: be more critical when granting tax money legitimized by research, which could actually be based on inadequate evidence.
To business decision makers: "you can be blind to the obvious, but we are also blind to our blindness" (my favorite quote; I was looking forward to applying it in this context).
To us, ordinary people: "constantly questioning our own thinking would be impossibly tedious… the best we can do is to make a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high".
If we remembered that the System 1 response is less reliable (I mean in a business context) and all let System 2 take control, the world would probably be different, maybe less uncertain?
