The belief that human beings are fundamentally rational individuals is perhaps the most important foundational element of modern economics. While no economist would ever claim that all individuals are rational all the time, it has long been thought that rational choice governs human behavior frequently enough to be a dominant force driving the economy.
The concept of the invisible hand, first put forward by Adam Smith more than 250 years ago, describes the means by which the individual pursuit of rational self-interest naturally leads toward socially desirable ends. Over time, this concept has been expanded upon and refined, perhaps most notably through the recognition that individual self-interest can impose externalities on others, which implies a regulatory role for government. However, few economists seriously questioned the basic premise of human rationality, most likely because doing so would have required incorporating concepts from other intellectual disciplines.
Daniel Kahneman and Amos Tversky did not start their careers with the intention of one day upending the field of economics. At the start of their collaboration in the late 1960s, they had been shaped by experiences that were superficially similar but, in fact, quite different. Although Daniel Kahneman was born in Tel Aviv in 1934, he spent his childhood in France during the Second World War. After experiencing many hardships in Nazi-occupied France, including the death of his father, he returned to Israel after the war as a socially awkward young man with a deep interest in what drives seemingly inexplicable human behavior. Amos Tversky was born in Haifa in 1937 and was shaped by the experience of growing up in Israel during its struggle for independence. He was an extrovert who seemed to be in constant motion. One wouldn’t have expected these two individuals to be friends, let alone end up collaborating in ways that would shake the foundations of economics, but that is precisely what happened.
It takes a skilled writer to combine elements of a biography with a discourse on psychology and economics but Michael Lewis was up to the challenge. The Undoing Project: A Friendship That Changed Our Minds is the remarkable story of Daniel Kahneman and Amos Tversky and the impact of their professional collaboration, but at its core the book is the story of two friends. Mr. Lewis gets us to think in terms of “Danny” and “Amos” and this human story adds to the context of their professional accomplishments.
By the time of “the collision” between Daniel Kahneman and Amos Tversky in 1969, they had similar resumes despite their very different personalities. Both had served in the Israeli military and had earned doctorates in psychology in the United States. They had both returned to Hebrew University and were thought to be rivals, yet they entered into a long collaboration in which their respective skills and personalities combined to create work that neither man could have generated on his own. The failure of human intuition – the fact that people are not natural statisticians – was a major element of their early work. Rather than being intuitive statisticians, people seemed to use “rules of thumb”, or heuristics, to make judgments.
Anchoring and Adjustment
One of the most interesting heuristics is referred to as “anchoring and adjustment”. This is a heuristic that is directly applicable to investors and, once fully understood, quite alarming if our goal is to make rational decisions. Consider the following example:
Two groups of high school students are given five seconds to guess the answer to a math problem that would require more time to actually solve. The first group is asked to estimate this product:
8 x 7 x 6 x 5 x 4 x 3 x 2 x 1
The second group is asked to estimate this product:
1 x 2 x 3 x 4 x 5 x 6 x 7 x 8
The study found that the first group’s median answer was 2,250. The second group’s median answer was 512. The correct answer is 40,320. What could account for the huge difference in the estimates from the two groups? The first group was anchored by the fact that the first number in the sequence is eight. The second group anchored on the first number in the sequence being one.
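Both groups were, of course, estimating the same quantity; a one-line check (in Python, for illustration) confirms the correct answer quoted above:

```python
import math

# The two sequences contain the same factors in a different order,
# so the products are identical.
descending = math.prod([8, 7, 6, 5, 4, 3, 2, 1])
ascending = math.prod([1, 2, 3, 4, 5, 6, 7, 8])

print(descending, ascending)  # both are 8! = 40320
```

Both groups anchored on the first few partial products, which fall far short of the true value of 40,320, but the descending sequence produces larger early partial products and therefore a higher anchor.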
This might seem like a clever parlor trick, but additional research revealed many other examples of systematic bias based on the anchoring effect. People tend to anchor on information that is not at all relevant to the problem they are being asked to solve. Another example involved asking people to spin a “wheel of fortune” with slots on it numbered from 0 to 100. They were then asked to estimate the percentage of United Nations member countries that are African. The individuals who landed on a high number on the wheel of fortune guessed a higher percentage than individuals who landed on a low number. Subliminal anchoring appears to be a systematic flaw in human thinking.
The anchoring effect has important implications for investors. When evaluating a company as a possible investment, to what extent do we allow the current stock price or market capitalization to influence our estimate of what the company is worth? If we listen to management presentations, to what extent does earnings guidance influence our own estimate of future earnings? The evidence suggests that we are prone to taking the first number we see, whether it is a stock price or an earnings estimate, and adjusting from that point rather than forming our own valuation in an unbiased manner.
Prospect Theory and Loss Aversion
In 2002, Daniel Kahneman won the Nobel Prize in Economics for his contributions to prospect theory, which was developed in collaboration with Amos Tversky, who died in 1996 at the age of 59. Following the publication of the main paper describing prospect theory in 1979, the two men grew apart as Dr. Tversky appeared to receive most of the professional recognition from the collaboration. The very human story of these later years is described well by Mr. Lewis and reinforces the fact that even those who study the human mind professionally are not at all immune to the negative aspects of human emotion, including jealousy, envy, arrogance, and just basic misunderstanding.
While the human part of the story is very interesting, prospect theory itself is fascinating. In 2011, Daniel Kahneman published Thinking, Fast and Slow, a book that describes prospect theory and many additional topics in a format intended for a non-academic audience. For investors, the principles Dr. Kahneman describes are just as important as having a solid grasp of business fundamentals and valuation. One very important concept involves loss aversion. Consider the exhibit below:
Through a series of experiments and surveys, Dr. Kahneman demonstrates that individuals feel the benefits of gains much less forcefully than the pain of losses. Investors can ask themselves questions such as “What is the smallest gain that I need to balance an equal chance of losing $100?” Many people will answer $200, or twice as much as the potential loss. Obviously, each investor will answer this question differently but studies have shown that the “loss aversion ratio” is usually in a range of 1.5 to 2.5.
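The asymmetry can be made concrete with the value function Kahneman and Tversky proposed in prospect theory: outcomes are valued relative to a reference point, with losses scaled up by a loss-aversion coefficient λ. The sketch below uses parameter estimates commonly cited from their later work (α ≈ 0.88, λ ≈ 2.25); the function names are my own, and the parameters are illustrative rather than universal.

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect theory value function: concave for gains,
    steeper (by the loss-aversion coefficient lam) for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

def breakeven_gain(loss, alpha=0.88, lam=2.25):
    """Smallest gain g such that a 50/50 gamble between gaining g and
    losing `loss` has zero prospect value:
    g**alpha = lam * loss**alpha  =>  g = lam**(1/alpha) * loss."""
    return lam ** (1 / alpha) * loss

print(round(breakeven_gain(100)))  # roughly $250 to offset a $100 loss
```

With these particular parameters the break-even gain works out to about 2.5 times the loss, at the top of the 1.5 to 2.5 range of loss aversion ratios cited above.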
We feel the pain of losses much more intensely than the pleasure of gains, which has important implications. For one thing, the more often we monitor our investments or the stock market as a whole, the more often we will be faced with “paper losses”. An investor who checks the value of his holdings once per quarter will obviously experience paper losses from time to time, but much less often than an investor who checks quotations every week, every day, or every minute. We can minimize pain by changing the timeframe that we use to evaluate gains and losses.
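The effect of monitoring frequency can be illustrated with a toy simulation (my own construction, not from the book): draw random returns with a positive long-run drift, scaled to the length of the checking interval, and count how often a check shows a paper loss. The drift and volatility figures (7% and 15% annualized) are illustrative assumptions.

```python
import random

def loss_frequency(checks_per_year, years=200, mu=0.07, sigma=0.15, seed=42):
    """Fraction of checking intervals that show a paper loss, assuming
    normally distributed returns with annual drift mu and volatility sigma."""
    random.seed(seed)
    dt = 1 / checks_per_year
    n = int(years * checks_per_year)
    losses = sum(1 for _ in range(n)
                 if random.gauss(mu * dt, sigma * dt ** 0.5) < 0)
    return losses / n

# The same market looks worse the more often you check it:
for label, freq in [("daily", 252), ("quarterly", 4), ("annually", 1)]:
    print(f"{label}: {loss_frequency(freq):.0%} of checks show a loss")
```

Over short intervals the drift is swamped by volatility, so nearly half of all daily checks show a loss, while far fewer annual checks do, even though the underlying returns are identical.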
We Are Not Fully Rational
Perhaps the most important point we can take away from the work of Daniel Kahneman and Amos Tversky is that we are all subject to systematic biases that lead us to make decisions that are not always strictly rational. People base their decisions on heuristics rather than a dispassionate evaluation of probability. Furthermore, knowing that this is the case provides no immunity. Experts in various fields have shown that gut feelings and rules of thumb often govern their behavior. The consequences can be severe.
What is the answer to this problem? Being aware of the fact that it exists is a good start but vigilance is required to reduce the chances of error, especially serious errors. The use of checklists has been shown to help people make better decisions and reduce risk. Adopting multiple mental models based on different disciplines is also very important. The fact that two Israeli psychologists dared to cross disciplines and upend centuries of economic assumptions proves this point. The story of their lives is fascinating and will likely induce readers to explore the subject in more detail.