We often assume things will come out the way we think they should—stocks with great performance should have great stories driving their gains. Great companies should be great investments; boring companies should be boring investments. In forming subjective judgments, people look for familiar patterns, relying on well-worn stereotypes. These mental shortcuts are called heuristics, or mental rules of thumb. In many instances these shortcuts are helpful, but not when it comes to investing. Here, they frequently lead to errors in judgment.
We’ve seen, for example, that people largely ignore how frequently something occurs. These odds are called base rates. Base rates are among the most illuminating statistics that exist. They’re just like batting averages. For example, if a town of 100,000 people has 70,000 lawyers and 30,000 librarians, the base rate for lawyers in that town is 70 percent. When used in the stock market, base rates tell you what to expect from a certain class of stocks (e.g., all stocks with high dividend yields) and what that variable generally predicts for the future. But base rates tell you nothing about how each individual member of that class will behave.
Most statistical prediction techniques use base rates. Seventy-five percent of university students with grade point averages above 3.5 go on to do well in graduate school. Smokers are twice as likely to get cancer. Stocks with low price-to-earnings ratios outperform the market 65 percent of the time. The best way to predict the future is to bet with the base rate that is derived from a large sample. Yet numerous studies have found that people make full use of base rate information only when there is a lack of descriptive data. In one example, people are told that out of a sample of 100 people, 70 are lawyers and 30 are engineers. When provided with no additional information and asked to guess the occupations of 10 people selected at random, people use the base rate, saying all 10 are lawyers, since by doing so they assure themselves of getting the most right.
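The arithmetic behind “getting the most right” is worth making explicit. A minimal sketch, using only the 70/30 split from the example, compares always betting the base rate against the intuitively appealing (but inferior) strategy of spreading one’s guesses to mirror the odds:

```python
# Expected number of correct guesses out of 10, given a 70 percent
# lawyer base rate. Illustrative arithmetic only.

base_rate = 0.7   # fraction of lawyers in the sample
n = 10            # number of people whose occupation we guess

# Strategy 1: always bet the base rate ("all 10 are lawyers").
expected_all_lawyers = n * base_rate

# Strategy 2: "probability matching" -- guess lawyer 70 percent of the
# time and engineer 30 percent of the time. A guess is right when it
# happens to agree with the person's actual occupation.
p_correct_per_guess = base_rate * base_rate + (1 - base_rate) * (1 - base_rate)
expected_matching = n * p_correct_per_guess

print(round(expected_all_lawyers, 2))  # 7.0
print(round(expected_matching, 2))     # 5.8
```

Betting the base rate every time yields seven expected hits out of ten; trying to “call” the three engineers drops the expectation below six.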
However, when worthless but descriptive data is added, such as “Dick is a highly motivated 30-year-old married man who is well liked by his colleagues,” people largely ignore the base rate information in favor of their “feel” for the person. They are certain that their unique insights will help them make a better forecast, even when the additional information is meaningless. We prefer descriptive data to impersonal statistics because it better represents our individual experience. When stereotypical information is added, such as “Dick is 30 years old, married, shows no interest in politics or social issues and likes to spend his free time on his many hobbies, which include carpentry and mathematical puzzles,” people totally ignore the base rate and bet Dick is an engineer, despite the 70 percent chance that he is a lawyer. One can even jack the base rate for lawyers up to over 90 percent, and people will cling to their stereotypical opinion of an engineer.
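Bayes’ rule shows how much work a stereotype must do to legitimately overcome a base rate. In the sketch below, the 4-to-1 likelihood ratio is a hypothetical number assumed purely for illustration; only the 70 and 90 percent base rates come from the example:

```python
# Bayes' rule applied to the lawyer/engineer example. Suppose (purely
# hypothetically) that Dick's description is 4x as likely to fit an
# engineer as a lawyer.

def posterior_engineer(base_rate_lawyer, likelihood_ratio):
    """P(engineer | description), given P(lawyer) and the ratio
    P(description | engineer) / P(description | lawyer)."""
    p_eng = 1 - base_rate_lawyer
    return (p_eng * likelihood_ratio) / (p_eng * likelihood_ratio + base_rate_lawyer)

# With a 70 percent lawyer base rate, the stereotype barely wins:
print(round(posterior_engineer(0.7, 4.0), 3))  # 0.632

# With a 90 percent lawyer base rate, the same stereotype should lose:
print(round(posterior_engineer(0.9, 4.0), 3))  # 0.308
```

Even a strongly engineer-sounding description leaves Dick more likely to be a lawyer once the base rate climbs past 90 percent, which is exactly the calculation people refuse to make.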
Simple Solutions Are Usually the Best
We also prefer complex explanations to simple ones, even though most of the advances made in science over the last 1,000 years have been guided by Occam’s razor, which states that when there are competing solutions to a problem, the simplest one is generally correct. To demonstrate this preference, professor Alex Bavelas designed a fascinating experiment in which two subjects, Smith and Jones, face individual projection screens. They cannot see or communicate with each other. They’re told the purpose of the experiment is to learn to recognize the difference between healthy and sick cells, and they must learn to distinguish between the two through trial and error. In front of each are two buttons marked Healthy and Sick, along with two signal lights marked Right and Wrong. Every time a slide is projected they guess if it’s healthy or sick by pressing the button so marked. After they guess, their signal light will flash Right or Wrong, informing them if they have guessed correctly.
Here’s the hitch. Smith gets true feedback. If he’s correct, his light flashes Right; if he’s wrong, it flashes Wrong. Since he’s getting true feedback, Smith soon gets around 80 percent correct, since it’s a matter of simple discrimination.
Jones’ situation is entirely different. He doesn’t get true feedback based on his guesses. Rather, the feedback he gets is based on Smith’s guesses! It doesn’t matter if he’s right or wrong about a particular slide; he’s told he’s right if Smith guessed right and wrong if Smith guessed wrong. Of course, Jones doesn’t know this. He’s been told there is a true order that he can discover from the feedback. He ends up searching for order when there is no way to find it.
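The yoked-feedback design can be sketched in a few lines. Everything quantitative here is assumed for illustration (an 80 percent accurate Smith, a blindly guessing Jones); the point is only that the signal Jones receives carries no information about his own guesses:

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

trials = 10_000
matches = 0  # how often Jones's feedback agrees with his actual correctness
for _ in range(trials):
    smith_correct = random.random() < 0.8         # Smith, with true feedback, is ~80% right
    jones_correct = random.choice([True, False])  # Jones is effectively guessing blindly
    feedback_to_jones = smith_correct             # yoked: mirrors Smith, not Jones
    if feedback_to_jones == jones_correct:
        matches += 1

# "Right" flashes for Jones about half the time he is actually right --
# the feedback is statistically unrelated to his own performance.
print(matches / trials)
```

Because the feedback agrees with Jones’s own correctness only at chance levels, any “rule” he extracts from it is necessarily an elaborate rationalization of noise.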
The moderator then asks Smith and Jones to discuss the rules they use for judging healthy and sick cells. Smith, who got true feedback, offers rules that are simple, concrete and to the point. Jones, on the other hand, uses rules that are, out of necessity, subtle, complex and highly adorned. After all, he had to base his opinions on contradictory guesses and hunches.
The amazing thing is that Smith doesn’t think Jones’ explanations are absurd, crazy or unnecessarily complicated. He’s impressed by the “brilliance” of Jones’ method and feels inferior and vulnerable because of the pedestrian simplicity of his own rules. The more complicated and ornate Jones’ explanations, the more likely they are to convince Smith.
Before the next test with new slides, the two are asked to guess who will do better this time around. All Joneses and most Smiths say that Jones will. In fact, Jones shows no improvement at all. Smith, on the other hand, does significantly worse than he did the first time around since he’s now making guesses based on some of the complicated rules he learned from Jones.
Such is the state of Homo Economicus—even though we can learn and rationally understand why we make the investing mistakes we do, we are destined to repeat them. We are hard-wired to act the way we do. Neurobiologists are proving this with PET scans of our brains—when making decisions under uncertainty, the rational part of the brain is mostly dormant, but the emotional part fires away! In his book Mean Markets and Lizard Brains, Terry Burnham says that there are biological causes for irrational financial behavior, and these in turn cause market panics and crashes. We literally are reverting to our “lizard brain” when faced with the emotion-jarring task of investing our money. He points out what a recent study at MIT confirmed—the most successful investors are those who have a system in place to guard against emotional decisions.
Indeed, having a guiding, unemotional system might be the only way to successfully guard against making the same mistakes time and again. As Woody Dorsey says in his book Behavioral Trading: Methods for Measuring Investor Confidence, Expectations, and Market Trends: “What is the difference between hunter-gatherer guys and gals of 40,000 years ago and our contemporary go-getters? Nothing. The competitive urge is basic and perpetual.”
Financial markets have alternated between booms and busts for hundreds of years. Each generation falls prey to the fads, fallacies, enthusiasms and stories of its era, most often when the market is at or near the end of one cycle and the beginning of the next. The problem is, investors make decisions based on information they learned about as it unfolded—information that proves nearly useless in the market’s next phase. This explains why investors so predictably shun stocks and bonds near market bottoms but buy with abandon near market tops. It seems each generation is amused by the folly of those that preceded it while remaining totally ignorant of its own.
To understand that our earlier theories of rational human decision-making were fatally flawed, we must pay attention to the actual data and the actual way we make choices. As Maurice Allais, the eminent French economist and winner of the 1988 Nobel Memorial Prize, says, “I have never hesitated to question commonly accepted theories when they appeared to me to be founded on hypotheses which implied consequences incompatible with observed data. Dominant ideas, however erroneous they may be, end up, simply through repetition, by acquiring the quality of established truths which cannot be questioned without confronting the active ostracism of the establishment.”
By ignoring all of the experimental data that has accumulated over the last 50 years, we continually put ourselves in harm’s way, and continue to make exactly the same mistakes, generation after generation. It seems our very humanity is what makes this endless cycle a permanent facet of our investment lives.