As humans, we like to think we're special. We might realize that other people are fallible, that their brains are prone to consistent cognitive errors. But surely that can't be true of us. And as investors, the idea is even more painful: What if cognitive biases we can't even see end up warping our thoughts on a company and we miss out on an opportunity -- or see a fantastic chance where none exists?
But we are, in fact, all human. That means we can enjoy avocado toast and the Hamilton soundtrack, but it also means our brains are susceptible to mistakes and biases that can affect our decision-making. Fortunately, those brains are also capable of learning about those mistakes and biases, and that helps us correct for them. Here are three of the most common biases, and how to prevent them from affecting your investing.
Confirmation bias is the tendency to seek out information that confirms our preconceived notions. Say you're considering investing in a company that has strong revenue growth, great capital allocation, and stellar net income. On the conference calls, you find the executive team engaging; they regularly set ambitious targets and consistently exceed them. You decide to look up the company on Glassdoor.com to see the opinions of some lower-level employees (which is very Foolish of you!), and you find it has a 3-star rating: by definition, average.
This is where you may find yourself explaining away some of the lower reviews. "You'll never be able to please every employee," you might tell yourself, or "This employee was mad in such a specific situation, it doesn't reflect anything about the broader company." That's if you even look at the negative reviews at all! You might be much more inclined to only click on the 4- and 5-star reviews. And that's understandable: You like this company and you want to keep liking it, so you seek out the information that helps you do that. But this hinders your ability to get a holistic view of the business and to analyze it effectively. (Choosing not to read reviews that note a dishonest corporate culture or ill-advised financial decisions, for example, could cost you.)
The availability heuristic (fancy term for "rule of thumb") describes the way we make broad judgments about the likelihood of an occurrence based on a fact, idea, or instance that quickly comes to mind. This happened to me recently: I was examining Boston Beer Company (NYSE: SAM) as a potential investment, and I was especially interested in its non-beer offerings, such as cider and alcoholic seltzer. I don't like the taste of beer (I know, and yes, I've tried a lot of different types), and many of my friends don't either. So before I began my research, I assumed that perhaps 20% of the population of alcohol-drinking age buys these offerings. The actual statistic is about 1%. I was wildly off, because I thought of myself and the many friends of mine who don't like beer and extrapolated that experience, incorrectly, to the entire alcohol-drinking population.
This is the framing effect: The way information is presented to us influences how we interpret it. It was famously demonstrated in a 1981 study by Amos Tversky and Daniel Kahneman (if you have any interest in learning more about this field, those are the two biggest names to know). The two researchers posed a scenario in which 600 people had contracted a hypothetical disease. They then asked the same question framed in two different ways:
Problem 1:
- If Program A is adopted, 200 people will be saved.
- If Program B is adopted, there is a 1/3 probability that 600 people will be saved and a 2/3 probability that no people will be saved.
- Which of the two programs would you favor?

Problem 2:
- If Program C is adopted, 400 people will die.
- If Program D is adopted, there is a 1/3 probability that nobody will die and a 2/3 probability that 600 people will die.
- Which of the two programs would you favor?
When faced with Problem 1, most people (72%) chose Program A, opting to save 200 lives for certain. But when faced with Problem 2, respondents were more likely (78%) to opt for Program D, the riskier option, even though Programs A and C (and likewise B and D) describe identical outcomes. The simple change in framing produced a clear difference in response, demonstrating that people tend to avoid risk when outcomes are framed as gains and to seek risk when the same outcomes are framed as losses.
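To see why the framing, not the math, drives the difference, it helps to work out the expected outcomes. This quick sketch (illustrative only, not part of the original study) confirms that all four programs have the same expected number of lives saved:

```python
# Illustrative check: the four programs in the Tversky-Kahneman disease
# scenario are numerically equivalent; only the gain/loss framing differs.

def expected_saved(outcomes):
    """outcomes: list of (probability, lives_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

program_a = [(1.0, 200)]              # 200 saved for certain (gain frame)
program_b = [(1/3, 600), (2/3, 0)]    # gamble, framed as gains
program_c = [(1.0, 600 - 400)]        # "400 will die" = 200 saved (loss frame)
program_d = [(1/3, 600), (2/3, 0)]    # same gamble, framed as losses

for name, prog in [("A", program_a), ("B", program_b),
                   ("C", program_c), ("D", program_d)]:
    print(f"Program {name}: expected lives saved = {expected_saved(prog):.0f}")
```

Every line prints an expected value of 200 lives saved, so a purely "rational" chooser would be indifferent among all four; the 72%-versus-78% split comes entirely from how the choice is worded.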
As investors, we can learn from this. Be skeptical when listening to conference calls, looking at investor-day presentations, and examining all areas of a company. Be aware of how earnings are framed, how metrics are pitched, and how effectively marketing presents the company in a favorable light.
Can humans operate bias-free? There are a ton more biases where those came from, and they're not easy to counteract. The good news is that a lot of benefit comes from just being aware of them. Another Motley Fool analyst, Jim Mueller, recently taught a class on behavioral finance here at HQ, and at the end he gave some great advice: Slow down. Ask yourself questions as you make your decisions. That small piece of wisdom alone could be key to protecting your portfolio.
I also like to try to back up my opinions with statistics or studies. I may be objectively right about the taste of beer, but that doesn't mean I should rely exclusively on my gut feeling. It's always worth checking the data on what you "know" to be true. We cannot erase our biases completely, but by being aware of them and taking steps to counteract them, we can minimize their impact on our returns.