I've read 284 books in the last four years. Some were awful. Others were incredible. But I took notes on all of them.
I've never known what to do with these notes. Then I got an idea: I'll dump some of the highlights into a weekly article. This is the first one of what I hope will be many.
Risk Savvy by Gerd Gigerenzer shows why most people make dumb decisions: We were never trained how to interpret risk. Here are six things I learned from this book.
1. Risk is a language most of us don't speak:
Literacy—the ability to read and write—is the lifeblood of an informed citizenship in a democracy. But knowing how to read and write isn't enough. Risk literacy is the basic knowledge required to deal with a modern technological society. The breakneck speed of technological innovation will make risk literacy as indispensable in the twenty-first century as reading and writing were in previous centuries. Without it, you jeopardize your health and money, or may be manipulated into unrealistic fears and hopes. One might think that the basics of risk literacy are already being taught. Yet you will look in vain for it in most high schools, law schools, medical schools, and beyond. As a result, most of us are risk illiterate ...
People aren't stupid. The problem is that our educational system has an amazing blind spot concerning risk literacy. We teach our children the mathematics of certainty—geometry and trigonometry—but not the mathematics of uncertainty, statistical thinking.
And we teach our children biology but not the psychology that shapes their fears and desires. Even experts, shockingly, are not trained how to communicate risks to the public in an understandable way. And there can be positive interest in scaring people: to get an article on the front page, to persuade people to relinquish civil rights, or to sell a product. All these outside causes contribute to the problem.
2. The more complex a risk is, the simpler a solution we need to find:
When we face a complex problem, we look for a complex solution. And when it doesn't work, we seek an even more complex one. In an uncertain world, that's a big error. Complex problems do not always require complex solutions. Overly complicated systems, from financial derivatives to tax systems, are difficult to comprehend, easy to exploit, and possibly dangerous. And they do not increase the trust of the people. Simple rules, in contrast, can make us smart and create a safer world.
3. Technology and sophistication increase our confidence in predictions faster than they improve the accuracy of those predictions:
Many of us ask for certainty from our bankers, our doctors, and our political leaders. What they deliver in response is the illusion of certainty, the belief that something is certain even when it isn't. Every year we support a multibillion-dollar industry that calculates future predictions, mostly erroneous, from market tips to global flu pandemics. Many of us smile at old-fashioned fortune-tellers. But when the soothsayers work with computer algorithms rather than tarot cards, we take their predictions seriously and are prepared to pay for them. The most astounding part is our collective amnesia: Most of us are still anxious to see stock market predictions even if they have been consistently wrong year after year.
4. Good decision-making requires multiple ways of thinking:
In an uncertain world, it is impossible to determine the optimal course of action by calculating the exact risks. We have to deal with "unknown unknowns." Surprises happen. Even when calculation does not provide a clear answer, however, we have to make decisions. Thankfully, we can do much better than frantically clinging to and tumbling off Fortuna's wheel. Fortuna and Sapientia had a second brainchild alongside mathematical probability, which is often passed over: rules of thumb, known in scientific language as heuristics. When making decisions, two sets of mental tools are required:
- RISK: If risks are known, good decisions require logic and statistical thinking.
- UNCERTAINTY: If some risks are unknown, good decisions also require intuition and smart rules of thumb.
Most of the time, a combination of both is needed. Some things can be calculated, others not, and what can be calculated is often only a crude estimate.
5. Rules of thumb are really powerful:
Because those who took part in the experiment were German, we came up with questions about the population of German cities (which we assumed would be easy) and U.S. cities (hard).
We chose the seventy-five largest cities in each country. For instance, "Which city has a larger population: Detroit or Milwaukee?" "Which city has a larger population: Bielefeld or Hanover?"
The result blew our minds. Germans didn't do best on questions about German cities, about which they knew a lot; they did slightly better on American cities, about which they knew little. We'd made an error in assuming that knowing more always leads to better inferences. The experiment was ruined.
But this error led us to discover something new, which we called the recognition heuristic: If you recognize the name of one city but not that of the other, then infer that the recognized city has the larger population. Many Germans had never heard of Milwaukee, and so they correctly concluded that Detroit has the larger population. Because they were familiar with both Bielefeld and Hanover, however, the rule of thumb didn't work for this question.
An American who has never heard of Bielefeld will correctly infer that Hanover has more inhabitants, but Germans have a hard time. Similarly, in another study, only 60 percent of Americans correctly answered that Detroit is larger than Milwaukee, while some 90 percent of Germans got it right. The recognition heuristic takes advantage of the wisdom in semi-ignorance.
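The recognition heuristic described above is simple enough to write down as a rule. Here is a minimal sketch in Python; the recognition set and the city pairs are illustrative assumptions standing in for what a participant might or might not recognize, not data from Gigerenzer's study:

```python
def recognition_heuristic(city_a, city_b, recognized):
    """Infer which of two cities has the larger population.

    If exactly one city is recognized, guess that one. If both or
    neither are recognized, the heuristic gives no answer and the
    decision has to fall back on other knowledge.
    """
    a_known = city_a in recognized
    b_known = city_b in recognized
    if a_known and not b_known:
        return city_a
    if b_known and not a_known:
        return city_b
    return None  # heuristic doesn't apply


# Hypothetical American participant: has heard of Hanover but not
# Bielefeld, and knows both U.S. cities.
american_recognizes = {"Hanover", "Detroit", "Milwaukee"}

# One city recognized: the heuristic picks it (correctly, here).
print(recognition_heuristic("Bielefeld", "Hanover", american_recognizes))
# → Hanover

# Both cities recognized: the heuristic is silent, which is exactly
# why Germans outperformed Americans on Detroit vs. Milwaukee.
print(recognition_heuristic("Detroit", "Milwaukee", american_recognizes))
# → None
```

The interesting design point is the `None` branch: the rule knows when it doesn't apply, which is what lets semi-ignorance beat full familiarity.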
6. This is really important stuff:
If we spent the same amount of money on developing a health literacy program to make children risk savvy as on developing new cancer drugs, I wager that the health literacy program would save many more lives from cancer.
We may not save every child from an unhealthy lifestyle, but if we save as few as 10 to 20 percent of the next generation, we will be more successful than further research on new drugs in the fight against cancer. We would also see more teenagers without obesity, smoking, and alcohol problems, as well as more healthy adults in general. We do not have to wait until the children grow old to see if it's successful. The efficacy of such a health literacy program can already be measured when the children are adolescents, by the number of those who smoke, get drunk, are obese, or have other health problems. And the skills children learn can not only increase health in general but also help them lead a more self-controlled life.
Go buy the book. It's fantastic.
Contact Morgan Housel at email@example.com. The Motley Fool has a disclosure policy.