A few smart sentences can change the way you think. Here are a few I read recently that got my mind twirling.
First, Microsoft researcher Duncan Watts writes in his book Everything Is Obvious (Once You Know the Answer) about how deceiving common sense can be:
That what is self-evident to one person can be seen as silly by another should give us pause about the reliability of common sense as a basis for understanding the world. How can we be confident that what we believe is right when someone else feels equally strongly that it's wrong—especially when we can't articulate why we think we're right in the first place? Of course, we can always write them off as crazy or ignorant or something and therefore not worth paying attention to. But once you go down that road, it gets increasingly hard to account for why we ourselves believe what we do. Consider, for example, that since 1996 support among the general public for allowing same-sex couples to marry has almost doubled, from 25 percent to 45 percent. Presumably those of us who changed our minds over this period do not think that we were crazy fourteen years ago, but we obviously think that we were wrong. So if something that seemed so obvious turned out to be wrong, what else that we believe to be self-evident now will seem wrong to us in the future?
Common sense often seems right because we can't imagine thinking about a problem differently, and the reason we can't think about it differently is because we're emotional and closed-minded. It's one of the most dangerous ways of thinking.
Next, author Daniel Goleman writes in his book Focus: The Hidden Driver of Excellence about the importance of personality:
Physicians who are sued for malpractice in the United States generally make no more medical errors than those who are not sued. The main difference, research shows, often comes down to the tenor of the doctor-patient relationship. Those who are sued, it turns out, have fewer signs of emotional rapport: they have shorter visits with patients, fail to ask about the patients' concerns or make sure their questions are answered, and have more emotional distance— there's little or no laughter, for example.
I wish they'd teach in school that if you're a jerk and can't work with people, it doesn't matter how smart or talented you are; you're probably going to fail.
Next, Cambridge economist Ha-Joon Chang turns the popular minimum-wage debate on its head:
Wages in rich countries are determined more by immigration control than anything else, including any minimum wage legislation. How is the immigration maximum determined? Not by the 'free' labor market, which, if left alone, will end up replacing 80–90 per cent of native workers with cheaper, and often more productive, immigrants. Immigration is largely settled by politics ... If the same market can be perceived to have varying degrees of freedom by different people, there is really no objective way to define how free that market is. In other words, the free market is an illusion. If some markets look free, it is only because we so totally accept the regulations that are propping them up that they become invisible.
Most things we argue about in the economy are presented as black and white, but they're almost always some shade of grey. If you zoom out and look at the big picture, people probably agree on more than appears on the surface.
Next, in the book 100 Plus: How the Coming Age of Longevity Will Change Everything, Sonia Arrison writes about how fast the world is changing:
The historical record shows that the distribution of new technology is speeding up, not slowing down. In the book Myths of Rich and Poor, economist Michael Cox and author Richard Alm note that it took forty-six years for one-quarter of the population to get electricity and thirty-five years for the telephone to get that far. It took only sixteen years, however, for one-quarter of American households to get a personal computer, thirteen years for a cell phone, and seven years for Internet access.
The faster things change, the less reliable forecasts become. We really have no idea what the world is going to look like in 10 years, or even five, because the things that will have the biggest impact probably haven't been invented yet.
Next, Nassim Taleb writes in his book Antifragile why we're so bad at thinking about risk:
Risk management professionals look in the past for information on the so-called worst-case scenario and use it to estimate future risks— this method is called "stress testing." They take the worst historical recession, the worst war, the worst historical move in interest rates, or the worst point in unemployment as an exact estimate for the worst future outcome. But they never notice the following inconsistency: this so-called worst-case event, when it happened, exceeded the worst case at the time. I have called this mental defect the Lucretius problem, after the Latin poetic philosopher who wrote that the fool believes that the tallest mountain in the world will be equal to the tallest one he has observed. We consider the biggest object of any kind that we have seen in our lives or hear about as the largest item that can possibly exist. And we have been doing this for millennia. In Pharaonic Egypt, which happens to be the first complete top-down nation-state managed by bureaucrats, scribes tracked the high-water mark of the Nile and used it as an estimate for a future worst-case scenario.
According to Nate Silver, the Fukushima nuclear reactor was designed to withstand a magnitude 8.6 earthquake, because that's what seismologists thought was the largest possible quake that could hit the region. Then came the 9.1 earthquake in 2011. Most things in life work better when built with a large margin of error.
Next, Scott Adams writes in his book How to Fail at Almost Anything and Still Win:
A person with a flexible schedule and average resources will be happier than a rich person who has everything except a flexible schedule. Step one in your search for happiness is to continually work toward having control of your schedule.
Having control of your time is the only reasonable financial goal.
Next, Joshua Foer writes in his book Moonwalking with Einstein:
People have been swimming for as long as people have been getting neck-deep in water. You'd think that as a species, we'd have maxed out how fast we could swim long ago. And yet new swimming records are set every year. Humans keep getting faster and faster. "Olympic swimmers from early this century would not even qualify for swim teams at competitive high schools," notes Ericsson. Likewise, "the gold medal performance at the original Olympic marathon is regularly attained by amateurs just to qualify as a participant in the Boston Marathon." And the same is true not just of athletic pursuits, but in virtually every field. The thirteenth-century philosopher Roger Bacon claimed that "nobody can obtain to proficiency in the science of mathematics by the method hitherto known unless he devotes to its study thirty or forty years." Today, the entire body of mathematics known to Bacon is now acquired by your average high school junior.
Skills grow just like compound interest, with one generation leveraging the talents of the last. Pessimists should keep this in mind.
Next, Elliot Aronson and Carol Tavris write in their book Mistakes Were Made (But Not by Me):
The brain is designed with blind spots, and one of its cleverest tricks is to confer on us the comforting delusion that we, personally, do not have any ... Social psychologist Lee Ross calls this phenomenon "naïve realism," the inescapable conviction that we perceive objects and events clearly, "as they really are." We assume that other reasonable people see things the same way we do. If they disagree with us, they obviously aren't seeing clearly. Naïve realism creates a logical labyrinth because it presupposes two things: One, people who are open-minded and fair ought to agree with a reasonable opinion. And two, any opinion I hold must be reasonable; if it weren't, I wouldn't hold it. Therefore, if I can just get my opponents to sit down here and listen to me, so I can tell them how things really are, they will agree with me. And if they don't, it must be because they are biased.
A lot of people enjoy reading about behavioral finance, and all the ways people can be stupid with money, but fail to realize they're reading about themselves.
Last, here's Sam Arbesman in his book The Half-life of Facts:
Two Australian surgeons found that half of the facts in that field also become false every forty-five years. As the French scientists noted, all of these results verify the first half of a well-known medical aphorism by John Hughlings Jackson, a British neurologist in the nineteenth and early twentieth centuries: "It takes 50 years to get a wrong idea out of medicine, and 100 years a right one into medicine." ... Max Planck codified this in a maxim: "New scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."
Even when you get the facts right, you're still probably half wrong.
Food for thought.
Check back every Tuesday and Friday for Morgan Housel's columns on finance and economics.
Contact Morgan Housel at email@example.com. The Motley Fool has a disclosure policy.