Statistician Nate Silver correctly predicted the outcome of every state in the 2012 presidential election. The feat instantly shot him to fame in a field most people associate with the most boring class they ever took. He's been on The Daily Show twice. He has more than a million Twitter followers.

But the most important part of Silver's analysis is that he's not really making predictions. Not in the way most people think of predictions, at least.

You will never hear Silver say, "He is going to win the election." You might hear him say, "He has a 60% chance of winning," or "The odds are in her favor." Pundits make predictions. Nate Silver calculates probabilities.

Any probability less than 100% admits more than one possible outcome. Silver gave Obama a 60% chance of winning Florida in the 2012 election, which, of course, implied a 40% chance that he would lose it. Silver's pre-election probability map gave Obama the edge. But had Mitt Romney won the state, it wouldn't necessarily have meant Silver was wrong. In his book The Signal and the Noise, Silver wrote:

Political partisans may misinterpret the role of uncertainty in a forecast; they will think of it as hedging your bets and building in an excuse for yourself in case you get the prediction wrong. That is not really the idea. If you forecast that a particular incumbent congressman will win his race 90 percent of the time, you're also forecasting that he should lose it 10 percent of the time. The signature of a good forecast is that each of these probabilities turns out to be about right over the long run ... We can perhaps never know the truth with 100 percent certainty, but making correct predictions is the way to tell if we're getting closer.
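Silver's test can be made concrete. Here's a minimal sketch in Python of the calibration check he's describing, with simulated forecasts standing in for real ones and invented probability buckets: group forecasts by their stated probability and see how often each group actually came true.

```python
import random

# A well-calibrated forecaster's 60% calls should come true about 60% of the
# time, the 90% calls about 90% of the time, and so on. Simulate 10,000
# forecasts in a world where the stated odds are honest, then check.
random.seed(42)

forecasts = []  # (stated probability, whether the event happened)
for _ in range(10_000):
    p = random.choice([0.6, 0.7, 0.8, 0.9])  # the forecaster's stated odds
    happened = random.random() < p           # outcome drawn at those odds
    forecasts.append((p, happened))

# Group by stated probability and compare with observed frequency.
for p in sorted({stated for stated, _ in forecasts}):
    outcomes = [happened for stated, happened in forecasts if stated == p]
    print(f"forecast at {p:.0%}: came true {sum(outcomes) / len(outcomes):.1%} of the time")
```

A forecaster whose 90% calls come true only 70% of the time fails this test no matter how many individual calls they got "right."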

What sets Silver apart is that he thinks of the world in probabilities, while the punditry crowd of coin-flipping charlatans thinks in black-and-white certainties. His mind is open to a range of potential outcomes before, during, and -- most important -- after he's made his forecast. Things might go this way, or they might go that way. He adjusts the odds of each outcome as new information arrives. It's the most effective way to think about the future.
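As a rough illustration of what adjusting the odds looks like, here's a toy Bayes' rule update in Python. The numbers are invented for the example; Silver's actual models are far more elaborate.

```python
# Toy example: update a candidate's win probability after a favorable poll.
prior = 0.60            # chance of winning before the poll (invented)
p_poll_if_win = 0.70    # chance of seeing this poll if the candidate is winning
p_poll_if_lose = 0.30   # chance of seeing this poll if the candidate is losing

# Bayes' rule: P(win | poll) = P(poll | win) * P(win) / P(poll)
p_poll = p_poll_if_win * prior + p_poll_if_lose * (1 - prior)
posterior = p_poll_if_win * prior / p_poll

print(f"before the poll: {prior:.0%}, after the poll: {posterior:.0%}")  # 60% -> 78%
```

The forecast moves, but it doesn't jump to certainty: one favorable poll takes the odds from 60% to about 78%, not to 100%.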

Why don't more people think like Nate Silver?

Twenty years ago, Berkshire Hathaway vice chairman Charlie Munger gave a talk called The Psychology of Human Misjudgment, in which he listed 25 biases that lead to bad decisions. One is the "Doubt-Avoidance Tendency," which he described this way:

The brain of man is programmed with a tendency to quickly remove doubt by reaching some decision.

It is easy to see how evolution would make animals, over the eons, drift toward such quick elimination of doubt. After all, the one thing that is surely counterproductive for a prey animal that is threatened by a predator is to take a long time in deciding what to do.

In other words, most of us don't think in probabilities. It's natural to quickly seek one answer and commit to it. 

If you watch financial TV or read investing news, you will almost never hear someone say there's a 55% chance of a recession this year. They say there is going to be a recession this year. Rarely does an analyst say there's a 60% chance of a bear market this year. They say there is going to be a bear market this year. There's no room for error. There are no probabilities. People want exact answers, and pundits are happy to oblige.

Consumers of financial news are part of the problem. Not knowing what the future holds is scary, and you don't gain much confidence hearing someone say there's a 60% chance of one outcome and a 40% chance of another. We are more likely to listen to a forecaster who insists, with unwavering confidence, that they know the future. It's like warm milk for our fears.

But thinking in certainties is usually a reflection of how you want the world to work, rather than how it actually works. Silver writes:

Acknowledging the real-world uncertainty in [pundits'] forecasts would require them to acknowledge the imperfections in their theories about how the world was supposed to behave -- the last thing that an ideologue wants to do.

If you have a view of the world that says raising taxes will slow the economy, no amount of information will change your mind. You won't even tolerate a claim that there's an 80% chance raising taxes will slow the economy, because the remaining 20% leaves open the possibility that your entire world view about taxes could be wrong.

One of the top reasons investors make mistakes is that the world works in probabilities, but people want to think in certainties. It's why bear markets surprise people, banks use too much leverage, budget forecasts are always wrong, and most pundits make themselves look like idiots. 

As soon as you start thinking in probabilities, all kinds of things change. You'll prepare for risks you disregarded before. You'll listen to people you disagreed with before. You won't be surprised when a recession or a bear market that no one predicted occurs. All of this makes you better at navigating the future -- which is the point of forecasting in the first place.

Here's Silver again:

The more eagerly we commit to scrutinizing and testing our theories, the more readily we accept that our knowledge of the world is uncertain, the more willingly we acknowledge that perfect prediction is impossible, the less we will live in fear of our failures, and the more liberty we will have to let our minds flow freely. By knowing more about what we don't know, we may get a few more predictions right.
