Conservatism Bias: How to know what new information to focus on



Traditional finance famously makes a lot of incorrect assumptions about how humans make decisions. It assumes that we are fully rational and that we process infinite information almost instantly.

There’s also a whole wing of behavioral finance that tells us that our brains are simply not up to the task of the modern financial world so we should just quit. We should just admit that we are unfit to manage our affairs and submit wholesale to a passive mentality.

The study of behavioral finance shouldn’t be relegated to the extremes of either view. We demonstrate on a daily basis that we are quite suited to modern life, but we also see bad financial decisions on a pretty consistent basis. The point is to understand the circumstances that get us into trouble, optimize decision-making when we can, and avoid those situations when we can’t.

This post is the first in a multi-part series examining different behavioral biases. For each bias, we will break it apart so you can identify it and either correct or avoid it.

This series is as much for our benefit as it is yours. Hope you enjoy and get something out of it.

Bias Class: Belief Perseverance

There is a whole class of biases that can be lumped into the category of belief perseverance (in the face of conflicting evidence). Some people refer to this as “cognitive dissonance.” When evidence changes in relation to an investment decision, we have a choice in how we act. Most people would rather cling to their original beliefs than admit they were wrong. So they resolve the dissonance by cherry-picking which sources or information they take in (aka burying their head in the sand) when conflicting information comes their way. Or they tilt the weights of the information so that anything that doesn’t conform to their view can be dismissed or minimized. The phrases “nobody cares about [insert important development]” or “[important development] is already priced into the market” come to mind.

Conservatism Bias

Conservatism bias has nothing to do with politics. It occurs when people maintain their prior view without properly incorporating new information. You consider your original opinion, and the information that formed it, to be quite meaningful, but information learned after the opinion has been formed is treated as less important. The consequence is that you likely took action on old information, but aren’t willing to take action on new information that may conflict with it. Reading that rationally, you can see how ridiculous it sounds.
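One way to picture this under-reaction is to compare a rational belief update (Bayes’ rule) with a “sticky” one that blends back toward the prior. The sketch below is illustrative only: the `stickiness` parameter and all the probabilities are assumptions made up for this example, not anything from the post.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Rational posterior probability that a thesis is true, via Bayes' rule."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

def conservative_update(prior, likelihood_if_true, likelihood_if_false, stickiness=0.7):
    """Under-react by blending the rational posterior back toward the prior.

    stickiness = 0 reproduces the rational update; stickiness = 1 ignores
    the new evidence entirely (pure belief perseverance).
    """
    posterior = bayes_update(prior, likelihood_if_true, likelihood_if_false)
    return stickiness * prior + (1 - stickiness) * posterior

# You hold a thesis with 80% confidence, then see evidence that is three
# times as likely if the thesis is wrong (0.6) as if it is right (0.2).
prior = 0.80
print(f"rational posterior:     {bayes_update(prior, 0.2, 0.6):.2f}")      # 0.57
print(f"conservative posterior: {conservative_update(prior, 0.2, 0.6):.2f}")  # 0.73
```

A rational investor would cut their confidence to about 57%, while the conservative one only drifts down to about 73%, still acting largely on the old information.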

Practical implications: People are more likely to be slow at taking action on new information than they rationally should be. Why is this? First, information can be hard to process. Second, truly bad information is sometimes difficult to come to grips with, especially if you’ve publicly shared your thesis. It means you were wrong, and that doesn’t feel good at all.

The easy thing psychologically is to see the new information but subtly dismiss it as not important enough to change your original thesis. The hard thing to do is figure out whether the information is important and whether you need to take action, which might mean changing course and admitting you were wrong.

Fun fact: Let’s say you’re faced with two new pieces of information about the same company. The first piece is easy to understand and explain, such as a change to the company’s product, a major consumer good that you know well. The other is an accounting change in the way it tracks inventory. 99% of people will subconsciously weight the information that is easy to understand as more important.

This phenomenon of weighting easy-to-process info as more important is directly responsible for the problem we see on cable news shows. The only things people want to discuss are things that can be explained easily in a narrative format. The audience doesn’t want complex issues without a clear answer thrust in their face. Spreadsheet screen grabs are notorious ratings killers. Narrative information is much easier to rationalize into a supporting argument.

But we’re living in a world where there is a constant flow of information spewed at us. How is anyone expected to know if what we’re hearing or reading is useless noise or critically important?

There’s a very simple rule for overcoming the problem of not knowing how to weight information. Ready for it?

Information weighting shortcut rule: If the new information involves math, or is difficult to uncover, verify, or explain, ASSUME its weighting of importance is higher than that of new information you can easily understand.

A heavier weighting does not mean that you instantly act, or that you assume the information is negative before you analyze it. It means the information deserves to be looked at carefully. This is especially true when other new information, released around the same time, is universally considered positive and has the advantage of being easy to explain. The only purpose of this rule is to short-circuit the tendency we all have to underweight complex information that carries a high mental load and overweight simple narrative information.
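The shortcut rule above can be sketched as a tiny checklist score. The three flags and the scoring are assumptions invented for this illustration, not a formal model from the post; the point is only that harder-to-process information starts with a higher weight.

```python
def importance_weight(involves_math, hard_to_verify, hard_to_explain):
    """Toy score for a piece of new information per the shortcut rule.

    Easy narrative info gets the baseline weight of 1; each marker of
    processing difficulty bumps the weight up by one.
    """
    return 1 + sum([involves_math, hard_to_verify, hard_to_explain])

# The earlier example: an easy-to-explain product change vs. an
# inventory accounting change buried in the footnotes.
product_change = importance_weight(False, False, False)
accounting_change = importance_weight(True, True, True)
print(product_change, accounting_change)  # 1 4
```

The accounting change scores higher, which is exactly the opposite of how most of us instinctively weight the two, so it earns the careful look the rule prescribes.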

When you find yourself ignoring information because it is difficult to understand, you need to either take the time to figure it out or reach out to a professional who can explain what is happening.

Nobody can arrive at a perfectly optimal decision. The goal should be to make fewer bad decisions than everyone else.


Up next: Confirmation bias, representative bias, and illusion of control.

Disclaimer: Nothing on this site should ever be considered to be advice, research or an invitation to buy or sell any securities, please click here for a full disclaimer.

Dynamichedge Blog