Combating biases in IT consulting

Consultants should attempt to reduce the influence these biases (among others) might have on clients: mere exposure effect, hyperbolic discounting, and normalcy bias.

When helping clients make decisions about their plans for the future, we need to be aware of biases -- both our clients' and our own. It's impossible to escape their effects entirely (we're only human, after all), but if we don't make our best attempt to counter their influence, we won't be guiding our clients toward the best possible outcome.

Wikipedia lists more than 40 types of bias related to decision-making and behavior. I'll discuss only a few of them here and save the rest for another time.

Mere exposure effect. People are biased toward the familiar, and against the unfamiliar. "Go with what you know" is a maxim in support of this bias. I often encounter it in clients' choices of operating systems and programming languages; the justification usually boils down to "this is what we do, and we're sticking with it." Even when research demonstrates a net benefit from changing to a new product, service, or vendor, most people will prefer to stay with the one they have. In IT, familiarity is a benefit in itself, because of the cost of learning a new system and developing new procedures, as well as the risk that a new alternative may not fulfill all expectations. So it can be difficult to separate that valid assessment from the bias toward the familiar simply because the unfamiliar seems threatening at a visceral, "will it kill me?" level. More detailed research, along with prototyping, can often render a new system more familiar and thus reduce the bias.

Hyperbolic discounting. A risk or benefit seems much less important the further into the future the payoff will be realized or the cost incurred. This seems natural, because a lot can change as time goes by, and perhaps the risk or benefit will never materialize. However, the rate at which people apply this discount follows a hyperbolic curve, which indicates a bias toward the near term. For example, given a choice between a 5% gain immediately and a 10% gain a year from now, most people will take the immediate 5%. But if the choice is between five years for 5% and six years for 10%, they'll choose the six-year plan. The trade-off is identical (a 5% advantage gained by adding a year to the plan), but the closer that extra year is to the present, the more it weighs on the decision. The difference between five and six years seems much smaller than the difference between now and a year from now, and an immediate gain of any size seems more enticing than anything you have to pursue with patience. That could be rational in a highly uncertain environment, but if the relative likelihood of the outcomes is the same, it makes more sense to pursue the greater gain in the long run. I've often seen this bias operate in deciding which features to include in a product: those we can sell right now, or those that will enable us to grow in the future. Many times I have heard, "Heck, we'll all be retired or working somewhere else by then."
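The preference reversal described above can be sketched numerically. A common formulation of hyperbolic discounting is Mazur's model, V = A / (1 + kD), where A is the payoff, D the delay, and k a discount rate; the value k = 1.5 below is an arbitrary assumption chosen only to make the reversal visible, not a claim about real clients.

```python
def hyperbolic_value(amount, delay_years, k=1.5):
    """Perceived present value of `amount` received after `delay_years`,
    under Mazur's hyperbolic model V = A / (1 + k * D)."""
    return amount / (1 + k * delay_years)

# Near-term choice: 5% now vs. 10% a year from now.
print(hyperbolic_value(5, 0))    # 5.0 -> the immediate 5% feels bigger
print(hyperbolic_value(10, 1))   # 4.0

# Far-term choice: 5% in five years vs. 10% in six.
print(hyperbolic_value(5, 5))    # ~0.59
print(hyperbolic_value(10, 6))   # 1.0 -> now the 10% plan feels bigger
```

Note that an exponential discounter (V = A * e^(-kD)) applying the same rate to both choices would never flip its preference this way; the reversal is the signature of the hyperbolic curve.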

Normalcy bias. When you tell your client that they need to take steps to avoid an impending problem, and they respond "But in all this time, that hasn't happened yet" -- you're seeing normalcy bias in action. People erroneously treat the fact that something has never occurred as evidence that it never will. Those who present reasoned arguments for why it might happen anyway get dismissed as "Chicken Littles." It's simply too scary and bothersome to account for eventualities that aren't a normal part of their experience. Thus, it can be tough to sell a client on hardening their security if they've never been cracked -- even if that's only because no competent cracker has ever made the attempt. After a breach, it will suddenly become a top priority.

Have you experienced these and other biases operating in the minds of your clients or yourself? Share your stories in the discussion.