People crave certainty like they crave food and water, and will go to almost any lengths to create certainty where none exists. Harvard Business Review puts it this way: “Of all the headwinds we face as decision makers, the power of one overshadows all others: our need for certainty. It is typically more important for us to feel right, than to be right — a difference that didn’t matter much in the lives of our ancestors, but now matters a lot.” And it explains it as follows: “The lockdown of our minds serves an important purpose: Generations of our ancestors wouldn’t have survived had they constantly second guessed their conclusions. In a harsh environment characterised by straightforward challenges that demanded quick responses, an indecisive caveman was a dead one.”
And it then comes to this conclusion:
“Complex decision making requires we defer the feeling of being right, by tolerating the tension of not knowing.”
I have warned repeatedly about embracing research as a panacea. I have warned about fads and jumping on bandwagons, such as the many people who embraced ‘Neuromarketing’ after reading one popular book.
I am not alone in thinking that people who claim to know the answer (and few are more certain than scientists) really don’t know anything:
NN Taleb points out that a turkey will have growing confidence in its master’s desire to care well for it; until the master comes visiting with a big knife on Christmas Eve. The point being that risk is not a linear process. (Just because spreadsheets make it easy to extend rows of numbers doesn’t mean the numbers have any value.)
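The turkey problem can be sketched in a few lines of code. This is only an illustrative toy (the day counts and values are my own assumptions, not from Taleb): a smooth history makes the naive “extend the rows” forecast look utterly safe, and says nothing about the abrupt break that actually arrives.

```python
# Toy illustration of the "turkey problem": extrapolating from a smooth
# history tells you nothing about a structural break in the process.
# All numbers here are arbitrary assumptions for illustration.

history = [1.0] * 1000                      # 1000 days of being fed: well-being +1 each day
avg = sum(history) / len(history)           # the history is perfectly smooth

# The turkey's "spreadsheet" forecast for day 1001: just extend the rows.
forecast_day_1001 = avg                     # predicts another good day (1.0)

# What actually happens on Christmas Eve: a break far outside all prior variation.
actual_day_1001 = -1000.0

error = abs(forecast_day_1001 - actual_day_1001)
print(error)                                # the forecast error dwarfs the entire history
```

The point of the sketch is that no amount of rows in the spreadsheet would have warned the turkey; the risk was never visible in the data being extrapolated.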
Shane Parrish wrote a few interesting observations about The Dangers of Certainty.
Andy Grove (ex-Intel Chairman) published his take on corporate management and strategy, putting constant paranoia on a pedestal; in effect, he weaves uncertainty into the fabric of the organisational culture, because fear is nothing but uncertainty.
If you are perfectly confident in your answer, you won’t listen and you won’t hear the warning signs that you are wrong.
The HBR academics don’t address how we can go about fighting this basic physiological response, but here is a little mental checklist I have learned to apply in decision making:
Is what I think a true fact, or a disguised opinion?
What is the opposite of what I think, and why is that not true?
If this is so self-evident, why isn’t everyone doing it?
Am I simply extrapolating like a turkey?
Naturally, no one will actually run through a mental checklist; but this type of response in any decision situation becomes a mindset and a way of looking at things. Initially it may be acquired by being more conscious about the process, until we become adept at distinguishing between what we know for certain and what we want to know.
I am not advocating analysis paralysis; on the contrary, I am proposing that executives become more prone to action by recognising the fuzzy comfort of perceived certainty for what it is. That is exactly why movements like ‘lean thinking’ and ‘agile development’ came to prominence.
It is really all about fine-tuning your bullsh*t detector, and being honest enough to aim it at your own conceptions and perceptions as much as at other people’s.
You will be a better decision maker if you do this: reject the pursuit of certainty as a noxious weed growing in your garden of innovation.
And few organisations can afford to be led by leaders who act as if they are completely certain about everything, because certainty is a lens through which we view a world that does not exist.
Food for thought. Please share yours…
Dennis
PS: This LinkedIn post elaborates a bit more on chaos theory.