We all think we have a pretty good idea of our strengths and weaknesses. Over the long term, we think we have a general idea of where we are headed in our careers and aspirations. In the medium term, we think we have a pretty good handle on the current and future state of our projects, and so on. We also believe ourselves to be rational beings who make decisions based on the best information available. How much of that is really true?
If the science of psychology has anything to say about it, not much. At least not for the majority of us. The Freudian school considers the human psyche to consist of three parts – the id, the ego, and the superego – each battling for control. This article should help prepare you to listen to the separate voices arguing in your head, and to be better prepared to lead the discussion rather than only hearing from the victor. There are too many cognitive biases to cover here, so here is a link to the lot: http://rationalwiki.org/wiki/List_of_cognitive_biases
Let’s pick one of the more insidious ones to look at in detail: confirmation bias. This is when we employ Blink-style intuitive methods to arrive at a decision, but because the decision-making process was short-circuited (implicit), we don’t know whether it was scientifically valid or whether it took all the available information into account. That is fine when you are in a dark alley and a guy wants you to get into a van because he has a TV to sell you (low reward, immediate response needed). But when you are making longer-term decisions, you need to be explicit about how you arrived at them. Yet we aren’t. How many of us write down pros and cons in a spreadsheet when we buy a used car or, even if we do, follow a point system that covers attributes such as maintenance and resale value, instead of being swayed by the paint job or a sales pitch?

In a work environment, things are required to be a little more explicit and detailed, so we do employ project management tools and status reports. But how often do we find ourselves underestimating tasks when the project is behind schedule and we’re hoping to catch up? Or arguing in a meeting primarily because we don’t like the person we are arguing against? These are instances of bias where we have already made up our minds and are simply rationalizing that decision with our best dialectical skills. They may seem innocuous, but once you have stated an opinion in public, fears arise: of being proven wrong, or of being seen as weak (due to vacillation), and we are soon painted into a corner.
Confirmation bias is especially disastrous when it affects decision makers with little external accountability, such as startup founders. We make decisions based on our own past experience and what we hope the outcome will be. Hope is a powerful thing, giving founders the courage to make the big leaps of faith needed to launch any enterprise. But, as is so often the case, our greatest strengths are also our most dangerous weaknesses. That same leap of faith is invoked when doubling down on losing bets (insert favorite founder blunder here). At one of our meetups, we played a game (albeit a slightly contrived one) to illustrate this point to our members. It was quite revealing how it played out exactly to script (as below).
So what can we do? Well, the first step is to never rule out the possibility that you might be wrong. Unchecked, though, that could lead to self-doubt and analysis paralysis. An alternative is the scientific method, or its popular current exponent, the Lean Startup approach. By clearly listing our assumptions, experiments, and success criteria before starting out, we can prevent rationalization and “moving the goalposts” later on. The idea is not to negate your human strengths of pattern recognition, creativity, and applied experience, but to have a scientific approach that balances them against the human frailties of overconfidence, the need to be right, and other biases.
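To make that concrete, here is a minimal sketch in Python of what writing down an assumption, an experiment, and a success criterion up front might look like. The names and numbers are purely hypothetical and not taken from any particular framework; the point is simply that the threshold is committed to before the result comes in, so the verdict cannot be rationalized afterwards.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the criteria cannot be quietly edited after the fact
class Experiment:
    assumption: str            # the belief we are testing
    test: str                  # how we will test it
    metric: str                # what we will measure
    success_threshold: float   # decided *before* the experiment runs

    def evaluate(self, observed: float) -> str:
        # The verdict comes from the pre-committed threshold,
        # not from how we feel about the result afterwards.
        verdict = "validated" if observed >= self.success_threshold else "invalidated"
        return f"{self.assumption}: {self.metric} = {observed} -> {verdict}"

# Hypothetical example: write this down before launching the landing page.
exp = Experiment(
    assumption="Freelancers will pay for automated invoicing",
    test="Landing page with a 'Buy now' button",
    metric="visitor-to-signup conversion rate",
    success_threshold=0.05,
)

# After the experiment, plug in the observed number and accept the verdict.
print(exp.evaluate(0.02))
# Freelancers will pay for automated invoicing: visitor-to-signup conversion rate = 0.02 -> invalidated
```

A spreadsheet or a page in a notebook works just as well; what matters is that the success criteria exist, in writing, before the data does.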
Before you lead others, you must lead yourself. So make sure your insecurities and weaknesses aren’t leading you. Good luck!