Fix the Six

We are intuitive, emotional and partial beings – we are human.

E. Kutsch, M. Hall

Biases are unconscious, automatic processes that help us make decisions more quickly and efficiently. They result from our brain's efforts to simplify the complex world in which we live, work and operate.

However, biases are unconscious errors in thinking. They arise from problems with our mental processes, such as memory, attention, social attribution, miscalculations and other mental mistakes and shortcuts. They are like “glitches” in our thinking that sometimes cause us to make questionable, inadequate decisions and reach erroneous conclusions – not the efficient, correct decisions we are aiming for!

Biases actually helped us evolve into what we are now. But, as the Center for Applied Rationality puts it, despite thousands of years of evolution, “… human intelligence itself remains demonstrably imperfect and largely mysterious”.

Biases are studied in psychology, sociology, risk management and behavioural economics. They have been researched extensively since the 1960s, and the subject reached a wider audience thanks to the work of Nobel laureate Daniel Kahneman and his popular book “Thinking, Fast and Slow”.

Project management is a profession centred on decision-making. As Kutsch and Hall point out, in project management biases affect the “ability of noticing, interpreting, preparing, containing and recovering”. So, leaving aside for a moment the controversies over how biases are classified, how they arise and whether they are useful, we should consider how they could impact project management. More importantly, we should consider how we can improve our awareness and try to mitigate those biases.

Bias? Which bias??


Cognitive biases are well researched, but they are a very complex subject. Apparently we now have… 188 cognitive biases! They have been visualised in this graphic. A recent publication by the World Economic Forum focused attention on 24 of those biases, the ones the WEF believes we should treat as a priority. I have also found an interesting paper that focuses specifically on ten biases particularly relevant to project management.

Given limited time and resources, pressure from peers and society, and the uncertainties of the world around us, people are forced to rely on heuristics, or quick mental shortcuts, to help with their decisions. So we are all biased… When I tried to assess myself against that long list of biases, I was surprised to see how many of them actually applied to me. “Where should I start to get them fixed?”

Six Biases We Should Pay Attention To…

I found an insightful and useful LinkedIn course by Drew Boyd. Although not specific to project management, the course focuses on six biases and provides some hints on how to mitigate them. It is a good place to start. The biases are listed below, in no particular order of importance or personal preference.

base rate bias


This is a type of fallacy in which people tend to ignore the “base rate” (i.e., the general prevalence) in favour of the individuating information (i.e., information pertaining only to a specific case, to a small sample or to a specific set of circumstances).

The base rate bias can lead us to make inaccurate probability judgments in many aspects of our lives. For example, it can cause us to jump to conclusions about people based on our initial impressions of them. This bias recurs naturally in project management, where we are prone to jumping to conclusions about new or old project team members, contractors, consultants and so on.
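To make the base-rate point concrete, here is a small sketch in Python with entirely hypothetical figures: suppose only 1% of project tasks slip badly, and an early-warning sign flags 90% of slipping tasks but also fires on 10% of healthy ones. Bayes’ theorem shows how weak the warning really is once the base rate is taken into account.

```python
# Base-rate neglect, made concrete with Bayes' theorem.
# All figures are hypothetical, for illustration only.
base_rate = 0.01    # P(slip): 1% of tasks slip badly
hit_rate = 0.90     # P(warning | slip)
false_alarm = 0.10  # P(warning | no slip)

# Total probability of seeing the warning sign at all
p_warning = hit_rate * base_rate + false_alarm * (1 - base_rate)

# Bayes' theorem: probability a flagged task really slips
p_slip_given_warning = hit_rate * base_rate / p_warning

print(f"P(slip | warning) = {p_slip_given_warning:.1%}")  # -> 8.3%
```

Despite the 90% hit rate, a flagged task slips with only about 8% probability, because healthy tasks vastly outnumber slipping ones. Ignoring the 1% base rate is exactly the jump-to-conclusions error described above.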

What’s more, as Flyvbjerg points out in his article, the “story” of the project will have a huge influence, especially on a big, long-running project or programme. Its legacy will bias our assessment of the probability of events during the project. We will also ignore the natural variance in things and events, focus on bad outcomes with small likelihoods, and so on.


To avoid this fallacy, the advice is to pay more attention to the base rate information available to us, and to recognise (and remind ourselves) that “personality and past behaviours are not as reliable predictors of future behaviour as we think they are”. It is important to take a wider view, or an outside perspective, on the subject or event. This requires us to put in more time and effort when assessing the probability that a given event might occur.

To me, this does not sound so easy, as in projects time pressure is always there. Under pressure, it feels natural to fall back on automatic processes. Addressing this bias is going to be tricky.

confirmation bias


This bias is the tendency to search for, interpret, favour and recall information in a way that confirms (or supports) one’s prior beliefs, ideas, decisions or values. A manifestation of this bias is when people select information that supports their views, ignoring contrary information (think about politics!), or when they interpret ambiguous evidence as supporting their existing attitude, view or position.

This is typical of me – I tend to overweight the good news and underweight the bad news. In some cases, I look for support to justify a decision I have already taken and in which I might firmly believe. If I intend to reject an option, I look for evidence to support its rejection.


One interesting piece of advice is to list all the reasons that support our case, then try to challenge each one as being wrong. We should be pessimistic and challenge our certainties. We should also seek help from colleagues and peers. De Bono’s “six thinking hats” technique comes to mind: together with colleagues or peers, look at the decision or the problem in six different ways – the Analytical, the Emotional, the Sceptical, the Optimistic, the Structured and the Creative.

The problem I see here is related to the time allowed for this. As for the previous bias, we are persistently under time pressure. In addition, in a world where information and data are so abundant and in which we are not short of alleged experts, opinion leaders and gurus, how can we ensure we select information in an unbiased manner??

availability bias


This fallacy leads us to rely on the immediate evidence or examples that come to mind when evaluating a specific topic, concept or decision. The bias operates on the notion that “if something can be recalled, it must be important” (or at least more important than alternatives that are not as readily recalled). In other words, we overweight whatever comes to mind and underestimate what we do not readily know.


As pointed out by Flyvbjerg, people tend to assume “what you see is all there is”, called WYSIATI by Kahneman: people construct a coherent trajectory or a logical chain of reasoning based on what is available to them. According to Kahneman, under the influence of WYSIATI, people tend to suppress doubt and ambiguity and fail to allow for missing evidence. The human brain has evolved to infer patterns and generate meaning from loose, or even non-existent, evidence – this is intended to save us energy and time. It may not be a big issue most of the time, and may even be effective on average (in evolutionary terms). But for the big, consequential decisions that are typical of project management, it is not a wise strategy.

The advice here is to think hard about why we are making certain decisions. Give time to research: look for reasons contrary to what we have decided, seek opposite views, ask for opinions and be critical of the data made available (by the web or by the experts). It is about trying to “downplay what someone already knows about a situation”.

hindsight bias


Hindsight bias is the common tendency to perceive past events as having been more predictable than they actually were. After an event, people often believe that they knew its outcome before it happened. This can result in an oversimplification of cause and effect, and it can feed the overconfidence bias.

I think it is a dangerous tendency in the workplace, and especially in project management. Hindsight bias can cause memory distortion, and I must say I am particularly affected by it. If something happens the way I thought it would, I revise my memory of what I was thinking right before the event, rewriting history and the event’s probability. Worse, I tend to use it to make future decisions.


The advice seems to be that we should make decisions based on what the data says is likely to happen (i.e. not on what we think is going to happen!). In such a volatile and unpredictable world, it is important to keep ourselves well grounded. If we make a prediction and that prediction comes true, we should not revise the odds (remember to consider the natural variance in things and events!). The probabilities may not have changed.

In a project, it is important to plan well, build in good contingencies, and use current data and informed expert advice.

overconfidence bias

This is a well-established bias and very relevant to project management.


This bias manifests as a person’s subjective confidence in their judgments being reliably greater than the objective accuracy of those judgments (especially when confidence is high). Flyvbjerg uses the more general concept of overconfidence as an “illusion of certainty”. According to him, we overestimate how much we understand and underestimate the role of chance events and of gaps in our knowledge (i.e. what is not readily available to us). Overconfidence bias is found in both laypeople and experts, including project planners and project managers.

Project managers are typically strategic optimists, so they are likely to fall into the trap set by overconfidence. I found a paper that seems to focus specifically on this. According to this research, overconfidence reduces risk awareness among project managers. It leads them to assess risks more optimistically and to reach more positive conclusions about anticipated project success. When judging project success, project managers consider only the probability of a risk occurring; they do not factor in the impact those risks would have on project success, should they arise. Risks thus seem to be insufficiently reflected in predictions of project success.
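The probability-versus-impact point can be illustrated with a toy example in Python (all figures hypothetical): ranking risks by probability alone can be the exact opposite of ranking them by expected impact.

```python
# Ranking risks by probability alone vs by expected impact.
# Figures are hypothetical, for illustration only.
risks = {
    "minor supplier delay": {"prob": 0.40, "impact": 10_000},
    "key contractor insolvency": {"prob": 0.05, "impact": 500_000},
}

for name, r in risks.items():
    expected_loss = r["prob"] * r["impact"]
    print(f"{name}: expected loss = {expected_loss:,.0f}")

# minor supplier delay: expected loss = 4,000
# key contractor insolvency: expected loss = 25,000
```

The delay is eight times more probable, yet the insolvency carries over six times the expected loss. Judging by probability alone, as the research suggests project managers tend to do, gets the priorities backwards.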

I would say that project planners are bound to be less affected – what do you think?


What is the advice here? We should always remember that past success is no guarantee of future success. It is also important to recheck the facts about a situation. It would be hard for me, but the advice is to try to ‘suspend’ the initial judgement and check the validity of our assumptions. A final recommendation is to take time, look for additional views and inputs, and think about the consequences of the decision taken – hard to do when there are pressing timelines! Project managers should certainly be confident, but not… overconfident.

sunk cost bias


This fallacy describes the tendency to follow through on a project or endeavour once time, effort or money has already been invested in it, even when current costs outweigh the benefits.

It seems that we, as humans, fail to accept that whatever time, effort or money we have already expended will not be recovered. We end up making decisions based on past costs instead of present and future costs and benefits, which are the only ones that should rationally make a difference.
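The “only future costs and benefits count” rule can be sketched in a few lines of Python (hypothetical figures): notice that the amount already spent appears nowhere in the rational comparison.

```python
# Sunk-cost-free decision: only future costs and benefits matter.
# Figures are hypothetical, for illustration only.
sunk = 800_000                  # already spent; unrecoverable either way
cost_to_finish = 400_000        # additional spend needed to complete
benefit_if_finished = 300_000   # expected value of the finished deliverable

# Note that `sunk` does not enter the comparison at all.
net_future_value = benefit_if_finished - cost_to_finish
decision = "continue" if net_future_value > 0 else "stop"

print(decision)  # -> stop (finishing costs 400k to gain only 300k)
```

The sunk cost fallacy is the urge to let the 800k already spent tip the decision towards “continue”, even though it is lost in either scenario.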

This is a recurrent threat in project management: when facing the dilemma of whether to terminate a struggling project, we repeatedly treat what we have already invested as a driver for the decision. I have witnessed this bias in full display in many projects…


So how can we avoid the sunk cost fallacy? We should treat the decision to continue the project as a brand new decision. Why are we doing this project? Are the initial assumptions about the project’s benefits or its expected impact still valid? What are our revised assumptions going forward, and do they still hold? The point is that unless continuing can effectively change the outcome, we should stop the project. We should certainly look at the opportunity costs and consider what we are giving up by not stopping the project. It is also important to seek opinions from those not “emotionally” involved in the project.

Project management would be easy if it wasn’t for the people….

anonymous

Project management relies heavily on decision-making, so we should think and talk more about cognitive fallacies, those mental shortcuts that too often influence and drive our decisions. The six biases presented here may be a good starting point for understanding our own cognitive traps. Increasing awareness among project management professionals is critical. However, addressing these biases is a very personal journey: each professional (project manager, project planner or project team member) should look at their own circumstances first, while also staying vigilant for manifestations of these fallacies in others’ contributions to the project.

Addressing our personal fallacies is only part of the process. Organisations bear a huge responsibility for setting a suitable environment. This is where I will focus in a future article on this blog. Stay tuned!

Marco Bottacini, Senior Portfolio Manager at GALVmed

The views and opinions expressed in this blog are those of the author and do not necessarily reflect the views and opinion of GALVmed.

