Tim Schmeising-Barnes shares some valuable insights into our inherent biases during the decision-making process, so we can spot what we are doing before it’s too late.
With the Coronavirus pandemic, I got to spend more time with a couple of MSc Psychology students (my wife and daughter) and this is what happened!
I used to think of risk management as a logical process but experience has taught me that when it comes to evaluating and prioritising risks there is a very significant level of subjectivity which can have a big impact on the outcomes. I now understand a little more about the very real ways that humans react illogically.
Here are some of the key psychological factors that cause unconscious bias when we try to evaluate risks objectively. Look out for these when reviewing your risks.
Your mood influences your ability to objectively evaluate anything! When it comes to risks, this could mean that you overestimate the impact of a risk on a bad day or (more likely) underestimate the impact of a risk on a good day. The key problem is that on different days you will apply different risk factors and will focus effort on the wrong areas.
Solution: be prepared to review previous evaluations of risk to see if the original analysis still makes sense.
It has been demonstrated that people will prefer to take a higher risk to avoid a loss and a lower risk to preserve a gain, even when the numbers don’t make sense. The net result is that you can respond to risks with the wrong reasoning.
Solution: if stakeholders or team members seem keen to prioritise particular risks, check whether they have a heightened fear of the loss involved. It may be worth looking at the way the risk is described to ensure that it doesn’t invoke the wrong response.
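To see how the numbers can fail to make sense, here is a minimal sketch of the classic loss-aversion framing. All figures are hypothetical, purely for illustration: a certain loss is compared against a gamble whose expected loss is actually worse.

```python
def expected_value(outcomes):
    """Sum of value * probability over (value, probability) pairs."""
    return sum(value * prob for value, prob in outcomes)

# Hypothetical figures: a certain loss versus a gamble to avoid loss.
sure_loss = expected_value([(-50, 1.0)])             # lose £50 for certain
gamble    = expected_value([(-110, 0.5), (0, 0.5)])  # 50% chance of losing £110

print(sure_loss)  # -50.0
print(gamble)     # -55.0
# Logically the sure loss is cheaper, yet loss-averse stakeholders often
# prefer the gamble because it offers a chance of losing nothing at all.
```

The arithmetic says take the £50 hit, but the chance of escaping the loss entirely makes the gamble feel more attractive, which is exactly the wrong reasoning the text describes.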
Life is too short to fully consider everything, and sometimes we shortcut the process by considering only recent experience. This makes decisions and actions quicker, but less valid.
Solution: you can probably feel when this is happening, as you will be skipping over risks (perhaps when short of time in the risk review). Note these risks and be prepared to review them again another time to check that they really are as straightforward as originally thought.
If something has never happened before, we are much less likely to accept that it will happen. Black Swans were assumed to not exist because no-one had seen one but that didn’t seem to bother the swans. We see Black Swan events all the time and are surprised by them. The Financial Crash was one, Coronavirus another. In hindsight these were perfectly predictable events that we should have been expecting (at some point).
Solution: look for Black Swans. To find them, consider implausible events that would have a big impact, and be prepared to question whether they are as implausible as they first seem.
In a logical world, you gather the facts and prepare a response that addresses them. All too often, however, we skip to the solution and then pick out the facts that support it.
Solution: if the risk description isn’t clear, there is a good chance that the action is inappropriate. Ask yourself what would happen if you took an alternative action; if you are not sure about the answer, it’s possible that the original solution was not the best response to the risk.
Humans are generally optimistic which enables us to confront difficult situations. However, this can also cause us to avoid risks where we don’t want to face them. “It will be alright on the night”. This will cause us to underestimate the impact (or overestimate our ability to cope with the risk should it materialise).
Solution: put on the Black Hat and ask yourself if you are being realistic!
This is about the apparent priority of now over later: typically, the gain of doing nothing now compared to the future value of doing something. It makes people reluctant to spend now to respond to a risk that is a long way off, even if the probability-adjusted cost of the risk is higher than the cost of the mitigating action.
Solution: for all of your risks, ask the question “what if we deferred taking action for two weeks?”. If the answer is positive then that’s fine; if not, get on with the right response actions.
Note: because projects are uncertain, there is a valid need to avoid activity or cost on something that may drop out of scope. For this reason, it’s useful to have two risk proximity charts.
The first shows the proximity of the IMPACT, and the second shows the proximity of when action is needed. If action is not yet needed, you can legitimately defer it and potentially save effort (so long as you don’t take your eye off the ball when action is needed).
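The probability-adjusted comparison above can be sketched in a few lines. The figures are hypothetical, chosen only to show the shape of the calculation, not taken from any real project.

```python
# Hypothetical numbers: compare the probability-adjusted cost of a risk
# against the cost of mitigating it now.
probability     = 0.4      # assumed chance the risk materialises
impact_cost     = 100_000  # assumed cost if it does
mitigation_cost = 25_000   # assumed cost of acting now

expected_risk_cost = probability * impact_cost
print(expected_risk_cost)  # 40000.0

# Rationally, spending £25k now beats an expected £40k loss later,
# yet present bias still pushes us to defer the spend.
if mitigation_cost < expected_risk_cost:
    print("Mitigate now")
```

Even when this sum clearly favours acting, the bias makes the immediate saving of doing nothing feel more valuable than the discounted future cost.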
Subconsciously, we will put more emphasis on preserving something we have invested in rather than preserving something because it has value.
Solution: Napoleon’s maxim “Don’t reinforce failure” is a great one to keep in mind on any project, as there will undoubtedly be wrong turnings and wasted cost/effort. Ask yourself if you are protecting delivered scope for the wrong reasons!
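The sunk-cost trap can be made concrete with a small worked comparison. The figures are hypothetical: the point is that the money already spent should play no part in the go-forward decision.

```python
# Hypothetical figures illustrating the sunk-cost fallacy.
sunk_cost         = 80_000  # already spent (irrelevant to the decision)
cost_to_finish    = 30_000  # further spend needed to complete
value_if_finished = 20_000  # value delivered if completed

# A rational comparison ignores sunk_cost entirely:
go_forward_net = value_if_finished - cost_to_finish
decision = "stop" if go_forward_net < 0 else "continue"

print(go_forward_net)  # -10000
print(decision)        # stop
# Sunk-cost bias makes the £80k already invested loom large,
# pushing teams to spend £30k more to recover only £20k of value.
```

Note that `sunk_cost` never appears in the calculation; the moment it starts influencing the answer, you are reinforcing failure.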
These responses do serve a valid purpose in life as they help make quick decisions by avoiding being distracted by the need to consider all consequences.
Sometimes, a bad decision (or not completely good decision) is better than not making a decision! However, when there is a danger that we don’t properly evaluate POTENTIAL events, bias can be very dangerous.
The key takeaway is to always keep in mind that logical, objective evaluations may be unconsciously biased!
PM3 is Bestoutcome’s PPM tool, which supports risk management processes in projects, programmes and portfolios. PM3 helps manage risks within projects but also supports pre-project risk assessments, including specific risk assessments, e.g. Equality Risk Assessments and Data Protection Risk Assessments.