“Risk” is a common word in conversations at the moment. It is also a familiar word in our daily lives as business continuity planners. But how does our working perception of risk differ from the way the word is used more widely? And should this affect the way we talk about risk? When we use the word risk, we often characterise it as a combination of likelihood and impact. If pushed for a more useful definition, we may describe it as uncertainty of outcomes, where the outcomes can be either good or bad. But most people see risk differently: risk is usually seen as wholly negative. In effect, “something bad may happen”.
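The “likelihood and impact” characterisation above is often reduced to a simple scoring matrix. The sketch below is purely illustrative: the 1–5 scales, the multiplication, and the band thresholds are assumptions for demonstration, not any particular standard, and the final comment shows one reason the single number can mislead.

```python
# Illustrative sketch of the "likelihood x impact" view of risk.
# The 1-5 scales and the banding thresholds are assumptions, not a standard;
# real frameworks define their own scales and cut-offs.

def risk_score(likelihood: int, impact: int) -> int:
    """Combine a 1-5 likelihood rating with a 1-5 impact rating."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("ratings must be on a 1-5 scale")
    return likelihood * impact

def risk_band(score: int) -> str:
    """Map a score (1-25) to a coarse band -- thresholds are arbitrary."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# A rare-but-severe event and a frequent-but-minor one can score the same,
# which is one reason a single number hides detail that matters to people.
print(risk_score(1, 5), risk_band(risk_score(1, 5)))  # 5 low
print(risk_score(5, 1), risk_band(risk_score(5, 1)))  # 5 low
```

Note how the two very different risks collapse to the same score and band; this is part of why a technical assessment can feel wrong to an audience that perceives those risks very differently.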
Certainly, a detailed consideration of risk isn’t usually something that most people do as a regular part of their lives. This makes communicating risk very difficult, particularly where senior people have to take business continuity decisions based on an assessment of risk presented to them by experts. To improve our clarity, we need to remember that risks presented in a business context may nevertheless be interpreted and analysed within this more general, everyday frame. Remembering this may even improve the way we think about risks ourselves.
Over the years, research has found that attitudes and responses to risk are quite complicated. A big division can exist between everyday risks that we are familiar with and those that are new or unexpected. Unknown risks can provoke unhelpful responses, particularly when the impacts could be very large. This is not necessarily a “wrong” way of thinking, but it is a different way of thinking about risk, and it can lead to problems, particularly when we insist on using our own preferred, technical language.
This can lead to poor decision making, where the resulting fear and uncertainty drive poor choices rather than effective behaviour change. Having to deal with these risks can also have longer-term consequences, such as mental health issues.
Furthermore, explaining risk is even more complicated in the modern world as many risks are compounded. Whereas the risk of a power cut caused by bad weather is reasonably easy to think about, the risk of one incident causing multiple consequences to cascade through a highly interconnected system is not. For example, what risks (if any) does the Internet of Things bring?
Generally speaking, our consideration of risk is heavily influenced by how we perceive the impact on ourselves, and by what we think should or shouldn’t be done about it. We are often more concerned by new risks that we have never encountered before than by risks that may be just as harmful or likely but with which we are more familiar. With more familiar risks, we are able to place the risk within a broader societal and personal context. We are also much better at handling risks over which we perceive we have some control (such as the risk of being involved in a car accident).
Understanding such responses is important when we try to prioritise risk, to ensure that any response is both appropriate and proportionate. This is often done by setting a “risk appetite”, but risk appetite can be difficult to use. For a start, individuals presented with the results of a risk analysis that has been done by someone else will usually have their own personal views about the likelihood and/or impact of a threat, and these will probably differ from the assessment.
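In its simplest mechanical form, a risk appetite acts as a threshold: risks scoring above it are treated, the rest are accepted. The sketch below is a hedged illustration of that idea only; the threshold, the example risks, and their scores are invented for demonstration, and real appetite statements are usually qualitative and set per category rather than as one number.

```python
# Hedged sketch: "risk appetite" treated as a simple score threshold.
# The threshold and the example risks below are illustrative assumptions.

APPETITE = 10  # cut-off on a 1-25 likelihood x impact score (arbitrary)

risks = {
    "power cut": 3 * 3,       # likelihood 3, impact 3 -> 9
    "cyber attack": 3 * 5,    # likelihood 3, impact 5 -> 15
    "minor IT fault": 4 * 1,  # likelihood 4, impact 1 -> 4
}

# Risks scoring above appetite are flagged for treatment; the rest are
# accepted -- and it is exactly these accepted risks that individuals may
# later dispute when their own view of likelihood or impact differs.
needs_treatment = {name: s for name, s in risks.items() if s > APPETITE}
print(needs_treatment)  # {'cyber attack': 15}
```

The accepted risks (“power cut”, “minor IT fault”) never reappear in reporting, which is one reason a previously agreed appetite can look indefensible once an accepted risk actually materialises.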
In addition, they may underestimate the true impacts. Assessing the impacts of an event before it happens is very difficult. It is easy to underplay the effects the impacts will actually have; it is equally possible to overplay the impacts where the risk is new and has not occurred before. Both behaviours mean that a decision that seemed sensible in terms of risk appetite during the assessment process may not look sensible once the incident occurs. This can lead to the “blame game” when a previously acceptable risk suddenly turns out not to be so.
There is also a tendency to assume that low-likelihood risks will simply never occur, because it is conceptually so difficult to imagine what an improbable incident would feel like when it does take place. What may be expressed as “very unlikely” – i.e. less than a one-in-ten chance – may be interpreted as “it will never happen”, particularly when the risk has never occurred before. Or it may have occurred so long ago that the societal memory of it has almost gone (think of public knowledge of the 1918/19 ‘flu pandemic).
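A little arithmetic shows why “very unlikely” is not “never”. The sketch below assumes independent years and a constant annual probability, both simplifications used only to illustrate the point: a one-in-ten annual chance accumulates quickly over a planning horizon.

```python
# Why a "1-in-10 per year" risk is far from "it will never happen".
# Assumes independent years and a constant annual probability -- both
# simplifications, used purely to illustrate the point in the text.

def chance_of_at_least_one(annual_prob: float, years: int) -> float:
    """Probability the event occurs at least once over the period."""
    return 1 - (1 - annual_prob) ** years

# An event with a 1-in-10 annual chance over a 20-year planning horizon:
p = chance_of_at_least_one(0.10, 20)
print(f"{p:.0%}")  # 88%
```

Framing a low annual probability as a chance over the organisation’s whole planning horizon can be a useful counter to the “it will never happen” interpretation.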
Given all this, what techniques can be used to anticipate problems? We must remember that, for senior colleagues, the presentation of a risk assessment will usually not be the first time they have had to consider these risks (unless the risks are very new). Many risks will have been covered in the print media and on social media. The more extreme (high-impact, low-probability) risks may well have been the subject of TV documentaries or dramas. Any full discussion of risk has to be aware of these preconceptions and address them.
It is important to use a common and clear language. Quite a lot of work has gone into the communication of risk to the public over the last ten years. The UK Cabinet Office has published several documents, although these are sadly becoming out of date – this area of Cabinet Office work seems to have been cut back in recent years. The World Health Organisation has also produced some good guidance which can be used as a guide.
Don’t forget that the language used is very important, so do not overcomplicate discussions with technical terms. Where possible, make use of supporting material from other organisations, particularly those that may be seen as more trustworthy. Infographics are also very useful, assuming you have the resources to produce them or can source them in a suitable format.
One helpful approach is to use scenarios to explore the range of plausible outcomes. Try to engage with questions such as “what would happen to the organisation in this context?” or “what would happen to you if this occurred?”. If you do try this approach, remember that it is important not to allow the most strident voices to dominate, whether because they have strong personalities or because they are the most senior. Provide opportunities for all to contribute. Of course, this is easier said than done!
To help with this, give people time to think about the risk and reflect on it; anticipate questions and be ready to respond; and understand that responses may not be “rational” in the sense that people don’t think in terms of a technical risk assessment methodology. Highlight similar incidents in the past, and encourage people to think about early warnings: which warnings were (or were not) heeded in the past, and what early warnings could be useful in the future.