Account for non-obvious and unidentified risks
Risk identification is mostly a mental process: data analysis (processes, past events, financial figures, logs, analytics, etc.) and interviews lead to a list of identified risks.
Whatever the volume of data, the number of interviews, or the cleverness and mental or computing power applied to it, this process is bounded by human limitations and by the complexity and chaotic nature of the world. We will never be able to identify all the “would have never thought or imagined” risks and events, not to mention that, naturally, the memory of past events fades away with staff turnover, retirement, data loss and deletion, etc.
In the end, the one risk category that should always be at the top of the list is the one that groups non-obvious and unidentified risks. We often overlook these, just as we ignore many other risks in our daily lives. General George S. Patton once said, “Prepare for the unknown by studying how others in the past have coped with the unforeseeable and the unpredictable.” So let’s examine the meaning of “unforeseeable and unpredictable” risks as related to cybersecurity and how we can cope with them.
Be reasonable and evaluate cost/benefits
Risk management introduces the “likelihood” or “probability” attribute of an event or risk that may occur. Past events and actuarial data can help in this assessment. In the end, however, it is always a “best guess” or evaluation.
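To make the “best guess” nature of this evaluation concrete, here is a minimal sketch of a classic risk calculation, annualized loss expectancy (ALE = annual rate of occurrence × single loss expectancy). The risk names, rates, and loss figures below are purely hypothetical illustrations, not real actuarial data:

```python
# Minimal sketch of a classic risk evaluation: annualized loss expectancy
# (ALE = annual rate of occurrence x single loss expectancy).
# All figures are hypothetical illustrations, not real actuarial data.

def annualized_loss_expectancy(annual_rate: float, single_loss: float) -> float:
    """Best-guess expected yearly loss for one identified risk."""
    return annual_rate * single_loss

risks = {
    "ransomware":       (0.3, 500_000),    # ~once every 3-4 years, estimated impact
    "laptop theft":     (2.0, 5_000),      # ~twice a year
    "data-centre fire": (0.02, 2_000_000), # rare but severe
}

# Rank risks by expected yearly loss, highest first.
for name, (rate, loss) in sorted(
        risks.items(),
        key=lambda kv: -annualized_loss_expectancy(*kv[1])):
    print(f"{name:16s} ALE = {annualized_loss_expectancy(rate, loss):>10,.0f}")
```

However neat the arithmetic looks, every input is an estimate; the point of the passage above is that the output inherits that uncertainty.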
From Laplace’s demon, arguing that the universe is “the effect of its past and the cause of its future,” to Poincaré’s more human perspective on the meaning of chance, “a very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say that the effect is due to chance,” the chaos perspective has gained momentum over the years.
In an organisation, it’s not the flap of a butterfly’s wings that sets off a tornado, but rather a tiny uncertainty. One can only marvel at the sheer amount of investment and time organisations devote to attempting to achieve complete and accurate knowledge of their risks.
A key question is how much time and money should be invested in this quest. At what point does it become pure nonsense, creating a dangerous illusion of safety or security? We have all come across an exhausted Sisyphus of risk management, or found ourselves trapped in this insane loop. Uncertainty and incomplete knowledge are inherent to risk management, just as they are to the domains it examines and covers.
The threshold beyond which investing time and money becomes unreasonable or unbearable will obviously differ for each organisation, and it is one area where insurance companies can bring valuable insight. Indeed, the degree of incomplete knowledge will be factored into the premium of, for instance, a cyber-insurance policy. Because the insurer is a real risk stakeholder, this provides a factual, external view with sounder roots than any consulting firm could ever offer. Simply put, the overall investment in achieving better knowledge should translate into a premium reduction that compensates for it. If it doesn’t, then it might not be worth it.
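That rule of thumb reduces to a simple break-even test: invest in better risk knowledge only if the resulting premium reduction recoups the investment over the policy horizon. A minimal sketch, with purely hypothetical figures and a hypothetical helper name:

```python
# Hedged sketch of the premium break-even rule of thumb.
# All figures are hypothetical; real policies involve many more variables.

def knowledge_investment_pays_off(investment: float,
                                  premium_before: float,
                                  premium_after: float,
                                  years: int = 3) -> bool:
    """True if cumulative premium savings cover the up-front investment."""
    savings = (premium_before - premium_after) * years
    return savings >= investment

# e.g. a 60k risk assessment that cuts a 100k annual premium to 75k:
# 25k saved per year, 75k over three years, so the investment pays off.
print(knowledge_investment_pays_off(60_000, 100_000, 75_000))  # True
```

The horizon parameter matters: the same assessment judged over a single renewal year would fail the test, which is exactly the kind of threshold each organisation must set for itself.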
Prioritize crisis management over risk management
To prepare for unknown, unidentified risks, there is crisis management. One could certainly argue that crisis management should be the first and foremost control to put in place, and that organised risk management is a logical second step to prevent or reduce crises. This might indeed sound provocative, but what is the value of risk management if the sole explanation given to the Board, over the smoking ruins of a business, is “yes, we had identified that risk, but could not foresee the way it would unfold”?
Crisis management enables a sound compromise, allowing proportionate time and investment to be allocated to risk management and the quest for complete knowledge. Indeed, is it better to spend extra budget and people’s time on the quest for the perfect risk inventory, or to invest in crisis-management readiness? At some point, it becomes pointless to pile safety measures on top of security measures when no one is trained or ready to deal with an emergency.
Conduct drills in adverse conditions
Crisis management is also a venue for exploring the infamous Murphy’s Law (“if something can go wrong, it will”). Whereas in risk management risks are dealt with sequentially, in crisis management risks are – or should be – dealt with in parallel. In the chaos of the world, major power outages occur during fierce storms, while the family is driving back home, and not on a sunny day during a barbecue with a cooler full of cold drinks. Crisis-management drills should not be postponed because some people are unavailable or circumstances are otherwise adverse; on the contrary, they should be deliberately conducted on precisely such occasions.
Learn from others’ achievements and mistakes
General Patton’s advice encourages us to study how others in the past have coped with the unforeseeable and the unpredictable, in order to develop a sound crisis-readiness and drill program.
Crisis management is a fruit of experience, from a tree rooted in the chaotic nature of the world. Lessons learned from recent natural and man-made disasters show that many IT companies and other organisations are not prepared to face personal tragedy or wide-scale infrastructure collapse. As a crisis unfolds, whether a drill or the real thing, it clearly puts priorities to the test.
There are plenty of events and scenarios you can study and replay as part of a sustained drill program to give your organisation a stronger footing when a crisis erupts.
Find the right balance
The bottom line: Finding the right balance between risk management and crisis management is key to achieving a self-regulating, stable and relatively constant state of security and safety.
In the end, we must all face the chaos of the world. It’s not a matter of “if,” but rather of “when,” as somewhere there will always be Lorenz’s butterfly flapping its wings.