Working in crisis management, it doesn’t take long to notice that the toughest cases can also be the most undervalued. It is hard to tell just how much damage a crisis could have caused, so when effective damage control succeeds, clients, and even specialists themselves, may never fully appreciate the value of what was averted. When crisis managers help prevent negative reports from being published, craft comms that defuse budding crises, or train clients on what not to say, they are doing the part of crisis management that is most readily under-appreciated. At the start of a new year, no case illustrates this point better than Y2K: the crisis that didn’t happen.
Y2K
As the new millennium approached, an impending crisis had the world on edge. The Y2K bug had the media questioning whether computers, and the systems that depended on them (virtually everything), would crash, and whether planes might fall out of the sky once the clock struck midnight on January 1, 2000. However, when the new year came around, (almost) all was well. The revisionist narrative was almost immediate: the ‘crisis’ was written off as a conspiracy or an overreaction. In reality, Y2K perfectly demonstrated what can happen when you anticipate and diligently address risks.
What was the Y2K bug?
The Y2K bug was a computer flaw that had the potential to cause problems when dealing with dates beyond December 31, 1999. Many computer programs represented four-digit years with only the final two digits — so, 1982 was represented by ‘82’. This meant the year 2000 would be indistinguishable from 1900. Minor as this glitch sounded, it had the power to cause major malfunctions, throw off important calculations, and overwrite records across industries including manufacturing, aviation and financial services.
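To make that arithmetic concrete, here is a minimal sketch, in Python with hypothetical function names, of how a two-digit year field breaks a simple interval calculation, and how the common ‘windowing’ fix applied during Y2K remediation worked. It is an illustration of the general problem, not code from any actual legacy system.

```python
def years_between(start_yy: int, end_yy: int) -> int:
    """Naive interval calculation on two-digit years, as many legacy programs did."""
    return end_yy - start_yy

# A loan issued in 1982, checked in 1999: works as expected.
print(years_between(82, 99))   # 17

# The same loan checked in 2000: '00' is indistinguishable from 1900,
# so the interval comes out as -82 instead of 18.
print(years_between(82, 0))    # -82

def expand_year(yy: int, pivot: int = 50) -> int:
    """Date 'windowing': interpret two-digit years below the pivot as 20xx, otherwise 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

# With the windowed years, the calculation is correct again.
print(expand_year(0) - expand_year(82))   # 18
```

Fixes like this had to be found, tested, and rolled out across millions of lines of code before the deadline, which is what made the remediation effort so large.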
Upon recognizing the risk and the potential scale of impact, entire departments were formed within various institutions with the goal of preventing this crisis from occurring. These teams, composed of a variety of experts, planned and implemented tight regulation tied to mandatory testing, and provided trustworthy progress reports. Those actions were vital in preventing what could have been an enormous catastrophe at the turn of the century. Today, Y2K is regarded by many as ‘insignificant’ and an ‘overreaction’, but had those experts not stepped up, it might have been a different matter altogether.
As we continue to see crises arise seemingly on a daily basis, perhaps the most valuable lesson to take from Y2K is that we can prevent many crises by taking risks seriously, planning appropriately, and listening to experts.
