Protection and Decontamination
Preparing for the next emergency based on the past is futile

Tacit knowledge can be described as the kind of wisdom that experts accrue over their careers. This knowledge becomes so deeply embedded in their thinking that it shapes the mental models (or, to give them their proper name, ‘heuristics’) that allow them to find effective solutions when reacting to unexpected and emergency situations.

So, what’s the problem?

Well, tacit knowledge is so deeply ingrained in these individuals that they are often unaware they are using it. Extracting, communicating and sharing the tacit knowledge of experts is therefore a major challenge in performance psychology.

Whilst we have our own experience of emergency protection and contingency planning, this article seeks to stimulate discussion of innovative new solutions based on the principles of good leadership and behavioural science. We hope that, in doing so, you may become aware of some of your own tacit knowledge and be able both to appreciate its value and to share it with your colleagues.

We live in an increasingly uncertain world; the likelihood of major incidents, emergency situations and disasters has grown in recent years. Climate change, globalisation, economic and political instability and the rise of automation and Artificial Intelligence have revolutionised the ways we do business. Whilst these situations are becoming more difficult to predict, effective management of such incidents is crucial to ensure that members of the public and our workforce are able and willing to take appropriate protective action for themselves and others. Those responsible for preparing for and managing such unpredictable events are finding the task much more challenging in a rapidly changing world.



They know that “no plan will survive the first contact”, and there is even evidence to suggest that preparing for the next emergency based on what has happened in the past is futile. It is often said that “we plan for the last event, not the next one.” There is a tendency to base assumptions about the size and characteristics of future events on the historical evidence of similar situations in the recent past. But what if the next event is entirely out of character, unlike anything that has happened before?

Recently we’ve been starting our courses and training events in a slightly different way. We have always highlighted our “Ground Rules” for our time together: we create an environment of psychological safety in which everyone can share their ideas, cultivating a feeling of curiosity and excitement and a sense of fun in our learning experience. Then there is the inevitable moment when we need to identify and discuss any emergency procedures or alarms there might be on site. Very often, people have travelled from various locations around the world to be together and they might not be familiar with the risks on site or the systems in place that need to be followed.

Now, when we get to the emergency procedures, everyone is usually aware that there will be a fire alarm or alert system on site. Of course, we say, “should the fire alarm go off today, we will need to evacuate the premises quickly but calmly”. We then ask if there are any other alarms or warning systems that we should be aware of, which is usually met with either a blank look or a confused stare. People tend to be much less aware of any alternative alarm systems that might be in place to cover the unlikely event of a chemical or gas leak, or even a terrorist attack warning system. Many of the locations we find ourselves in should certainly have such warning systems and immediate action procedures. Surprisingly, some organisations don’t have them at all, and even those that do often have poorly designed emergency procedures that are never rehearsed or properly stress tested. Indeed, envisaging such events causes many of us physical discomfort: by implementing an emergency procedure, we are subconsciously accepting that it is a real possibility that could happen to us.

Scientists and engineers in high-tech industries are accustomed to working with potentially deadly substances and materials, whether radioactive sources or genetically modified retroviruses, and these experts are usually extremely well educated and trained to manage the risks of their work. Tightly regulated work environments and smart use of monitoring equipment make exposure and loss-of-containment incidents rare, yet when they do occur the consequences are severe and rectifying them is extremely costly and time-consuming, sometimes requiring international governmental efforts.

For these reasons, such industries try, as best they can, to operate in a highly controlled environment. The idea is that systems and staff are subject to multiple checks, measures and built-in redundancies so that the likelihood of a catastrophic event is significantly reduced. Such systems are heavily reliant on operator compliance at critical moments, and past experience suggests that reliance on compliance alone, even in such high-risk environments, is not especially effective.

Your smartest employees are your biggest liabilities


Yes, you read that right. In such highly regulated work environments, it is important to consider that both the extreme level of safety measures and the seniority and expertise of the workforce in industries such as nuclear power, pharmaceuticals and medicine can breed complacency. When a threat is invisible, whether radiation or viral particles, it is much harder to maintain vigilance and remain cognisant of the risk. In fact, unlike most industries, where accidents typically occur due to unpredicted slips or lapses by workers at the operational level, in these high-tech sectors it is wilful acts of non-compliance, or violations, that are of most serious concern, particularly by mid-level to senior engineers and R&D staff.

These people are not your typical ‘rebel’ employees; they do not delight in breaking rules or wish to rebel against them. They genuinely believe that they are making a better decision than the one advised by company protocols and SOPs, and they generally have the leadership and influencing skills to convince others that they know best in these situations. These are the colleagues you probably look up to, the high fliers who make big decisions and have received a steady stream of positive feedback and accolades throughout their careers. These are perhaps the biggest dangers to high-tech organisations in emergency situations.

It is not so easy to protect your organisation from this kind of risk, because these individuals often have the inherent authority and confidence to override any checks and balances. Indeed, due to their seniority and expertise, warnings and consequences are even less effective on them than on lower-level employees. They have less to fear: even if they are fired for a wilful violation, they are convinced that they will be able to find new employment relatively easily.

Bucking the system


An example of how this phenomenon has manifested in other areas of risk is the team of investment bankers that manages the $39 billion endowment of Harvard University. These are some of the brightest financial minds around, operating at the direction of one of the world’s most prestigious centres of learning. You would expect such a team of experts to make excellent investment decisions that perform significantly better than the returns gained by simply investing in an index fund, i.e. betting on the whole market rather than carefully handpicking which stocks to invest in.

Yet, over the last decade, whilst the Harvard endowment saw a 4.1% return, the more modest index fund option, available to anyone, returned 8.1% over the same period. Even in the medical field, there has been a shift towards a ‘systems approach’ to diagnosis and treatment, providing a framework for doctors to work through in order to reduce the effects of human bias and overconfidence.
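To put those figures in perspective, here is a minimal illustrative sketch in Python, assuming the quoted 4.1% and 8.1% are annualised returns compounded over the ten-year period (the compounding basis and the starting value are our assumptions, not stated above); it simply compares how the same starting sum would grow at each rate.

# Illustrative sketch only: compound growth at the two annualised returns
# quoted above (4.1% for the endowment vs 8.1% for an index fund).
# The starting value and compounding basis are assumptions for illustration.

def grow(initial: float, annual_return: float, years: int) -> float:
    """Compound an initial sum at a fixed annual return for a number of years."""
    return initial * (1 + annual_return) ** years

initial = 100.0   # hypothetical starting value
years = 10

endowment = grow(initial, 0.041, years)    # roughly 149 after ten years
index_fund = grow(initial, 0.081, years)   # roughly 218 after ten years

print(f"At 4.1% a year: {endowment:.1f}")
print(f"At 8.1% a year: {index_fund:.1f}")

Under these assumptions, the ‘boring’ index fund would end the decade worth roughly half as much again as the expertly managed portfolio.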

Indeed, some of the world’s most dangerous incidents have arguably been caused by the overconfident actions of specialists and experts; for example, the Chernobyl disaster in 1986 and, more recently, the failure to contain the Ebola outbreak of 2014, which led to further spread of a disease that has yet to be eradicated. Both incidents involved the failings of well-trained senior staff, whose overconfidence and wilful violations of procedures led to potential catastrophe.

The implementation of carefully considered collaborative plans across multiple NGOs can only be as robust as the discipline of the employees on the ground making decisions. It is important to consider that, unlike typical ‘rebel’ employees who might eschew their lab coats or hard hats, some of the individuals involved in these incidents acted from genuinely selfless and well-intentioned motives.

Altruistic acts


Back in 2014, nurse Pauline Cafferkey selflessly travelled from Scotland to assist with the treatment of Ebola patients in West Africa, and contracted the disease herself. That Cafferkey contracted Ebola whilst working with patients was, of course, the result of an innocent human error, a slip or lapse that must have placed her in direct contact with the virus. However, on return to the UK and screening for potential Ebola infection, another medic and nurse who had accompanied her on the trip admitted to falsifying her body temperature reading, which, if reported correctly, would have triggered a closer examination and quarantine procedures.


SOURCE:

https://www.hsimagazine.com/article/protection-and-decontamination