Why Safety Drops Over Time
How many times have you heard that a company had a good safety program, but its standards dropped over time?
Your mobile phone bleeps while you are driving, and you can’t resist the temptation to look. After all, it could be important. Most people check their messages and continue driving without incident.
Human psychology tells us that every time we break a rule and nothing bad happens, we become more convinced that the poor practice is OK.
For example, suppose you were told not to touch piping in your work area placarded DANGER – Do Not Touch – Hot Surface, yet every time you checked, the pipe was cold and safe to touch. Over time, your experience (pipe cold, safe to touch) will reinforce and build a strong mental picture that the warning (Danger – Do Not Touch – Hot Surface) is false and the pipe is safe.
Now what happens when the boiler, which has been off for many months, is turned back on? You touch a hot pipe and burn your hand. The outcome of this example is generally not too serious, but if we follow the same mental process in more complex, higher-risk environments, the consequences can be catastrophic.
An interesting case study on “normalisation of deviance” in CASA Flight Safety News (Sep–Oct 2017) provides further insight into how good safety standards can be compromised, even in safety-conscious and professional environments.
The NASA Space Shuttle disasters of 1986, when frozen O-rings failed, and 2003, when a wing damaged by a foam strike failed, demonstrate how normalisation of deviance can creep into organisations.
Like answering the mobile phone while driving or touching pipes with warning signs, so too can high-consequence events with uncertain likelihood build familiarity, until at-risk practices become normal.
The Challenger Space Shuttle broke apart on take-off in 1986 after numerous other flights in similar freezing conditions had gone without incident, and the Columbia Space Shuttle disintegrated on re-entry in 2003 after previous flights had survived foam strikes without critical damage.
NASA had identified these risks, and the engineering specialists were acutely aware of the potential consequences for flight safety and had established safety standards. But senior personnel, under constant pressure to keep Space Shuttle flights on schedule, adjusted their perception of the risks with each subsequent “safe” flight. Eventually, the identified risks combined with subtle changes in circumstances and resulted in disaster. This is where normalisation of deviance ultimately leads.
As a qualified safety engineer, I have always had the view that we must consider the worst-case consequences, and if we can’t live with that outcome then we must act to better control the risks.
Many organisations and governments wait until the disaster happens and then over-react, instead of applying good risk management practices consistently.
So, what should you do to stop normalisation of deviance creeping into your organisation? We believe in setting clear safety standards, training all relevant workers (with at least annual refreshers), and regularly auditing against those standards.