The aviation industry is hailed as a pioneer of safety practices, of open reporting, of just culture, and of learning from its mistakes. And given its remarkable safety record, this reputation is perhaps justified. Nevertheless, it would be both complacent and counter to those values themselves to believe that the goal of safety has already been achieved. Better is a journey, never a destination.
The paradox of safety is that safety can be dangerous, while danger can make us safe. The fact that a company might have gone twenty years without an accident or a serious incident might demonstrate a good safety record, but it does not demonstrate that it is safe. On the contrary, it is likely to be at its most dangerous. Why?
Because past performance does not determine future performance. Any system, whether the economy, the environment, or the human body itself, adapts to its surroundings. If the surroundings seem safer, these systems tolerate more risk. As time passes since the last incident or accident, those surroundings will inevitably feel safer.
And as with any other system, the success of the safety culture in aviation demonstrates this paradox perfectly. Aviation has always been seen as an inherently dangerous activity, and it is the risks associated with the catastrophic consequences of air accidents that have driven it to become such an unusually safe one. Of course, the reverse is also true, and that is where the aviation professional should beware. Because as soon as we pat ourselves on the back for setting the benchmark in safety, for leading the field across safety-critical industries, and buy into our own hubris, we start to rack up the levels of risk once again. We become more dangerous.
None of this is to say that we have a false sense of security, because it isn’t false: aviation really has become safer – all else being equal. However, all else is often not equal. As our environment becomes more complex, so do our interactions, and with them the potential for unintended consequences and catastrophe. The question is, how do we keep innovating and adapting our safety systems to keep pace with the constantly changing environment in which they are set, and the constantly changing risk profiles that even the most successful safety cultures must address?
Addressing the problem of extreme complexity
Defeat under conditions of complexity occurs far more often despite great expertise, know-how and effort than from a lack of them.
In aviation, just as in many other technologically and organisationally complex fields, know-how and sophistication have increased so much in recent years that the result is a struggle to deliver on them. Another of the paradoxes of continuing technological development in aviation is that although its aim is to reduce workload, diminish the opportunities for human error, and make a pilot’s life easier, the huge advances have at the same time turned a pilot’s job into the art of managing extreme complexity. This raises the ultimate question of whether a level of complexity can be reached that can no longer, in fact, be humanly mastered.
The complexity of some aviation design concepts has grown so far that avoiding mistakes is becoming impossible even for the super-specialised and most experienced. What do you do when even the super-specialists fail? What do you do when expertise is not enough?
Globalise sharing of knowledge and expertise
The answer must be to increase the complexity of our response in line with the complexity of the problem. One pilot, one crew, one company alone will not have sufficient experience, expertise, or understanding – even in a lifetime of flying, of work, or of study – to match the demands of such complex systems.
Just as the first pioneers of safety management in aviation determined that improvement only comes from the analysis of failure, and that this is best achieved by sharing occurrences and learning from the mistakes of others, we must look for ways to progress this concept of safety.
At its core is communication, and the sharing of knowledge and ideas.
The same technological progress that has driven aircraft automation and complexity has, in the same short timescale, given us the tools to do this. The internet age; the in-your-hand electronic encyclopaedia that is your smartphone or tablet. Imagine a kind of professional Google, a specialised distilling of aircraft knowledge, by type, worldwide. Imagine a repository for every bit of know-how imaginable about your aircraft type, recorded and locked in, even from the design stage, or from experts long retired. Imagine having access to a system with the potential to answer almost any question you have, either by reference to published or passed-on knowledge, or in real time by being able to put it to all of the most expert practitioners in the world at once. Any engineering conundrum, any obscure malfunction, shared for future reference, comparison, and training.
In terms of the practicalities alone, it is now perfectly possible – if not simple – to construct a worldwide system to facilitate the sharing of all knowledge and expertise; all safety incidents and occurrences and investigations; all the lessons learnt by individual flight crews through training, experience, errors and chance; all of the same by the maintenance teams. This would add up to many hundreds of thousands of hours of flight and maintenance experience available at the touch of a button or the interrogation of a search engine, to any interested party.
It would fast-forward knowledge levels across aviation professionals by many years’ worth of experience, all distilled and organised for universal consumption. It would go to the core of addressing the problem of expertise versus complexity. And it would have the added consequence of a step change in safety.
It would, however, require a paradigm shift in how we are prepared to share information. Have faith; it wouldn’t be the first time that aviation has shown itself capable of pioneering significant cultural change to blaze a trail in safety concepts.
I, for one, would like to believe it is possible.
*Based on insights from:
- Atul Gawande, The Checklist Manifesto (London: Profile Books, 2011)
- Greg Ip, Foolproof: Why Safety Can Be Dangerous and How Danger Makes Us Safe (London: Headline, 2015)
- Matthew Syed, Black Box Thinking (London: John Murray, 2015)
Shifting the paradigm of commercial paranoia is key. I can remember many instances where operators have independently had similar issues with an OEM, a supplier, or a procedure, and have each dealt with the problem alone, sometimes not even aware that others were doing likewise. Such problems may well have been solved more quickly with a united approach.
It is always good to talk (openly and honestly).