It was early evening on 24 September 2022 when an offshore AW139 helicopter inbound to Houma-Terrebonne Airport in Louisiana, USA, declared a mayday. A lot had already happened in the cockpit by the time the co-pilot pressed the transmit switch…
We take a look at Normal Accident theory in the light of a recent accident: technology is both a risk control and a hazard in itself. The act of adding technology is at best risk-neutral. Continually adding more technology in the belief that we are adding more layers of defence to a system is flawed, because we are in fact adding more combinations of possible failure modes. In other words, there is a direct trade-off between increasing safety by adding more controls and decreasing safety by adding complexity.
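The combinatorial part of that argument can be made concrete with a small sketch (a hypothetical illustration, not from the original post): if each added component can in principle interact with every other, the number of possible pairwise interaction paths grows as n(n−1)/2, so each new "layer of defence" adds protection linearly while adding interaction complexity quadratically.

```python
from math import comb

def pairwise_interactions(n: int) -> int:
    """Possible pairwise interactions among n components: C(n, 2) = n(n-1)/2."""
    return comb(n, 2)

# Each added component brings one more "defence", but n new interaction
# paths with the components already present.
for n in (5, 10, 20, 40):
    print(f"{n} components -> {pairwise_interactions(n)} possible pairwise interactions")
# 5 -> 10, 10 -> 45, 20 -> 190, 40 -> 780
```

Doubling the component count roughly quadruples the interaction space, which is the flavour of Perrow's point: the failure modes that matter in a tightly coupled system are the unplanned interactions, not the components themselves.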
“For no other vehicle is the need for human factors research more critical, or more difficult.” — Sandra G. Hart
That’s a bold assertion that I had never heard anyone make before, and consequently I had never given much thought to whether it might be true. So let’s unpack that proposition a little by looking at the arguments the author offers to back it up…
Human exploits in aviation have always been closely linked to our fascination with speed. We admire speed in its many guises, and it remains a marker of achievement in almost any field you care to think of. In aviation, just as in many other walks of life, we often assume the faster the better. We associate speed with competence. But what if we could dissociate the idea of slowness from incompetence? What if instructors were made to teach the opposite? What if we came to associate a slow response with higher skill levels and greater professionalism?
What should startle and surprise training mean in an applied sense, and how should we approach it? Do the differences between airline transport flight profiles and helicopter operations mean we should look critically at startle and surprise from a rotary-wing perspective? Is it as significant a hazard in the low-level, high-workload, high-obstacle environment in which helicopter crews spend much of their time?
Also published in AirMed&Rescue, Nov 2021 edition.
Automation reduces workload, frees attentional resources for other tasks, and can fly the aircraft more accurately than any of us. It is simultaneously a terrible master that exposes many human limitations and appeals to many human weaknesses. In our bid to reduce crew workload across many different tasks and to increase situational awareness with tools such as GPS navigation on moving maps, synthetic terrain displays, and ground proximity warning systems, we have also opened a Pandora’s box of human factors to bring us back down to the ground with a bump. Sometimes literally.
Pretty much everyone in aviation is familiar with the concept of situation awareness. But as research interest in SA grew, the concept expanded from the individual level to how SA might apply in the context of larger and more complex systems. What does distributed SA actually mean? The idea is that SA is held by both human and non-human agents. Myriad technological artefacts within a system also hold some form of SA. Now if, like me, you initially struggle with the idea that an artefact (such as a radio, or altimeter) can have ‘awareness’, then bear with me…
The importance of an effective lookout: we’ve heard it from day one in aviation, a constant through our flying training days and beyond. The dangers of mid-air collision, obstacles, and controlled flight into terrain (CFIT) will always be there. These threats are not static, however; they are always evolving. Take the proliferation of drones as… Continue reading “On Lookout and helicopters”
Now seems like a good time to look beyond the dark prism of the current COVID-induced crisis in aviation to consider a future beyond the mire. The Chartered Institute of Ergonomics and Human Factors (CIEHF) recently published a White Paper called “The Human Dimension in Tomorrow’s Aviation System”. It’s made up of a series of… Continue reading “Is Human Factors in aviation at a crossroads?”
Cognitive Readiness in Search and Rescue operations: What is it? Do you have it? How do you get it? There’s a problem with training to learn to deal with the unexpected: we simply don’t know in advance what the objectives of any training or instruction should be. If you haven’t come across it already, Cognitive… Continue reading “Can you learn to deal with the unexpected & unpredictable?”
Human decision-making: extracts from Daniel Kahneman’s Thinking, Fast and Slow. Daniel Kahneman, a Nobel Prize winner for his work, first became famous for his 1974 article Judgment under Uncertainty: Heuristics and Biases. The article was produced from research funded by the US Department of Defense and the Office of Naval Research. He expanded this into a book… Continue reading “A machine for jumping to conclusions:”
Memory and meaning: I cdnuol’t blveiee taht I cluod aulaclty uesdnatnrd waht I was rdanieg. The phaonmneal pweor of the hmuan mnid. Aocdcrnig to rscheearch at Cmadrigbe Uinervtisy, it deosn’t mttaer in waht odrer the ltteers in a wrod are, the olny iprmoatnt thnig is taht the frist and lsat ltteer be in the rghit… Continue reading “Human perception & the mental model”