Hindsight: blessing or curse?

Have you ever read an accident report, had an incident recounted to you, or sat through a CRM case study that made you say out loud something like:

“What were they thinking?”

“Why on earth did they decide to do that?”

“How could they not have known?”

“They must have seen that coming, surely?”

If you have – and we all have – then you have fallen victim to probably the most powerful and omnipresent psychological bias out there: the hindsight bias. You have substituted what the individuals involved knew and understood about their situation at the time with your own reality – what you understand now about the situation in question.

No such thing as human error?

In his 2004 book Ten questions about human error, Sidney Dekker made the somewhat polemical argument (given the focus of much safety work around the world, before and since, on the management of human error) that error does not, in fact, exist at all. People do not make errors, he says. Instead, what we understand to be an error is just a post hoc interpretation of an event, constructed in the mind of the person examining it. In other words, it is only hindsight that allows us to claim an error was made.

Looking back, anyone can exercise expert judgement

Back when I was a junior co-pilot first learning SAR, a senior captain on my squadron had an incident whilst hoist training to a local lifeboat. To cut a long story short, the hoist cable snagged on a guardrail and parted. The remaining cable on the hoist end shot up under tension, peeled open the hoist housing (narrowly missing the hoist operator’s face in the process) and went through the main rotor blades, a length of it ending up wrapped around the rotor head.

(Photo: Nick Martin Photography)

That was not where the story ended, however, because instead of landing the helicopter on a nearby clifftop and inspecting the damage, the captain opted to fly back to base to continue with the second part of the planned sortie: a pressing task which involved one member of the crew donning a thick white beard and red felt over-suit, the aircraft adopting the callsign “Santa 2-4”, and Christmas presents being delivered by air to a festive event taking place nearby.

It wasn’t until the happy task was complete and the aircraft was shut down and pulled back into the hangar that the full extent of the damage from the hoisting incident became clear to one and all. The next morning, as the squadron gathered to brief and word spread of the mishap, the customary hushed assessments of certain individuals’ decision-making began in earnest.

“What were they thinking?” 

“How on earth did they decide to do that?”

“How could they not have known?”

I was just a beginner, but I too thought I knew a clear-cut case of poor airmanship when I saw it.

Locally rational decision-making

Hindsight is incredibly compelling in allowing us to construct our own view of what happened. When looking retrospectively from the position of bystander or interested observer, any error can look so obvious, the red flags waving away at us for all to see. “They failed to notice things they certainly should have noticed.” “They did not consider that which was staring them in the face.” “They should have done this… they could have done that…” But from the perspective of the crew inside this incident – any incident – perhaps that same decision was not erroneous. It made sense at the time. It is almost certain that they were doing what they were doing because they thought they were right, given their understanding of the situation and its pressures. 

If we want to begin to understand why it made sense for them to do what they did, we have to reconstruct the world as they were viewing it in the moment, and their individual circumstances that made the decisions and actions they took seem rational to them.

Dekker calls this ‘local rationality’. What exactly did they know, and what didn’t they know? What was their understanding of the situation in that time and place? What effect did their goals, pressures, and other constraints have on their thinking? Another person’s behaviour and decision-making can only be understood as rational if you can truly account for the very specific situational context in which it takes place. The problem is that, in practice, this is almost impossible to achieve. The ability of human factors to provide such answers lags well behind our ability to point to what was missed and where mistakes were made.

Human performance is embedded in, and systematically connected to, the situation in which it takes place.

Dekker (2004)

Professional helicopter crews who form part of high-performing teams never climb into multi-million-dollar aircraft with the intention of wrecking them, or their own lives or those of their colleagues. Neither do most people turn up to work to do a bad job. When we pose ourselves questions such as “How could they not have known?”, we should look for the answers not in the behaviour of those upon whom we are passing comment, but in our own failure to understand their behaviour, due to the gulf between our privileged, counterfactual frame of reference and the circumstances which framed their understanding of the situation at the time.

The cognitive function of hindsight

What is the cognitive function of hindsight? Dekker makes the case that hindsight isn’t only about the advantage of knowing the outcome of events. It is also a way we simplify and mentally order the extreme complexity of real-life occurrences. Reality is messy. It is not made up of a series of clear decision points or junctions on the road to a known outcome. When we debrief a flight or pick apart an incident or accident, we invariably work backwards from an outcome – an outcome that we already know about. As we do so, we tend to pick out a series of bifurcations in the path, or opportunities where people had the choice to recover a situation, steer a route away from trouble, or avoid a negative outcome – but did not take it.

Of course, knowledge of what data was critical at the time and what could be discarded, discounted, or ignored is privileged information that only comes with the omniscience of hindsight. To know what is truly relevant or critical to the outcome you need to know the outcome.

Data availability vs data observability

The difference between what data can be shown, post hoc, to have been physically available to the crew and what was actually observable to them is an important distinction to make. Data observability is hugely complex and almost impossible to judge. It depends upon individual cognitive capacity, which in turn depends upon the multiple demands on attention, workload, actual awareness of the situation, personal goals, expectations, knowledge, motivation, and a multiplicity of other performance-shaping factors. Yet, time and time again, hindsight bias drives us to argue that because data can be shown to have been physically available, it should have been picked up by the pilots or crew immersed in the incident.

Dekker makes an interesting observation about the problem of a growing chasm between data availability and data observability resulting from the rapid technological advances of recent years. Cockpit technology makes it ever easier to capture and record the ‘reality’ of how we carry out tasks. Voice recorders and flight data recorders have been around a long time, but are more widely used than ever, particularly in smaller aircraft such as helicopters. They are now accompanied by cockpit cameras, Flight Data Monitoring, GPS flight following, ADS-B, and other cockpit tools such as TCAS and EGPWS, all of which create an extensive electronic footprint from which we can recreate and reconstruct the world “as it was.”

All of this makes it very easy for us to point out that something should have been noticed and wasn’t. The problem is that capturing the data alone does not explain why it was not noticed, or why it was interpreted differently; it does not explain the mental models of the aircrew, their cognitive processes, or what they actually knew at the time. In summary, our ability to make sense of human behaviour, and to understand the human interface with the machine and the system, lags behind our ability to capture and register that behaviour.

The curse: pointing fingers

Hindsight is appealing to the human brain because we can use it to provide structure; to shape and order complexity and simplify indeterminacy. But human work in complex, dynamic environments such as aviation is very rarely a question of simple choices.

Counterfactual reasoning is frequently used in accident and incident analysis, both formally and informally. During this process it is easy to demonstrate inconsistencies between what is laid down in procedures and regulations and what people actually did. With the luxury of time and focus, anyone can expose a lack of knowledge or dig up rules which would have, or could have, been applied to the situation in question.

There is virtually always a mismatch between actual behaviour and written guidance that can be located in hindsight.

Dekker (2004)

But saying what people could have done in order to prevent a particular outcome does not explain why they did what they did. Focusing on what was not done explains nothing about what actually happened, or why. All of this can be – and often is – used to apportion blame. And perhaps that is the curse of hindsight.

The blessing: hindsight underpins learning

Where counterfactual reasoning does serve a purpose is in identifying and developing potential countermeasures against errors and future threats to our safety. So while hindsight isn’t always a useful tool for explaining, it is one for learning. As we have seen, it allows us to build an abstract model, a simplification, away from the rich context of the local reality of those involved. These models allow us to make predictions. Confronted by our own failures, or by failures that befell other people, learning is triggered: What do I do to avoid that happening again? What do I do to avoid that happening to me? This is what the debriefing process is all about. It is also what any accident or incident investigation process is supposed to be primarily about.

Counterintuitively, then, hindsight bias is not about the past. Its purpose is forward-looking, not backward-looking. We learn through failure. And we demonstrate learning by deciding or doing something differently in the future. The mechanism of hindsight plays a pivotal role as a catalyst for this to take place. Dekker himself concludes that “hindsight is not about explaining past failures. It is about preventing future ones.”

Based on Chapter 4, “Don’t Errors Exist?”, from Dekker, S. (2004). Ten questions about human error: A new view of human factors and system safety. CRC Press.
