Complacency. Meriting at least a dishonourable mention in almost all books on human factors, this supposedly pernicious attitude is universally villainised and cited as a factor in countless aeronautical mishaps and accident reports. But what actually is complacency? Can it be shown to exist? And if so, how do you know it is at play until Captain Hindsight points his all-knowing finger?
Blaming human error on complacency is as easy as it is lazy. Why?
Just like error itself, complacency is a symptom and not a cause.
Chickenpox is an illness commonly contracted in childhood, resulting in a characteristic rash of small, itchy blisters. The blisters are symptomatic of the varicella virus, but they are not what makes you ill. Complacency, just like error, is a manifestation of other factors further upstream in a system whose health – like that of the human body – is infinitely complex and influenced in a multitude of ways. Why, then, do we characterise complacency as a malaise in and of itself?
Sidney Dekker (2003) argues that the answer lies in the fact that we universally endow complacency with causal power without ever bothering to define what it is. He provides a raft of examples in which complacency is blamed for pilot error – most popularly in automation management – but points out that no one has attempted to explain what complacency is or to describe how it manifests. Complacency is constantly claimed to cause error, and is often cited as the source of decrements in attention and vigilance, but always without any effort to scientifically deconstruct how it comes about and what component parts make it up.
Defining Complacency
Getting a grip on an insightful definition of complacency is as slippery as scooping an octopus from a bucket of baby oil. A number of terms have been offered up to help define it, including boredom; overconfidence; contentment; unwarranted faith; uninformed self-satisfaction; over-reliance; a low index of suspicion; and a lack of awareness. Let’s look at some of these:
Complacency is about threat awareness. Supposedly, complacency makes us unaware of potential dangers or threats. And as you can’t protect against dangers you can’t see, it makes us vulnerable to change, to the unforeseen, and even to progress.
It is sometimes equated with laziness, but this is unsatisfactory because it implies an element of choice: while you can make choices that result in complacency, very few people decide to be complacent.
Overconfidence and complacency go hand in hand. Chuck Yeager famously pointed to complacency as the biggest challenge facing experienced pilots, and that is because overconfidence is usually born of success, even if that success amounts to nothing more than many years of uneventful aviating. Unlike laziness, overconfidence is not a choice but a state of mind.
‘Folk Models’
If these words haven’t helped to clarify much, there’s a reason. The real trouble is that the definitions on offer simply substitute one word for another and provide no useful explanation of what goes into being complacent. Fundamental to scientific enquiry is the ability to break a concept down into more elemental parts, a process which allows greater insight into the behaviour it hopes to explain.
The lack of a satisfactory definition is not just a semantic problem, but one of explanatory power. Because no description of complacency offers deeper insight into its meaning, it becomes a concept immune to critique. Aircraft accident? Let’s put it down to a complacent pilot! No one has stopped to explain the mechanism behind complacency – how it causes errors to be made, reduces awareness, or diverts attention – so it must be accepted as it is. And because it has never been scientifically specified, it cannot be falsified. Sidney Dekker coined the term ‘folk models’ to describe popular concepts like this which are endowed with causal power without ever being subjected to scientific rigour.
Complacency as a Warning
Complacency is often invoked as a warning, although we are rarely told how to avoid it.

In CRM training we pick over accident case studies to learn from what went wrong and why. Causal and contributory factors are discussed in detail. With the benefit of hindsight, and in the safe, stress-free, and – hopefully – cognitively stimulating environment of the classroom, the lessons are clear and often painfully obvious. No one is short of an opinion. And in every case study I have ever run, I guarantee there is at least one person in the room thinking, “No way. Not me. Never.” In hindsight they are convinced that the errors made were so evident and so avoidable that they wonder how the protagonists could have been blind to them. So we attribute complacency.
Whole books have been written on the subject without ever really addressing the conceptual vacuum in which it floats. One was published just last year (Len Herstein, Be Vigilant, 2021), offering an explanation of both the causes of complacency and its supposed antidote: vigilance.
The traditional response to complacency has always been to exhort people to be less complacent, to be more vigilant, or to strive for a greater level of flight discipline (Tony Kern, Flight Discipline, 1998). The trouble is, simply demanding a higher level of discipline and professionalism to avoid complacency is akin to addressing error conditions by telling pilots to think a bit harder before they act. It reflects an outdated view of safety and error management. A more profound study of the behaviour and preconditions that lead to complacency-induced errors has been neglected. Like error itself, complacency must have its roots intertwined in the organisational, task, and operating environments. It is a product of equipment, goals, pressures, limited resources, the team, culture, and all the other influencing factors of a complex system. If we really want to address complacency, that is where we must look for answers.
