What is a complex environment?
Put simply, a complex environment is a system or situation with too many elements and relationships to understand through simple analysis or logic. It is a landscape of multiple and diverse connections, and of dynamic, interdependent relationships, events, and processes. While there may be trends and patterns, they are entangled in such a way as to make them indiscernible, complicated by time delays, and subject to any number of feedback loops and knock-on effects.
In turn, a complex problem is one that is difficult to define and may change significantly in response to any chosen solution. It may not have a single ‘right answer’. It will have many interrelated causes, few or no precedents, can have multiple external influences, and is often prone to surprise.
The decisions we face in the aviation environment can often match this definition neatly. Ours is a landscape well cultivated with surprises, emergent change, and an ever-changing meteorology, both literally and figuratively. In this landscape, the problem or situation requiring a decision can often be unique, dynamic, unprecedented, difficult to define or bound, and without any clear set of solutions.
Decision-making in conditions of complexity
Sometimes we will be faced with a decision where we cannot apply a typical rule-based or practised response. Some aircraft malfunctions or emergencies can be dealt with by defaulting to a simple, clear, and immediate rule-based task. Others can be met with knowledge-based actions that demand a measure of cognitive effort on the part of the pilot. But when an ambiguous situation presents the pilot with symptoms or circumstances that they are unable to match to anything in their prior experience or knowledge, it is an altogether different prospect.
When we consider human limitations in the face of extreme complexity, our capacity to respond effectively can seem futile and our decisions less an exercise of judgement than a roll of the dice. These are the situations of ‘unknown unknowns’ which prior risk assessments and threat and error management will not have taken into account. Here too lies the greatest risk of startle effect brought on by fundamental surprise, where an entirely new situation takes pilots outside the framework they understand and requires a thorough re-evaluation starting from basic assumptions. In these circumstances, often – even with the benefit of our best judgement – the outcome will depend heavily on what some would term luck.
Despite this, and as in any informed decision-making process, taking a decision under conditions of complexity does involve deploying a toolset that we all possess to varying degrees. We can call it simply ‘Experience’, and it is made up of many elements: education, relationship networks, knowledge of past successes and choices, multiple frames of reference, cognitive insights, mental, physical and emotional wellness, and an awareness of related external and internal environmental pressures. We could go on.
When dealing with complex systems the ability to put your past experience and cognitive capabilities to best use is probably the most important consideration of all. This means applying both your conscious and unconscious mind (with its memory and associative processing power) to help understand the situation in which you find yourself and place it within some context that you can understand. In whatever small way, by doing this you are pulling yourself back from the unknown to the known.
The good news is that we all know much more than we think we know. We spend our lives soaking up data, information and knowledge, and through our experiences and internal contemplation we develop understanding, insights and feelings of which we are often unaware. Even when we are unable to actively retrieve a piece of stored knowledge or experience from long-term memory, it can still influence our unconscious decision-making. This phenomenon is often interpreted as intuition. Take, for example, the firefighters who ‘just have a feeling’ that a building is about to explode: their senses are unconsciously matching the stimuli present with past experience of the same thing happening before.
The ability to use intuition and judgement to solve problems or react to situations without being able to explain how they know is a common characteristic of experts. As Malcolm Gladwell described in his book Outliers, it is said to take at least 10,000 hours, or ten years, of deliberate practice to become an expert in any given activity. However, many people may do something for 10,000 hours – driving a car over the course of a lifetime, for example – and still never get anywhere near expert level. Most people plateau and some even get worse. Those who do achieve what could be called expert status do so by actively learning through deliberate, investigative, and knowledge-seeking experience, developing intuition and building judgement through play and intensive interaction with the system and its environment. That is what is meant by deliberate practice.
A study of chess players concluded that “effortful practice” was the difference between people who played chess for many years while maintaining average skill and those who became master players in shorter periods of time. The master players, or experts, examined the patterns over and over again, studying them, looking at nuances, trying small changes to perturb the outcome (sense and respond) – generally “playing with” and studying the patterns. The report also noted that “… the expert relies not so much on an intrinsically stronger power of analysis as on a store of structured knowledge” (Ross, 2006, p. 67). In other words, experts use long-term working memory, pattern recognition and chunking, rather than logic, as their means of understanding and analysing.
This indicates that exerting mental effort while exploring complex situations embeds knowledge in the unconscious. By sorting, modifying, and generally playing with information – manipulating patterns and understanding their relationships to other patterns – a decision-maker can proactively develop intuition, insight and judgement relative to the domain of interest.
When it comes to error management in aviation, one of our principal countermeasures is the standardisation of practice and procedure. We employ key tools to assist us in this, such as the checklist in both its normal and emergency forms; in some cases these exist to guide our decision-making. But while standardisation in all its guises is possibly our most powerful weapon in the most common and routine areas of error management, it also constructs barriers to the development of experience, insight, and the deeper-level understanding described above. Alongside other risk-mitigating measures – weather minima, go/no-go items, and SOPs that prescribe strict boundaries to some flying parameters and manoeuvres and prohibit others – it creates a strangling effect on experience. Put simply, we know that we learn most from making mistakes. If we are effective in preventing mistakes, then we are necessarily restricting our learning, and we can no longer stretch the boundaries of our experience through experimentation and learning from error.
In the context of routine flying it is not hard to see how the cost-benefit balance rightly topples towards constraining risk. The dilemma is more difficult in complex situations, where we fall back more heavily on experience and cognitive problem solving. The well-known modern safety dilemma of automation dependency is an example. Safety is greatly enhanced by modern auto-flight systems, which for the most part do the job of flying better than any pilot could. But when such a system goes wrong, pilots who have only ever flown with its assistance may not have built up the experience, knowledge, and thought processes to operate safely without it. Furthermore, the growing complexity and ubiquity of these systems is such that no pilot will ever gain a complete understanding of their intricacies, even over the course of a whole career. The complexity has outgrown our capacity.
Impact of the team dynamic on decisions in a complex environment
Recognition of the above makes the contribution of a team more important than ever. Using teams to develop multiple perspectives, engage in dialogue, and drive critical thinking can improve the overall understanding of a complex situation, and thereby improve decision-making.
As individuals, some things are not always clear to us because we are simply too close to them. As we take in the external world and events around us, we believe that we observe, create a model in our minds, and thereby hold an accurate representation of the external world. Unfortunately, this is not the case. How we view a situation, what we look for, and how we interpret what we see all depend heavily on our past experience, expectations, concerns and goals.
It stands to reason that two brains and two perspectives are better than one: they increase the availability and understanding of relevant facts, data, contextual information and past behaviours. But it is not just the sum of available knowledge that helps; the dynamic of group decision-making can also make a large difference. When we find ourselves in confusing situations, facing ambiguities or paradoxes where we don’t know or understand what is happening, it is intelligent to recognise the limited mental capacity of any single individual.
Confusion, paradoxes and ambiguities are not made by the external reality of the situation itself; they are created by our own limitations of thinking, language and perspective. This is why teams can improve the understanding of complex situations. Multiple viewpoints, the sharing of ideas, and dialogue can surface and clarify confusions and uncertainties to an extent far greater than any one individual mind can manage.
Team-working also encourages mental flexibility: the capacity to maintain an open mind and not be prejudiced by past outcomes, organisational routines or standardised thinking. It is the ability to assess an occurrence in its environment objectively and to take whatever rational action makes sense – whether on the basis of logic or of intuition and judgement – to achieve decision-making goals. This flexibility means that decision-makers must be willing to move beyond conservative solutions that have proven themselves in the past and to consider new approaches whose outcomes are uncertain at best and perhaps completely unknown at worst. It also means that people must be capable and willing to work with each other, to work with others they have not worked with before, to work in new ways, and to take unfamiliar actions.
Team Resource Management
Training in Crew Resource Management concepts in aviation, and Team Resource Management in a wider context, provides obvious benefits in handling complex environments. The development of resilience, of strategies for responding to fundamental surprise, of mental flexibility, and of decision-making tools all contribute to this. So too does understanding the limitations of our situational awareness and our mental models, our own character and personality, and how we perceive and process the information hitting our senses. As ever in CRM, though, nothing is more important to decision-making in complex environments than a properly functioning and productive team dynamic, with effective communication at its core, which allows ideas and actions to be shared, questioned, and critiqued. Setting the conditions that encourage these processes should be our first priority in training and operating.
Adapted from: Bennet, Alex & Bennet, David. (2008). The Decision-Making Process for Complex Situations in a Complex Environment. Handbook on Decision Support Systems. (See link)