Complexity and Systems Thinking
We live and work in a world that is shrinking due to interconnectivity, but growing in terms of relationship diversity. Jostling world powers, expanding economies, emerging technologies, nefarious disruption, an increased rate of change, and the rapid diffusion of new innovations into society all act to increase threat, urgency and uncertainty (Lane and Down 2010, Sargut and McGrath 2011, Rohrbeck and Gemuenden 2011). The impact of this dynamic on the security environment has been to accelerate existing challenges and hasten the emergence of novel ones. Security challenges are fraught with complexity and share many attributes of complex adaptive systems.
Take radicalization, for instance, which is inspired by a growing multitude of factors, including disparity, inequality, economics, democratization, civil strife, population movement and interventions of many kinds by military, religious, corporate and political entities. At every turn, stakeholders with different values, priorities and views may explain the issue in numerous ways. There are never simple answers, and the problem's many roots are intertwined and codependent. Every solution is a one-time pilot that causes the problem to mutate into another form for which there is no known and tested solution. There is no certainty in action, the risks are unpredictable, and what is going on cannot be explained solely by reference to the behavior of each component. Systems thinking is the only way to begin to get a grip on the situation.
Systems theory recognizes that the dynamic interactions of many elements play a role in shaping systems. It provides insight into complex systems of all types and explains the occurrence of constantly changing emergent behaviors that manifest due to system interactions (Skyttner 2005, Becker 2009). A branch of systems theory known as complex adaptive systems (CAS) was developed by scientists as a means of explaining non-linear adaptation in systems such as economies, brain biology and immune systems (Holden 2005). A fundamental feature of CAS is that simple interactions between individual elements at a micro level can lead to very complex behaviors at a macro level (Railsback 2001).
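To make the micro-to-macro point concrete, the following minimal individual-based sketch (in the spirit of the modelling Railsback describes) may help; the rule, parameters and seed are invented for illustration, not drawn from the sources cited above. Sixty agents on a ring each apply one trivial local rule, yet the population self-organizes into blocs of agreement that no line of the rule specifies.

```python
import random

def step(states):
    """Each agent adopts the majority opinion of itself and its two
    neighbors on a ring -- a deliberately trivial micro-rule."""
    n = len(states)
    return [1 if states[i - 1] + states[i] + states[(i + 1) % n] >= 2 else 0
            for i in range(n)]

random.seed(1)                                    # illustrative seed
states = [random.randint(0, 1) for _ in range(60)]
for _ in range(20):                               # let the system settle
    states = step(states)

# Macro level: contiguous blocs of agreement have emerged -- a pattern
# specified nowhere in the one-line micro-rule above.
print("".join(map(str, states)))
```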
While practical advances have been slow to materialize, proponents believe that CAS concepts, such as systems emergence, could contribute significantly to our understanding of how decisions affect larger social system dynamics (Lansing 2003), or even provide opportunities for crisis scholars to explore the interactions between system elements that give rise to resilience (Holland 1992). This is because CAS is well suited to analyzing systems that fluctuate between the extremes of stability and chaos and that are 'moving targets' (Cutter et al 2008). Resilience itself is a goal rather than an end-point, for every community is on a path of constant adaptive improvement (Rose 2007). Thus, any system based on crisis resilience is adaptive, open and responsive to feedback learning loops from previous crises (Holden 2005).
Strategic Thinking
Decisions that concern the overall strategy and direction of an organization require consideration of the future and are thus circumscribed by complexity, uncertainty, novelty and ambiguity (Harper 2014). Decisions concerning complex adaptive problems are never easy because of the disconnect between expected and actual outcomes. Uncertain, complex situations render decision makers incapable of determining what will lead an organization to victory (Vecchiato 2012). Managing this dynamic is thus increasingly recognized as an essential leadership quality.
Ideally, decision makers would assess all possible decision outcomes and select the most efficacious option (van der Heijden 1997). In a complex situation, however, the cognitive limitations of the human mind result in the creation of a simplified model of the problem, and full rationality is never achieved (Senge 1992, Robbins and Judge 2012).
Effective decision making in government requires an evaluation of potential social impacts and outcomes. In a complex system, any component may be a positive, neutral or disruptive force that impacts on services and political sustainability. Unfortunately, decision makers often shy away from taking the time to understand complexity and default to characterizing complex problems in terms of simple linear relationships, with the expectation that the future will be a continuation of the present (Senge 1992, Lane and Down 2010, Hammoud and Nash 2014).
This is why traditional risk assessment processes and problem-solving models, designed for non-complex, linear systems, often fail to provide adequate solutions and are not recommended in situations where a problem continually evolves due to complexity. Risk assessment approaches that weigh options by probability are likewise not recommended for crises deemed low probability, since probability weighting tends to push such crises off the preparedness agenda regardless of their potential impact.
Natural simplification strategies are known as heuristics. They include various methods that help people deal with complex situations by moving forward rather than delaying important decisions due to information overload and an inability to ascertain a safe level of risk (Senge 1992, Hall 2007). However, heuristics limit the quality of decision making in complex circumstances characterized by high risk, high stakes, uncertainty and urgency, because the uncertainty is either simplified, ignored through commitment bias or confirmation bias, or minimized through overconfidence (Sargut and McGrath 2011).
Rather than struggle to decipher a novel, uncertain situation, gain awareness of inherent cognitive limitations, recognize biases, and make strategic decisions with insight and confidence, decision makers often default to the assumption that previous experience, knowledge and historical information give them a complete understanding of a complex system (Hammoud and Nash 2014). A more systematic approach is thus required.
Strategic Simplicity
“Simplicity is the ultimate sophistication” is a quote often attributed to Leonardo da Vinci, and it has been grossly misused by those seeking an easy way out. Simplicity as a first resort in a complex situation is nothing more than laziness and foolishness. Complex situations and problems demand serious attention, and a concerted, significant effort is required to understand them as fully as possible. Only after a full appreciation of complexity does clever simplicity become sophisticated. Several means of achieving such sophisticated simplicity follow.
Improve awareness of information
In 1955, Luft and Ingham created the Johari Window model to illustrate and improve self-awareness and mutual understanding between individuals within and between groups. The model consists of a four-quadrant matrix whose axes are known vs unknown and self vs others (summarized below).
In 2002, United States Secretary of Defense Donald Rumsfeld made the Johari Window famous when referring to the lack of evidence linking the government of Iraq with the supply of weapons of mass destruction to terrorist groups.
- Open/Public (known to you, known to others): information that both you and others know.
- Blind spot (unknown to you, known to others): information that you do not know, but others know.
- Hidden (known to you, unknown to others): information that you know but others do not.
- Unknown (unknown to you, unknown to others): information that neither you nor others know.
He said, “Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns — the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones” (Defense.gov 2002).
Dangerous rationalizations or assumptions can dramatically increase the exposure of an organization to risk (Pearson et al 1997). One of these is assuming that you know enough about a problem when you make a decision. This is dangerous because the Unknown quadrant of the Johari Window represents information that is unknown to all parties. If it is small, then the risk of unforeseen events is low, but if it is large, then risks abound. Going to war with a country about which you know little is a prime example of huge unknown, and therefore unassessed, risk. In a complex situation, always reveal what you can to minimize the Hidden quadrant and interact with all relevant stakeholders to minimize the Blind quadrant.
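As a hypothetical illustration of how such an audit might be operationalized, the sketch below maps items of information onto the four quadrants before a decision is committed; the items and function names are invented, not part of Luft and Ingham's model.

```python
def johari_quadrant(known_to_self: bool, known_to_others: bool) -> str:
    """Place one item of information in the Johari Window."""
    if known_to_self and known_to_others:
        return "Open"      # common ground
    if known_to_self:
        return "Hidden"    # shrink by disclosing what you can
    if known_to_others:
        return "Blind"     # shrink by consulting stakeholders
    return "Unknown"       # residual, unassessable risk

# Hypothetical pre-decision audit of the information base.
items = [("own readiness levels", True, False),
         ("adversary intent", False, True),
         ("shared treaty terms", True, True),
         ("second-order effects", False, False)]
for name, self_knows, others_know in items:
    print(f"{name:22s} -> {johari_quadrant(self_knows, others_know)}")
```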
Broaden analysis
Every crisis demands a broadening of analysis and an expansion of options; however, crises often have the exact opposite effect. During periods of stress and uncertainty, individuals and organizations typically narrow their analysis, which results in the consideration of fewer options. One feature of most crises that is initially denied or resisted most vigorously is the need for ethical action. Unfortunately, the longer an organization delays adopting ethical actions, the greater the chance that the initial crisis will grow and that the organization will be perceived as a villain. Since ethical action is usually adopted or forced in the end, the fault lies in organizations misunderstanding the different aspects of a complex problem, which can lead to decisions that result in more problems.
To avoid bad decisions, organizations must look at problems from different perspectives, which requires an understanding of the body of knowledge associated with each perspective and a senior-level person who can coordinate the use of this knowledge (Mitroff 1998). All problems have significant aspects from each of four perspectives, and a failure to consider one or more of them almost guarantees bad decisions.
- Scientific/Technical/Impersonal knowledge concerns how and why things are the way they are. This basic knowledge is brought to bear on scientific and technical problems and is the foundation for most professional careers.
- Interpersonal knowledge relates to how people socialize, connect and relate to each other in families, institutions and communities.
- Existential knowledge refers to basic questions, such as: Why am I here? What is my purpose? How can I make sense of this crisis? Do I need to change to adapt?
- Systemic knowledge concerns identifying our place in the systems in which we exist, how our actions and ideas in the micro context transfer to the macro, and pattern recognition.
Not all four perspectives are equally important in every problem, although in principle all are present. If one of these perspectives is overlooked or downplayed during problem analysis, our understanding of the problem will be incomplete and solutions may omit essential elements. A minimal completeness check along these lines is sketched below.
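The following sketch uses a hypothetical analysis and invented labels (Mitroff does not prescribe any such implementation) to flag the perspectives a problem analysis leaves empty:

```python
REQUIRED = {"scientific/technical", "interpersonal", "existential", "systemic"}

def missing_perspectives(analysis: dict) -> set:
    """Return the perspectives the analysis leaves empty -- each
    omission signals an incomplete view of the problem."""
    return {p for p in REQUIRED if not analysis.get(p)}

# Hypothetical analysis of an infrastructure failure treated, so far,
# purely as an engineering problem.
analysis = {
    "scientific/technical": ["root-cause fault tree"],
    "systemic": ["upstream supplier dependencies"],
}
print(missing_perspectives(analysis))  # {'interpersonal', 'existential'}
```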
Prepare for categories of threats rather than individual threats
There are many different types of crises, but they can be categorized into major families according to basic similarities in their preparedness and response requirements. Where a typical approach to crisis management is to plan for a few specific crises, often within the same crisis family, a comprehensive approach includes plans for at least one specific crisis in each crisis family. High-resilience organizations prepare for a broader selection of crisis families (a simple coverage check of this kind is sketched after the list below).
Mitroff and Anagnos (2001) provide six reasons to support this approach.
- Most organizations plan for crises in only one or two families and focus on natural disasters, such as wildfire, earthquake or flood.
- Organizations that broaden their preparations beyond natural disasters most often do so only for crises that are specific to their industry. For instance, restaurants will prepare for food contamination, petroleum companies for explosions, and hospitals for infection outbreaks. Such crises are considered a normal and routine part of the industry since they are unfortunately part of regular operating experience.
- Organizations have to revisit their plans because crises continually evolve and major crises occur not only because of what an organization knows, anticipates, and plans for, but just as much because of what it does not know and does not anticipate.
- Every organization should plan for the occurrence of at least one crisis in each of the crisis families for the reason that each type can happen to any organization. Failing to plan for any of the major crisis families is not safe.
- Planning for every specific type of crisis within each of the families is impossible and unnecessary, and planning for only those types of crises that have been experienced, according to traditional risk assessment outcomes, is shortsighted. No crisis ever follows a predictable path and no crisis plan ever works entirely as intended, so it is acceptable to prepare for the occurrence of at least one type within each of the families. The most valuable aspect of this process is that people think about unthinkable crises prior to their occurrence. Leaders who anticipate the unthinkable are not paralyzed when it occurs.
- In a more complex world, any crisis is capable of setting off a secondary crisis or even a cascade of subsidiary crises. Every crisis is thus capable of being both the cause and the effect of any other crisis. The best organizations thus prepare for individual crises and the simultaneous occurrence of multiple crises. This is done by studying past crises and looking for patterns and interconnections between them. Systems mapping is useful in this context to understand how crises unfold over time and how their effects are felt both within and beyond the organization.
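The family-level logic lends itself to a simple audit. The sketch below checks a plan portfolio for families with no coverage at all; the family names and portfolio are invented for illustration and are not Mitroff and Anagnos's canonical taxonomy.

```python
# Hypothetical crisis families; illustrative, not a canonical taxonomy.
CRISIS_FAMILIES = {
    "natural": {"wildfire", "earthquake", "flood"},
    "economic": {"market crash", "extortion"},
    "informational": {"data breach", "disinformation"},
    "reputational": {"scandal", "blame campaign"},
    "physical": {"explosion", "product failure"},
    "human-resource": {"workplace violence", "loss of key staff"},
}

def uncovered_families(plans: set) -> list:
    """Return families for which the portfolio holds no plan at all --
    the gaps a family-level approach is meant to close."""
    return [family for family, members in CRISIS_FAMILIES.items()
            if not members & plans]

plans = {"wildfire", "data breach"}   # a typically narrow portfolio
print(uncovered_families(plans))
# -> ['economic', 'reputational', 'physical', 'human-resource']
```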
Narrow focus through discard
Eric Berlow presented a TED talk on natural ecosystems entitled “Simplifying complexity,” in which he argued that, for any problem with many moving parts that all influence one another, the more complex the problem appears – and the more resistant to change it seems – the easier it may be to understand and solve.
The talk explored how embracing complexity can lead to simple answers, how complexity theory helps us harness more creativity to solve difficult problems, how complex systems can be mapped to identify their most influential agents, and how we can use our knowledge of ecosystems to help solve societal problems. According to Berlow, the answer to overcoming the challenges of complex systems is simpler than you think. Through his research, he demonstrates that “the more you embrace complexity, the better chance you have of finding simple answers.”
Using Berlow’s method, a complex system can be simplified using metrics and other technology tools that help visualize and reduce complexity. All elements in a model that are more than three steps away from the particular issue under examination are discarded, and much of the complexity is thus removed.
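Reduced to its essentials, the discard rule is a breadth-first search of limited depth over an influence map. The sketch below is one possible reading of that rule, with an invented toy graph; Berlow's own analyses rely on richer network metrics and visualization tools.

```python
from collections import deque

def within_k_steps(graph: dict, focus: str, k: int = 3) -> set:
    """Keep only the nodes reachable within k steps of the focal
    issue; everything further away is discarded."""
    seen, frontier = {focus}, deque([(focus, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue                     # do not expand past k steps
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen

# Invented influence map: edges read "directly influences".
graph = {"conflict": ["governance", "livelihoods"],
         "governance": ["corruption"],
         "livelihoods": ["rainfall"],
         "corruption": ["aid flows"],
         "aid flows": ["donor politics"],   # 4 steps out: discarded
         "rainfall": []}
print(within_k_steps(graph, "conflict"))
```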
While the approach sounds convincing, it requires the identification of all agents and their relationships within a problem system before it can generate meaningful outcomes. It was developed from studies of natural ecosystems, which are more predictable and less complex than human social systems. In a complex human system, the agents and their relationships are almost never entirely and accurately known, and they are often in flux, with new agents and relationships developing over time. The method is thus interesting, but would be almost impossible to implement against a security problem that behaves like a complex adaptive system.
Narrow focus by identifying driving forces
System mapping exercises use tools such as causal loop diagrams, domain mapping and systems mapping to identify stakeholders, processes, structures and functions – the who, how, what and why – and to develop awareness of complex problems. These tools provide the basis for identifying and categorizing the basic trends and information that drive future change.
Driving forces tend to fall into the four categories of politics, economics, society and culture, and science and technology, denoted by the acronym PEST. More in-depth analyses may also consider environmental, legal and regulatory, and ethical drivers (STEEPLE). Analysis of change drivers is useful in identifying and categorizing basic trends and information about a range of contextual issues that influence the future (Schoemaker 1995, Henry 2008). In business, PEST analyses are often used to analyze the macro environment around organizations in order to understand market growth or decline (Weeks 2018).
In the context of security strategies, a PEST analysis can help us understand the external drivers that potentially affect outcomes. The process simplifies a complex problem by producing a list of drivers that require consideration when formulating a new strategy or intervention.
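As a hypothetical sketch, the bucketing that a basic PEST exercise produces can be expressed in a few lines; the drivers and their category assignments below are invented for illustration.

```python
# Minimal PEST bucketing of external drivers (illustrative values).
PEST = {"political": [], "economic": [], "social": [], "technological": []}

drivers = [("sanctions regime", "political"),
           ("currency volatility", "economic"),
           ("urban migration", "social"),
           ("cheap commercial drones", "technological"),
           ("election cycle", "political")]

for driver, category in drivers:
    PEST[category].append(driver)

# The result is the simplified artifact the text describes: a short,
# categorized list of drivers to weigh when formulating strategy.
for category, entries in PEST.items():
    print(f"{category:13s}: {', '.join(entries)}")
```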
Narrow focus through framing
A very different approach to resolving complex issues relates to consequence manipulation and management, where the objective is to minimize negative impact and maximize positive impact on one's own organization, and vice versa for targeted stakeholders. There are three levels of framing: the first is dictated by the type of entity involved, the second by the nature of the entity's interest, and the third by the nature of the messaging desired (a simple sketch of this structure follows the list below). The framing of an event is very powerful and may have a direct impact on the fate of politicians, policies and even institutions.
- Entity: At its most basic level, framing is a product of the type of organization involved. Of key importance are the nature of the crisis, the culture, mission and vision of the organization, and the psychology of its leadership. For instance, an Ebola outbreak would be framed differently by a health clinic, police, non-governmental organizations, government and commercial entities.
- Interest: Each entity is motivated by significantly different interests, aspirations and goals. It may strive for regulatory alignment, profit, influence and power, or environmental protection. These stakeholder perspectives fall under the broad STEEPLES categories: social, technological, environmental, economic, political, legislative, ethical and security. For instance, climate change may be viewed as a science problem, a communication problem, an operational problem, an existential problem, a pollution problem, a transnational cooperation problem, and so on.
- Messaging:
- Silver-lining: The most common type of message is one that casts a complex problem in a positive light. For instance, management of a natural disaster provides experience necessary to do better next time, which is a sign of growth.
- One-off: Casting a problem as a one-off stand-alone disturbance is equivalent to one of the dangerous rationalizations mentioned earlier. The premise is that we are not to blame because this never happened before and we don’t need to change because it will never happen again.
- Blame: Constructing blame as concentrated in certain other stakeholders, or as dispersed among a network of stakeholders, is a common tactic for shifting a problem onto others' shoulders, though it does not always work. For instance, when BP tried to blame its subcontractor for spilling oil into the Gulf of Mexico, it was punished by society.
- Failure: Casting a problem as a serious disturbance that is symptomatic of deep political and system failure. For instance, observe the jockeying of political parties when elections come close. They take particular issues and frame them as catastrophic failures to demonstrate why voters should vote a certain way.
- Violation: One of the most powerful framing methods involves casting the problem as a violation of a core value. For instance, the right to bear arms in the US is a core value that overrides any argument to limit arms. Likewise, the #MeToo movement gained powerful traction because it tapped into the core values of equality and fairness.
- Denial: Complete obfuscation is attempted by some organizations as a means of sweeping a problem under the carpet. It is attempted often, especially by corporations and governments, but has devastating outcomes if the truth becomes known.
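To keep the three levels distinct during analysis, a frame can be treated as a simple record of entity, interest and messaging. The sketch below is one hypothetical way to encode this; the field names and example values are invented.

```python
from dataclasses import dataclass

MESSAGING = {"silver-lining", "one-off", "blame",
             "failure", "violation", "denial"}

@dataclass
class Frame:
    """One framing of an event across the three levels."""
    entity: str     # level 1: who is framing (clinic, ministry, firm)
    interest: str   # level 2: the STEEPLES interest being served
    messaging: str  # level 3: one of the six message types above

    def __post_init__(self):
        if self.messaging not in MESSAGING:
            raise ValueError(f"unknown messaging type: {self.messaging}")

# Hypothetical: two stakeholders framing the same outbreak differently.
print(Frame("health ministry", "social", "silver-lining"))
print(Frame("opposition party", "political", "failure"))
```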
Conclusion
Crisis leaders and managers require a sound understanding of complex situations, which is best achieved through the use of proven tools that require systems thinking. The automatic inclination of many executive decision makers to prefer simple approaches and solutions renders them crisis-prone. Inadequate investment in understanding the entities, relationships, and potential social impacts and outcomes in a situation can only result in sub-par decisions with a higher chance of failure. In a complex crisis, any component of the system may be a positive, neutral or disruptive force that impacts on services and political sustainability. Crises demand serious attention, and significant effort is required to understand them as fully as possible. Once executives understand the complexity of a crisis, it is their job to identify suitable methods that simplify the situation for the purpose of making more effective and sophisticated decisions.
The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Daniel K. Inouye Asia Pacific Center for Security Studies, the Department of Defense, or the U.S. Government.
References
Becker P. Grasping the Hydra: The need for a holistic and systematic approach to disaster risk reduction. Jàmbá: Journal of Disaster Risk Studies 2009;2(1):1-13.
Cutter S, Barnes L, Berry M, Burton C, Evans E, Tate E, Webb J. A place-based model for understanding community resilience to natural disasters. Global Environmental Change 2008;18(4):598-606.
Defense.gov. News Transcript: DoD News Briefing – Secretary Rumsfeld and Gen. Myers, United States Department of Defense (defense.gov), 2002. http://archive.defense.gov/Transcripts/Transcript.aspx?TranscriptID=2636.
Hall K. Looking beneath the surface: The impact of psychology on corporate decision making. Managerial Law 2007;49(3):93-105.
Hammoud MS, Nash DP. What corporations do with foresight. European Journal of Futures Research 2014;2(42):1-20.
Harper SC. Rising above the turbulence. Industrial Engineer 2014;46(4):34-38.
Henry A. Understanding strategic management. Oxford University Press 2008. http://hsctoolkit.bis.gov.uk/The-tools.html.
Holden LM. Complex adaptive systems: concept analysis. Journal of Advanced Nursing 2005;52(6):651-657.
Holland J. Complex adaptive system theory. Daedalus 1992;121(1):17-30.
Lane OA, Down M. The art of managing for the future: leadership of turbulence. Management Decision 2010;48(4):512-527.
Lansing S. Complex adaptive systems. Annual Review of Anthropology 2003;32:183-204.
Mitroff I. Smart thinking for crazy times: the art of solving the right problems. Berrett-Koehler Publishers: San Francisco CA, 1998.
Mitroff II, Anagnos G. Managing crises before they happen. AMACOM: New York NY, 2001.
Pearson CM, Misra SK, Clair JA, Mitroff II. Managing the unthinkable. Organizational Dynamics 1997;26(2):51-64.
Railsback S. Concepts from complex adaptive systems as a framework for individual based modelling. Ecological Modelling 2001;139(1):47-62.
Robbins SP, Judge TA. Perception and individual decision making. In Organisational Behaviour 15th Ed., pp. 199-234. Boston: Pearson, 2012.
Rohrbeck R, Gemuenden HG. Corporate foresight: Its three roles in enhancing the innovation capacity of a firm. Technological Forecasting & Social Change 2011;78:231-243.
Rose A. Economic resilience to natural and man-made disasters: multidisciplinary origins and contextual dimensions. Environmental Hazards 2007;7(4):383-398.
Sargut G, McGrath RG. Learning to live with complexity. Harvard Business Review 2011;89(9):68-76.
Senge PM. Mental models. Planning Review 1992;20(2):4-44.
Schoemaker PJH. Scenario planning: a tool for strategic thinking. MIT Sloan Management Review: Winter 1995.
Skyttner L. General system theory: problems, perspective, practice. Singapore: World Scientific Publishing, 2005.
Van der Heijden K. Scenarios, strategies and the strategy process. Nijenrode Research Paper Series, Centre for Organisational Learning and Change, 1997;1997-01.
Vecchiato R. Environmental uncertainty, foresight and strategic decision making: An integrated study. Technological Forecasting & Social Change 2012;79:436-447.
Weeks A. PESTLE analysis. Chartered Institute of Personnel and Development, 2018. http://www.cipd.co.uk/hr-resources/factsheets/pestle-analysis.aspx.
Published: January 7, 2021
Category: Perspectives
Volume: 22 - 2021
Author: Deon Canyon