Understanding complex systems is vital to assessing risk, identifying critical points of failure, and optimising workflows and processes to bring greater safety and efficiency to all areas of society and industry.
Complex systems are many and varied, growing in number and impacting our daily lives, often in ways we don’t realise. Essential services such as health and social care, education, food and water supply, communications, finance, retail, transport and power supplies are all more interconnected and interdependent than ever before. As the COVID-19 pandemic has demonstrated, when one complex infrastructure system doesn’t work as expected or fails altogether, many other complex systems are also affected and impacts can be far-reaching, even catastrophic.
To understand complex systems better, we need to be able to discuss them more widely, with people working in different sectors and disciplines and across national and international boundaries.
In Spring 2019, Engineering X launched a five-year mission, Safer Complex Systems, to enhance the safety of complex infrastructure systems globally. Engineering X is an international collaboration, founded by the Royal Academy of Engineering and Lloyd’s Register Foundation, that brings together some of the world’s leading problem-solvers in international communities of practice to address the great challenges of our age. Those of us who have been involved in this mission are learning that improving the safety and management of complex systems is not easy, and the reasons why are themselves complex.
Complex, not complicated
There are challenges in defining what is meant by complex systems, in particular why a complex system is different from a complicated system. In the latter, predicting outcomes is possible if you know the starting conditions and system boundaries. In a complex system, the same starting conditions can produce different outcomes, depending on how the elements of the system interact with each other and with other systems.
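As a purely illustrative toy sketch (not drawn from the Engineering X programme), the short Python example below contrasts the two ideas: a ‘complicated’ calculation that always returns the same answer for the same inputs, and a toy ‘complex’ system in which identical starting conditions lead to different outcomes because the history of interactions between its elements differs.

```python
import random

def complicated_system(load: float) -> float:
    """A 'complicated' system: the outcome is fully determined by the inputs."""
    # The same load always produces the same result.
    return 0.002 * load ** 1.5

def complex_system(initial_states: list, seed: int) -> float:
    """A toy 'complex' system: identical starting conditions, different outcomes.

    Each element repeatedly adjusts its state towards whichever neighbour it
    happens to interact with; the interaction history (standing in here for
    coupling to other systems) changes the final outcome.
    """
    rng = random.Random(seed)
    state = list(initial_states)
    for _ in range(100):
        i, j = rng.sample(range(len(state)), 2)
        state[i] = 0.5 * (state[i] + state[j])  # element i partially copies element j
    return sum(state) / len(state)

if __name__ == "__main__":
    # Complicated: repeat the calculation and the answer never changes.
    print(complicated_system(1000.0), complicated_system(1000.0))

    # Complex: same starting conditions, two different interaction histories,
    # two different outcomes.
    same_start = [1.0, 5.0, 9.0, 2.0]
    print(complex_system(same_start, seed=1), complex_system(same_start, seed=2))
```

The point is not the arithmetic but the structure: in the second function the outcome depends on the sequence of interactions, which knowledge of the starting conditions alone cannot pin down.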
Some complex systems are engineered, such as the trains in a city metro system: there is a plan, the vehicle performance is known in advance, and there are protocols and regulations in place. There is little ambiguity over geographical extent, assets, operations and responsibility for the safety of the network. Other systems can be ad hoc, with no central authority, players joining and leaving at will, and regulation spread across multiple jurisdictions. But all systems interact with legacy systems, creating a system of systems that is inherently complex.
Getting to grips with complex systems means understanding and accommodating not only changing elements, interdependencies and fluid boundaries but also the variability of human behaviour, experience, organisation and language within and at the interfaces of those systems.
Living and working safely in an ever more complex and unpredictable world, and with the systems within it, requires new tools and more adaptable approaches to education, training, policymaking, governance and regulation.
Engineers are used to designing safe systems. Engineering processes use modelling that makes assumptions about continuity of behaviour. A culture of management and technical standards and procedures is the norm.
But the world we have created is not composed of linear chains of cause and effect, and it is essential to understand safety in dynamic complex systems with fluid boundaries, in which there can be different perspectives on, and requirements for, safety, resilience, robustness, efficiency and antifragility.
Models and simulations can be helpful at the design stage at a range of conceptual levels, but how people might act in the system needs to be included explicitly. While safety is a good measure for some complex systems, such as self-driving cars, it is less useful when the adverse outcomes are hunger, destitution, civil unrest or shortened life expectancy.
Consideration of human behaviour in the design and analysis of complex systems remains, in most university courses, a final-year option rather than a core competence, and it is essential that this changes, especially for multidisciplinary courses.
We also need a different regulatory mindset. The public likes the comfort of having a named individual who is held responsible for safety, and much safety legislation is based on this principle. But when complex systems fail – particularly ad hoc systems – it is often impossible to identify an individual or single organisation that is responsible. Most engineers tend to think in a deterministic way, and lawyers are similar.
Developing a safer future together
The Engineering X Safer Complex Systems mission is itself proving to be a study in complex adaptive learning, but the case studies it is funding are already making a significant contribution to understanding whether it is possible to find common principles and new models for the governance, management and operation of complex systems. We must dispel the myth of the silver bullet and embrace diversity in people and approaches.
The Engineering X mission is seeking to expand its community of engineers and non-engineers working in academia, industry, government, the non-profit sector and elsewhere, meeting in the spirit of learning from each other. We know enough already to understand that if we can identify common principles and exchange insights, every one of us will, in some way, experience the benefit.
This is the first in a series of guest blogs from experts involved in the Royal Academy of Engineering’s Engineering X programme, an international collaboration founded with Lloyd’s Register Foundation that aims to bring together some of the world’s leading problem-solvers to address the great challenges of our age. Professor Brian Collins is chair of the case study selection panel for the Engineering X Safer Complex Systems mission. Republished from The Engineer.
Image credit: Martin Adams via Unsplash