Levels of Complexity


Complexity is all around us. Complexity is in ecosystems: where do they begin, where do they end, and what should count as the minimal unit of an ecosystem? Complexity is in the brain: how does a brain function if its neurons are continuously altered as a consequence of their interactions? Complexity is in the economy: what are the parameters we want to calculate, and how can we model the ever-changing mind-sets of buyers and sellers?

Indeed, complexity has become a prominent issue in many sciences, from physics and physiology to ecology and economics. Likewise, the applications of computer-based complexity studies range from highly theoretical issues (such as re-evaluating the explanatory power of Neo-Darwinism) to the most mundane affairs, such as re-organizing the cargo operations of Southwest Airlines.[1]

In the following pages, I offer a very brief introduction to the scientific vision underlying complexity studies (I). Against this background, I discuss different concepts of complexity and propose a hierarchy of different degrees of complexity: from complicated systems whose behavior is still governed by the collective properties of their constituent parts, to the unpredictable world of self-organizing, or even self-productive (autopoietic), systems (II). I then distinguish harder and softer approaches to the emergence of complexity (III). Finally, I offer some suggestions concerning the metaphysical and theological implications of complexity studies (IV).

I. Complexity Science: A New Interdisciplinary Field

With the advent of the computer, major parts of science are now shifting their emphasis from the study of the individual parts of matter — for example, atoms, cells, and molecules — towards the study of the general rules that produce the multiple forms of organization that we observe in nature. In this process of scientific revolution, everyday phenomena have again come into the focus of disciplines like physics and chemistry. Early examples are Ilya Prigogine’s study of dissipative systems and the theory of chaos. However, self-organizing structures seem to appear everywhere in nature, also in systems that cannot be described by the specific mathematics of chaos (e.g., the Lyapunov exponent). The building up of complex systems thus seems to be generated by general principles. In its strongest aspiration, complexity studies is the search for those general principles that generate the multifarious and yet highly structured world in which we live. Are we able to state a Fourth Law of thermodynamics, responsible for the overall increase of complexity in the history of the cosmos?[2]

In any case, the scientific question is no longer only “what are the constituents of nature?” (quarks, protons, neutrons, electrons, atoms, molecules, etc.), but also, and more prominently, “how does nature work?” In particular, what are the principles that make nature organize itself as it does? Certainly, today’s sciences both presuppose and produce a vast amount of empirical data. But science is not primarily about gathering itemized knowledge; science is not like stamp collecting. The practice of science is more often concerned with the formation of general models that are able to explain, at least in a theoretical manner, a vast variety of specified data — data which are already available or are found under the guidance of the proposed theory. Science is about relating hitherto disparate phenomena and making them intelligible within a comprehensive and coherent theory.
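As a side note on the mathematics of chaos just mentioned: the Lyapunov exponent measures how quickly neighboring trajectories drift apart, and it can be estimated numerically. Here is a minimal Python sketch for the textbook logistic map; the map and the parameter values are illustrative choices, not examples drawn from the discussion above:

```python
import math

def lyapunov_logistic(r, x0=0.1, n_transient=1000, n_iter=10000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x).
    A positive exponent signals chaos (nearby trajectories diverge
    exponentially); a negative one signals periodic, predictable order."""
    x = x0
    for _ in range(n_transient):  # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        # |f'(x)| = |r*(1 - 2x)| is the local stretching factor.
        total += math.log(max(abs(r * (1 - 2 * x)), 1e-12))
    return total / n_iter

print(lyapunov_logistic(3.2))  # negative: a stable period-2 cycle
print(lyapunov_logistic(4.0))  # roughly ln 2, positive: chaotic regime
```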

Complexity theory has radicalized this vision of science as something more than an empirical inquiry that aims to represent, or mirror, particular items of the world.[3] First of all, complexity theory is necessarily abstract, since it deals with possible worlds. A complexity theory of evolution, for example, will not rest content with a historical account of the long and detailed route of evolution from molecule to man. Complexity theories ask about possible evolutionary scenarios, and aim to model different ways that might have led from molecules to biologically organized organisms — on planet Earth or elsewhere. The extremely fast reiteration of different models of natural processes facilitates “games of life” and other ways of mimicking the possible avenues of evolution in “artificial life” simulations.[4] Imaginative thought experiments can now be tested, if not in reality, then at least in computer models.

Complex phenomena such as life and social systems cannot be handled with a step-by-step causal analysis; these systems are much too large, much too variegated, and much too fuzzy to allow for a precise explanation in terms of their constitutive components. What we can do, however, is construct simplified computer simulations of the complex realities of our everyday world.

Often it has been said — programmatically by the school of Empiricism — that the success of science should be identified with its predictive power. Obviously, the highly abstract nature of complexity theory defies this criterion. What can be predicted in computer simulations are not the details of the future, but statistical pictures of what would probably happen under these or those circumstances. Probability is all we have as a guideline for systems whose complexity defies an analytical approach. But when probabilities increase and add up, they end up forming very convincing pictures of how nature organizes itself and of how different pathways of evolution are possible.[5] It thus seems that complexity studies gravitate toward the concept of information and toward the computational aspects of the pattern formation that we see around us. In an interesting way, complexity studies focus on exactly that aspect of material reality that has been relatively neglected in modern physics. The modern concept of matter suggests that matter consists of an inseparable unity of (a) substance (matter in the classical sense), (b) energy (cf. relativity theory), and (c) information.[6] The concept of information may thus achieve an important place in the explanatory scheme of modern physics.

Many disciplines are involved in the study of complex systems. It is hardly a surprise, therefore, that there exists no consensus on the exact nature of ‘complexity’. A provisional definition could be that ‘complex systems’ are systems in which there is large variability (Bak 1997, 5). A coastline, for instance, is like a Chinese box: the more one looks at the micro-scale, down to the turns of the rocks and even the curves of the individual sand grains, the longer the coastline appears. Add to this that the coastline is ever changing as a consequence of waves and tides. Indeed, no computer is fast enough to answer the question of the length of any coastline. In this case, complexity is a simple result of the scaling problem.
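To make the scaling problem concrete, one can replace the real coastline with an idealized ‘mathematical coastline’, the Koch curve: every time the measuring ruler is shortened by a factor of three, each segment resolves into four segments a third as long, so the measured length grows without bound. A minimal sketch of this idealization (a stand-in for a real coastline, which it of course simplifies):

```python
def koch_length(n_levels, base=1.0):
    """Measured length of a Koch-curve 'coastline' when the ruler is
    shortened by a factor of 3 at each level: every segment is replaced
    by four segments a third as long, so length grows by 4/3 per level."""
    length = base
    for _ in range(n_levels):
        length *= 4.0 / 3.0
    return length

for level in range(7):
    ruler = 3.0 ** -level
    print(f"ruler = {ruler:.5f}   measured length = {koch_length(level):.3f}")
```

The shorter the ruler, the longer the coastline; for a genuine fractal the measured length diverges, which is the scaling problem in its purest form.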

An ontological definition of ‘complexity’ has been proposed by the systems theorist Niklas Luhmann: a complex system is one in which there are more possibilities than can be actualized.[7] According to systems theory, the large number of elements in a given system (imagine an ecosystem) means that not all elements can relate to all other elements; yet the survival of each individual organism depends not only upon the elements to which it relates directly (say, the prey), but also upon the state of all other elements within the ecosystem as a whole. “Complexity … means the need for selectivity, and the need for selectivity means contingency, and contingency means risk”.[8] Eventually, complexity becomes manifest in the contingency of system-environment relations. Unpredictability is here seen as ontologically grounded.

However, we should not fail to notice that there are different families of complexity theory. Complexity studies can hardly claim to constitute a new “synthesizing theory” of all sciences. Rather, the sciences of complexity comprise a cross-disciplinary research field shared by specialists from disciplines as diverse as physics, biology, sociology, mathematics, and computer science.[9] In an interesting combination, the sciences of complexity seem to incorporate both a ‘modern’ spirit of unification and formalization, and a ‘postmodern’ sense of plurality and of the need for re-specifying the concept of complexity as one goes from one domain to another, e.g. from physics to macro-biology, and from macro-biology to psychology and sociology.[10] Some approaches are purely computational, whereas others are qualitative in nature. After all, ‘complexity’ is an observer-relative concept whose definition depends on the presupposed framework. As put by Ricard Solé and Brian Goodwin, “My complexity may be your simplicity”.[11] For example, the bewildering variety of movements of ants in a colony may turn out to be relatively simple from the purview of a mathematical model of ant behavior.

II. Degrees of Complexity: A Typology

The computer scientist C.H. Bennett has suggested a measure for a structure’s degree of complexity by referring to its logical depth.[12] Logical depth is formally defined as the time needed (measured by the number of computational steps) for the shortest possible program to generate that structure, i.e., the time consumed from the input (the minimal algorithm) to the resulting output. On this definition, a system is complex if its description is difficult (or impossible) to compress into a short algorithm. A simple system, by contrast, is one which displays a regular, periodic pattern.
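Logical depth itself cannot be computed exactly, but the closely related notion of algorithmic (in)compressibility can at least be approximated with an ordinary compressor. The following is a rough Python sketch that uses compressed size as a crude, admittedly imperfect stand-in for the length of the shortest generating program:

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable proxy for the
    (uncomputable) length of the shortest program generating the data."""
    return len(zlib.compress(data, 9))

periodic = b"blah" * 250   # a regular, periodic pattern, 1000 bytes
noise = os.urandom(1000)   # maximal disorder, same length
print(compressed_size(periodic))  # small: highly compressible
print(compressed_size(noise))     # about 1000 or more: incompressible
```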

This definition of complexity is in line with Claude Shannon’s concept of information from 1948. Shannon defined the amount of information transferred in a message as equal to its entropy. The more disordered (i.e., algorithmically incompressible) a message is, the higher its information content; thus we could imagine a system so complex that its shortest description would be to repeat the individual steps of the whole system and then watch how it develops. By contrast, a series of identical signals (like blah-blah-blah) has a very low information content.
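The contrast can be checked directly. Here is a brief sketch computing first-order Shannon entropy (the sum of -p log2 p over the symbol frequencies p, which looks only at how often symbols occur, not at their order) for a repetitive and a random message:

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(message: str) -> float:
    """First-order Shannon entropy in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("blah" * 100))  # 2.0 bits: only four symbols recur
noise = "".join(random.choice(string.ascii_lowercase) for _ in range(1000))
print(shannon_entropy(noise))         # near log2(26) ~ 4.7 bits: disorder
```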

Appealing as this formal definition of information may be, it faces the problem of irrelevance. As phrased by the South African engineer Paul Cilliers, “if information equals entropy, then the message with the highest information content is one that is completely random” (Cilliers 1998, 8). In a similar vein, Bennett’s definition of complexity misses something that we aim to analyze when speaking about complexity, namely the complexity of organized systems. According to Bennett’s definition, the most complex picture on a TV screen would be one in which there is only noise coming through and no discernible picture. Bennett is right that a TV screen with a repetitive or periodic signal is very low in complexity (think of the still picture of a closed-down TV station), but something more than disorder is needed in order to account for organized states of complexity. What is needed in order to have an organized system is a recognizable pattern.

In fact, the complexity that pervades biological systems appears in the fertile regimes between highly regulated order (low complexity) and no order at all (high complexity). The genome, for example, contains both a generator of randomness (the different sorts of translocation and mutation in gene-replication that we see in the many different DNA-sequences) and a more stable structure responsible for processing the coding of proteins (the chemical bonding between the nucleotides A and T, and between G and C, respectively). As pointed out by Paul Davies, in living systems we have “not just any old random pattern, but a definite, narrowly specific … pattern”.[13] In short, in order to have organized complexity with qualitatively interesting features, we need to have (1) great variability (“logical depth”) plus (2) short- or long-term patterns.

What, then, would we need in order to have self-organized complexity? I think we would need (1) variability, (2) stable patterns, plus (3) some internal law-like principles that are able to generate the patterns. What is surprising in the evolution of complexity is that many pathways are laid down in the process of walking, and yet they are informed by an internal generative program which is neither the result of an a priori design, nor an ad hoc construct triggered by environmental factors.[14] In order to formulate the difference between merely complicated and genuinely complex forms of organization, one could use Paul Cilliers’ distinction between complex and complicated (1998, 3). A complicated system is one with a large number of components, say an aircraft, whose systemic behavior can be sufficiently described by reference to the parts of which it is made, and to a manageable recipe for taking environmental influences into account. A complex system, by contrast, is one whose behavior cannot be adequately understood by reference to its elements, and whose interactions with its environment therefore cannot be controlled in advance.

But could we perhaps further differentiate between varied forms of complexity? Even if it is always dangerous to propose definitions before one has introduced examples and test-cases, let me here propose some hopefully workable definitions of degrees of complexity. To each definition I will attach some examples. These are chosen in order to indicate how the same forms of complexity pervade technical systems, natural systems, and human beings. Even a human being may sometimes act as a quasi-mechanical agent whose behavior is rule-conducted (and thus computational). Thus, “utilitarian” rational choice agents of standard economic theory act according to specifiable recipes of behavior. The complexity of their behavior will be closer to the complicatedness of a jumbo-jet than to the self-organized complexity of a volcano. Similarly, a thermostat (which has an inbuilt program) has a higher degree of complexity than a hardware computer which has not yet been programmed with software. Against this background I propose the following hierarchy of complexity – from mere complicatedness to the features of autopoietic systems that constantly generate and regenerate their system-specific elements.

A. Complicated Systems, def.: Large systems with many distinct components whose aggregate behavior nonetheless (a) can be fully understood on the basis of the components of the system, (b) can be compressed into algorithms, and thus (c) can be predicted. Examples are a Boeing 747, crystals and other aggregates, or a system of rational choice agents.

B. Random Complex Systems, def.: Large systems with great variability of their elements whose behavior (a) cannot be fully understood with reference to the components of the system, (b) cannot be easily compressed into algorithms, and therefore (c) cannot be predicted. Examples are chaotic signals on a radar screen, geological plates, and road users who only occasionally obey the traffic rules.

C. Organized Complexity, def.: Large systems which combine great variability of their elements with an internal program that depends on external information. This program selectively constrains the array of possible movements. Like randomly complex systems, the behavior of organized complex systems (a) cannot be fully understood with reference to the components of the system and (b) cannot be easily compressed into algorithms, but (c) is in principle predictable (though perhaps not de facto). Examples are thermostats, wired computers, searching programs in radars, ectotherms, and highly adaptive students.

D. Self-Organized Complexity, def.: Large systems that combine great variability of their elements with an internal, autonomous program which constrains the array of possibilities and itself controls the system-world interactions. Like other complex systems, their behavior (a) cannot be fully understood with reference to their components, and (b) cannot be easily compressed into algorithms, but, by virtue of their internal self-organization, they (c) acquire a high degree of autonomy and control vis-a-vis their environments, and (d) are thus in principle predictable (though perhaps not de facto). Examples are volcanoes, endotherms, RNA hypercycles, and purely rule-conducted (“Kantian”) agents. (A computational sketch of this type follows after this list.)

E. Autopoietic Complexity, def.: Large systems that combine great variability of their elements with an internal, autonomous program which constrains the array of possibilities and yet itself produces new internal components and thus continuously triggers new system-world interactions. Like other complex systems, their behavior (a) cannot be fully understood with reference to their components, and (b) cannot be easily compressed into algorithms. Like self-organized systems, they (c) acquire a high degree of autonomy vis-a-vis their environments, but, by virtue of their self-productivity, they are (d) highly flexible, internally as well as externally, and (e) not even in principle predictable. Examples are RNA-DNA reproduction, immune systems, neural systems, language systems, and historically self-reflexive (“Hegelian”) agents.
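The step from organized to self-organized complexity (type D) can be made computationally vivid with Per Bak’s sandpile model of self-organized criticality (see note 3). Below is a minimal Python sketch; the grid size, number of grains, and toppling threshold are illustrative choices:

```python
import random

def sandpile(size=20, grains=5000, threshold=4):
    """Bak-Tang-Wiesenfeld sandpile: grains are dropped one at a time,
    and any cell reaching the threshold topples, shedding one grain to
    each of its four neighbors (grains fall off the edge of the grid)."""
    grid = [[0] * size for _ in range(size)]
    avalanche_sizes = []
    for _ in range(grains):
        i, j = random.randrange(size), random.randrange(size)
        grid[i][j] += 1
        topples = 0
        unstable = [(i, j)]
        while unstable:
            x, y = unstable.pop()
            if grid[x][y] < threshold:
                continue
            grid[x][y] -= threshold
            topples += 1
            if grid[x][y] >= threshold:  # may still be unstable
                unstable.append((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size:
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= threshold:
                        unstable.append((nx, ny))
        avalanche_sizes.append(topples)
    return avalanche_sizes

sizes = sandpile()
print("largest avalanche:", max(sizes))
print("share of drops causing no avalanche:", sizes.count(0) / len(sizes))
```

Without any tuning from outside, the pile settles into a critical state in which one added grain may cause no avalanche at all, or one that sweeps across the whole grid: great variability constrained by an internal, law-like principle.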

III. Hard and Soft Theories of the Emergence of Complexity

We have seen that complexity comes in different degrees. Accordingly, one could argue that we today have different approaches to the study of complex systems. Some refer to the difference between “weak” and “strong” programs in artificial life. Weak ALife argues that complexity studies only model biological processes; accordingly, computer models of, say, a world of “aintz” should not be termed “living” and should not be conflated with the real world of ants. By contrast, Strong ALife holds that a suitably wired computer is able to behave in ways that we could not distinguish from a film of a real-world creature (think of Karl Sims’s creatures); accordingly, they should also be deemed to be “alive”.[15] This distinction between weak and strong, however, revolves around a philosophical debate about the epistemological status of computer models. What is the relation between model and reality?

Parallel to the distinction between weak and strong, there are also various expectations about the explanatory capacities of complexity research. This difference revolves around the distinction between reductionist and post-reductionist programs. There is, first, a computational approach which believes that complexity emerges out of the aggregation of specified if-then movements of individual agents. A strong representative of this tradition is John Holland, who was one of the first to introduce the term “evolutionary algorithms” to the study of artificial life. According to this research program, the emergence and evolution of complex systems — if not in practice, then in principle — can indeed be reduced to computational relations between localizable elements and agents in their immediate environments: “we can reduce the behavior of the whole to the lawful behavior of its parts, if we take the non-linear interactions into account”.[16]
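Holland’s own models are considerably richer, but the flavor of such evolutionary algorithms can be conveyed by a minimal genetic-algorithm sketch. The bit-string target, population size, and mutation rate below are illustrative assumptions, not parameters taken from Holland’s work:

```python
import random

TARGET = [1] * 20  # a toy fitness peak: the all-ones bit string

def fitness(genome):
    """Count matching bits: a purely local scoring of one agent."""
    return sum(g == t for g, t in zip(genome, TARGET))

def evolve(pop_size=30, generations=60, mutation_rate=0.02):
    """Selection, one-point crossover, and point mutation on bit strings."""
    population = [[random.randint(0, 1) for _ in TARGET]
                  for _ in range(pop_size)]
    for gen in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            return gen, population[0]
        parents = population[: pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(TARGET))  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]                # point mutation
            children.append(child)
        population = children
    return generations, max(population, key=fitness)

gen, best = evolve()
print(f"best genome after {gen} generations has fitness {fitness(best)}")
```

The point of the exercise matches Holland’s claim: nothing but local, rule-governed operations on the parts, iterated over generations, is needed to produce the adapted whole.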

However, bearing in mind the distinction between model and reality, one could argue that even if computerized algorithms are, by definition, deterministic in nature, the highly specified if-then mechanisms do not exactly mirror natural states. Thus even within the computational approach to complexity studies there is an awareness that we might not be able to predict the particular pathways of complex systems that are being carved out in the phase space of the actual biosphere. The reason is that the functions of complex systems are often highly context-dependent, and that a successful adaptation to one environment might either lead to a long-term failure, or may constitute a helpful pre-adaptation to future environments. Against this background Stuart Kauffman, in his Investigations, has recently argued that we need to combine a computational approach to emergent phenomena with a sense of story-telling that reminds us of the contingency of the configurations that are actually explored within the much wider set of possible evolutions. “I do not think it is possible to finitely prestate all the context-dependent causal consequences of parts of creatures that might turn out to be useful in some weird environment and hence be selected”.[17] Thus even a hard computational theory of emergence will have to supplement computer models with stories about how things have actually evolved. One can here see a parallel to the controversial question within evolutionary biology of whether or not macro-evolution can be explained by the micro-evolutionary processes studied by molecular biology.

But there is also a much softer approach to complexity, which is conceptual in nature rather than computational. This is the tradition of systems theory, which has proven particularly helpful in ecology and in the social sciences, where networks of interdependencies are to be analyzed. The reason for thinking that complex systems are not always open to a computational approach is that the individual elements and/or agents cannot always be specified individually, since they vary in form and function. It is thus a matter of conceptual choice how one cuts up the world into theoretical entities and defines the elementary actors in, say, an ecological system. Is the atmosphere an agent, or a boundary condition? Is the group dynamics of a herd the primary agent, or the individual macroorganism (such as the cow), or the fauna of microorganisms (bacteria, etc.) on which the life of the cow depends? But the collective behaviors, too, are not always computational, because they depend on the unpredictable system-environment interactions, which vary according to environmental location. One example of this approach to complexity studies is Susan Oyama’s proposal that we can never study isolated entities, but only map the “constructivist interactions” between a multiplicity of entities, influences and environments.[18] The theory of autopoietic systems also stands in this constructivist tradition.

IV. Re-describing the World of Complexity Theologically

Now, what are the available theological options vis-a-vis this vast field of complexity studies? Let us first face what might be seen as a potential threat to theology. At first glance, the concepts of self-organization and autopoiesis may seem to make it difficult to speak about God’s transformative presence in the history of evolution. Complexity “can and will emerge ‘for free’ without any watchmaker tuning the world”, as phrased by both Stuart Kauffman and Per Bak. It is exactly the robustness of self-organizing principles that implies a high degree of independence from external conditions. Self-organizing systems are not in need of a specially designed tuning of the environment as one goes from one system to another. Here lies a significant difference from the so-called Anthropic Principle. According to this principle, the fundamental constants of nature (for example, the relation between matter and anti-matter in the universe, and the expansion rate of the universe) and the initial conditions of our universe (above all, its size) need to be delicately fine-tuned in order to create the material conditions for life. In this case, the hypothesis of a divine designer could be said to fit well with the fact that the physical conditions for life are exceptionally well attuned to one another. By comparison, there is no such extraordinariness about the dynamics of self-organizing systems.

For this reason, some religious thinkers (for example within the Intelligent Design movement) are concerned about the whole program of complexity research. The underlying argument is that since the concept of self-organization suggests that nature works on the basis of its own powers, God’s activity would automatically be ruled out of the cosmos. In this view, natural self-organization and divine action are seen as competing explanations.

However, with the majority of writers in the science-religion discussion, I find this contrastive thought pattern theologically mistaken. God (or, with Thomas Aquinas: the universal cause of all-that-is) should not be put in competition with the laws of nature. Rather, God is their creator. Taking this classical view one step further, we could say that God has created the world as self-productive, or autopoietic. This is the position that I have proposed elsewhere (in Zygon 1998 and 1999). This theological view is compatible both with the paradigm of evolution and with the paradigm of self-organizing complex systems. Remember that Darwin himself introduced a theological interpretation of natural self-development in the final sentences of his On the Origin of Species (from the second edition onwards):

There is grandeur in this view of life, with its several powers, having been originally breathed by the Creator into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

Translating this theological vision into the new sciences of complexity, we might say that God empowers the processes of diversification in the history of evolution by means of relatively simple laws that guarantee, “for free”, that increasingly complex orders are attained in the end. In the context of complex systems, it is not so much the delicate fine-tuning of the many parameters that is awe-inspiring, but rather the fact that a principle of grace seems to be built into the mathematical order of the universe.

On this interpretation, the classic notion of design is not necessarily ruled out. Indeed, one could combine a design-argument concerning the fundamental laws and constants of physics with an aesthetic awe vis-a-vis a world which seems to have been given free rein by a generous God. Re-describing the world of self-organized complexity theologically, God is ubiquitously present in the world as the wellspring of the ever unprecedented configurations of order. In this view, God creates by giving freedom – gratuitously. As such, God is not only a remote, a-cosmic designer of a world which is supposedly conceived and crafted as a harmonious whole. The transcendence of God is also revealed in the ever surprising richness of pattern formation throughout the history of evolution. At this point, there are surprising similarities with the Biblical idea of God as the creative Logos of the world (John 1) which manifests itself in the logos-structures within the world. A theological interpretation along this line involves a re-appropriation of the standard notion of ‘divine design’. A generalized notion of an a-temporal ‘meta-design’ (concerning the fine-tuning of the coincidences of the anthropic principle) can be supplemented with a theological appreciation of the freedom allotted by God to the multifarious ways of self-organizing nature. The many ways of evolution and co-evolution are given free rein for self-development, yet they remain both facilitated and tempered by the coordinated phase space of the laws of physics and the general properties of matter.

The metaphysical and theological implications of complexity research are further explored in a forthcoming book, which features articles by Paul Davies, Stuart Kauffman, Ian Stewart, Gregory Chaitin, Charles Bennett, Werner Loewenstein, William Dembski, Harold Morowitz, Arthur Peacocke, and Niels Henrik Gregersen: From Complexity to Life: On the Emergence of Life and Meaning, ed. Niels Henrik Gregersen, New York: Oxford University Press 2002 (March).

 Notes:

1 See, respectively, Stuart Kauffman, Investigations, New York: Oxford University Press 2000, and Julie Wakefield, “Complexity’s Business Model”, Scientific American 284, January 2001, 31-34.

2 See Ian Stewart, “The Fourth Law of Thermodynamics”, in Niels Henrik Gregersen ed., From Complexity to Life: On the Emergence of Life and Meaning, New York: Oxford University Press (forthcoming, Spring 2002).

3 On the general nature of complexity studies, see e.g. Per Bak, How Nature Works. The Science of Self-Organized Criticality, Oxford: Oxford University Press 1997, 9ff, cf. 171.

4 See, for example, Claus Emmeche, The Garden in the Machine: The Emerging Science of Artificial Life, Princeton: Princeton University Press 1994; Ricard Solé and Brian Goodwin, Signs of Life. How Complexity Pervades Biology, New York: BasicBooks 2000.

5 Early but fine introductions to the field are Mitchell Waldrop, Complexity. The Emerging Science at the Edge of Order and Chaos, New York: Touchstone 1995; Peter Coveney and R. Highfield, Frontiers of Complexity. The Search for Order in a Chaotic World, New York: Ballantine Books 1995.

6 Jiri Zemann, “Energie”, Europäische Enzyklopädie zu Philosophie und Wissenschaften, Hamburg: Felix Meiner 1990, vol. 1, 694-696, here 695: “The concept of matter has three main aspects, which are inseparable from one another and yet relatively independent: the substantial (relating to the substrate), the energetic (relating to motion), and the informational (relating to structure and organization).”

7 Niklas Luhmann: A Sociological Theory of Law, London: Routledge, 1985, 25.

8 Niklas Luhmann: Soziale Systeme. Grundriss einer allgemeinen Theorie, Frankfurt a. M.: Suhrkamp 1984, 47.

9 Cf. Claus Emmeche, “Aspects of Complexity in Life and Science”, Philosophica 60, 1997, 901-929, 903.

10 In his interesting book Complexity and Postmodernism. Understanding Complex Systems, London and New York: Routledge 1998, Paul Cilliers perhaps over-emphasizes the postmodern aspects of complexity studies.

11 Ricard Solé and Brian Goodwin, Signs of Life. How Complexity Pervades Biology, New York: BasicBooks 2000, 27f.

12 C.H. Bennett, “Logical depth and physical complexity”, in R. Herken (ed.), The Universal Turing Machine. A Half-Century Survey, Oxford: Oxford University Press 1988, 227-257.

13 Paul Davies, The Fifth Miracle. The Search for the Origin and Meaning of Life, New York: Simon and Schuster 1999, 120.

14 Finely expressed by Paul Cilliers (1998, 91): “The structure of the system is not the result of an a priori design, nor is it determined directly by external conditions. It is a result of interaction between the system and its environment”.

15 Peter Coveney and Roger Highfield, Frontiers of Complexity (note 5), 239f.

16 John H. Holland, Emergence: From Chaos to Order, Oxford: Oxford University Press 1998, 122.

17 Stuart Kauffman, Investigations, New York: Oxford University Press 2000.

18 Susan Oyama, Evolution’s Eye. A Systems View of the Biology-Culture Divide, Durham and London: Duke University Press 2000, 3-7.