EMERGENCE.
. . . we need an account of the material world in which it isn't absurd to claim that it produced us.
-TRANSLATED AND REPHRASED BY THE AUTHOR FROM ILYA PRIGOGINE AND ISABELLE STENGERS, 1979.
NOVELTY.
Whereas the formation of the first stars, the formation of the nuclei of the first heavier-than-hydrogen elements, the coalescence of atoms into the first molecules, and so on, have each had a profound effect on the interactions and distributions of substances in the universe, a common set of physical equations is able to give a reasonably complete description of the properties that resulted. The same cannot be said of the transitions that led to life and to mind. Though the laws of physics and chemistry were not violated or superseded in these transitions, their appearance led to a fundamental reorganization in how these properties were expressed and distributed in the world and how they were organized with respect to each other. The reliable reversal of typical thermodynamic tendencies, and the insinuation of informational relationships into the causal dynamics of the world, make the transitions from inorganic chemistry to life, and from mechanism to thought, seem like radical contraventions of causality-as-usual. So, although I am convinced that no tinkering with basic physics is required to make sense of these phenomena, we still need to account for the radical about-face in how these principles apply to everyday ententional phenomena. At some point, the reliable one-dimensional lockstep of just one thing after another that had exclusively characterized physical events throughout the universe took an abrupt turn and headed the other way.
The appearance of the first particles, the first atoms, the first stars, the first planets, and so on, marked fundamental new epochs in the 13-billion-year history of the universe, yet none of these cosmic transitions contorted the causal fabric of things as radically as did the appearance of life or of mind. Even though these living transitions took place on an insignificant scale compared to other cosmic transitions, and even though no new kind of matter or energy came into existence with them, what they lack in scale and cosmic effect they make up in their organizational divergence from the universal norm. Consider the following:
* There were no ententional properties in the universe for most of its 13-billion-year history (i.e., before significant amounts of heavier elements were produced by dying stars).
* There was nothing resembling a function on the Earth until just over 3 billion years ago, when life first appeared.
* There was no hint of mental awareness on Earth until just a few hundred million years ago, when animals with brains first evolved.
* There was nothing that was considered right or wrong, valuable or worthless, good or evil on our planet until the first human ancestors began thinking in symbols.
All these innovative ways of organizing matter and energy, producing unique forms of influence over the events of the world, popped into existence from antecedent forms of organization that entirely lacked such properties. Physics and chemistry continued as they had before, but radical and unprecedented changes in the way materials and events could become organized followed these transitions wherever and whenever they occurred.
Such major transitions in the organization of things are often described as emergent, because they have the appearance of spontaneous novelty, as though they are poking their noses into our world from out of a cave of non-existence. And while they are not exactly something coming from nothing, they have a quality of unprecedented discontinuity about them-an almost magical aspect, like a rabbit pulled from an apparently empty hat.
In the way the term is often used, there is a close kinship between the concept of emergence and ideas of novelty and newness, as well as an implication that predictability is somehow thwarted. Although I hope to show how these are misleading attributions, this view is both common and attractive. For example, one of the more widely read books on emergence, written by the eminent physical chemist Harold Morowitz, itemizes and describes over twenty "emergences" in the history of the cosmos, including everything from the formation of stars to the appearance of language. At each of these transitions in the history of the cosmos and of the Earth, new organizations of matter appeared that were not present previously, at least at those locations. From this perspective, the transition that separates living processes from other physical-chemical processes is only one among many emergent transitions, such as the formation of the first stars or the production of heavy elements in dying stars.
But just being the first instance of something, or being novel or unpredictable, is not a particularly helpful distinction. Newness is in one sense the very nature of all physical change. However, a consideration of the sorts of transitions that characterize emergences for Morowitz indicates that there is a hierarchic aspect to this conception of emergence. Each transition involves the formation of a higher-order structure or process out of the interrelationships of smaller, simpler components. Emergence in this sense involves the formation of novel, higher-order, composite phenomena with coherence and autonomy at this larger scale.
What about predictability? Being unpredictable, even in some ultimate sense, is only a claim about the limits of representation-or of human intellect. Even if certain phenomena are "in principle" unpredictable, unexplainable, or unknowable, this doesn't necessarily imply a causal discontinuity in how the world works. There may be a determinate path from past to future, even to a radically divergent form of future organization, even if this causal influence is beyond precise representation. Indeed, all representations of worldly events are simplifications, so we should expect to find many physical transitions whose basis we cannot fully represent. And it is often the case that what was once beyond prediction becomes more tractable with better representational tools and more precise measuring instruments. The history of science has provided many examples of what once were apparently mysterious phenomena, assumed to be intrinsically intractable, that eventually succumbed to unproblematic scientific explanation. Such was the case with the once mysterious forces constraining the possible transmutations of substances, as explored by alchemists, which were eventually explained by chemistry, and to an even greater depth by quantum physics.
Without question, phenomena such as life and mind owe some of their mysterious character to limitations in our present state of science. I'm confident that these limitations of our current theoretical tools can be overcome and that these phenomena can also become integrated into the larger fabric of natural science. The question is whether in accomplishing this, their distinctive ententional characteristics (function, representation, end-directedness, self, and so on) will be explained rather than merely explained away.
What is interesting and challenging about ententional phenomena is that they appear to involve a global reorganization of their component dynamical interactions and interdependencies that only makes sense with respect to non-intrinsic relationships. So proving ultimate unpredictability isn't critical, nor do we need to demonstrate a kind of radical discontinuity of causal influence. But we do need to explain how such phenomena can exhibit causal organization that is (superficially at least) the inverse of the pattern of causal organization which is otherwise ubiquitously present throughout the inanimate world. Instead of postulating discontinuous jumps, in which novel physical properties appear like rabbits out of empty hats as we cross certain thresholds of compositionality, we might better focus on making sense of the apparent causal reversals that motivate these special accounts. As in a magic act, there can be subtle effects that are difficult to detect, and there can be cognitive biases that cause us to look in the wrong place at the wrong time, thus missing the important features. We just need to know what to look for, and what we can ignore.
THE EVOLUTION OF EMERGENCE.
The concept of emergence is fairly new in the history of science. This is because it was originally formulated as a contrast to a causal paradigm that only reached its full elaboration in the nineteenth century. By the mid-nineteenth century, a more thoroughly mechanistic and statistical approach to natural philosophy had begun to coalesce and displace the previously more teleological and Platonic forms of explanation. At midcentury, electricity and magnetism were being tamed, the general concept of energy was becoming formalized, thermodynamic principles were being derived from micro-Newtonian dynamics, and alchemy was being replaced by an atomic theory of chemical interactions. Even the complex logic of organism design and adaptation appeared subject to entirely material processes, as outlined in Charles Darwin's theory of natural selection. Herbert Spencer had even suggested that these same principles might be applicable to human psychology and social processes. Mystical powers, intangible forces, immaterial essences, and divine intervention in the goings-on of nature were seen as prescientific anachronisms. Nature's designs, including living and mental processes, were now viewed through the lens of materialism: reducible to matter in motion. Antecedent teleology and metaphysical claims concerning the directionality of evolutionary change were no longer legitimate scientific assumptions.
This posed some troubling questions. Given their superficially inverted form of causal influence-with current processes structured with respect to future possibilities-how can the teleological appearance of living and mental processes be accounted for in these same terms? And if these ententional properties are not already prefigured in the inorganic world, how can their novel features be derived from those non-ententional processes alone? In response to these challenges, a number of scientists and philosophers of science realized the necessity of reconciling the logic of physical science with the logic of living and mental teleology. A true reconciliation would need to accept both the unity of material and living/mental processes and the radical differences in their causal organization. Investigators could neither accept ententional properties as foundational nor deny their reality, despite this apparent incompatibility. The key concept that came to characterize an intermediate position was that of emergence.
This use of the term was introduced by the English philosopher and critic George Henry Lewes, in his Problems of Life and Mind (1874-79), where he struggles with the problem of making scientific sense of living and mental processes. He defines emergence theory as follows:

Every resultant is either a sum or a difference of the co-operant forces; their sum, when their directions are the same-their difference, when their directions are contrary. Further, every resultant is clearly traceable in its components, because these are homogeneous and commensurable. It is otherwise with emergents, when, instead of adding measurable motion to measurable motion, or things of one kind to other individuals of their kind, there is a co-operation of things of unlike kinds. The emergent is unlike its components insofar as these are incommensurable, and it cannot be reduced to their sum or their difference.2

Lewes appears to have been influenced by John Stuart Mill's effort to unite logic with the methodologies of such new sciences as chemistry and biology. Mill was particularly struck by the discontinuities of properties that can be produced by combinatorial interactions such as chemical reactions. Thus two toxic and dangerous substances, chlorine gas and sodium metal, when combined, produce common table salt-which is both ubiquitous and an essential nutrient in the living world. The chemical reaction that links these two elements neutralizes their highly reactive natures and in their place yields very different ionic properties on which all life depends. Mill viewed this radical change in properties due to chemical combination as analogous to the combinatorial logic that produced life from mere chemistry. He argues in A System of Logic that:

All organised bodies are composed of parts, similar to those composing inorganic nature, and which have even themselves existed in an inorganic state; but the phenomena of life, which result from the juxtaposition of those parts in a certain manner, bear no analogy to any of the effects which would be produced by the action of the component substances considered as mere physical agents. To whatever degree we might imagine our knowledge of the properties of the several ingredients of a living body to be extended and perfected, it is certain that no mere summing up of the separate actions of those elements will ever amount to the action of the living body itself.3

Over the course of the previous centuries, the idea that living organisms might be machinelike and constructed of the same basic chemicals as inorganic objects had gained credence, but an account of how organism structure and function arose was still mysterious. Before Darwin, their exquisite construction was generally assumed to be the work of a "divine intelligence." But even centuries before Darwin, the successes of materialist science had convinced many that a spontaneous mechanistic account had to be possible. This notion was given added support by the success of the empiricist theory of mind described by John Locke in the late 1600s. Locke's account of the gradual and spontaneous development of complex ideas out of the association of sense data acquired via interaction with the world provided an atomistic notion of knowledge as the compound resulting from the associative bonds between these "atoms" of experience.
That the complex interrelations of ideas could be the result of the environmental impressions on a receptive medium suggested to Locke that a material analogue might also be conceivable. In the century and a half that followed, thinkers like Erasmus Darwin (Charles' grandfather) and later Jean-Baptiste de Lamarck struggled to articulate how an analogous process might also explain the way that the adaptively associated combinations of parts which constituted plant and animal bodies might have similarly arisen. As Locke had conceived of mind forming from an unformed, merely impressionable beginning, couldn't the association of living processes arise from the living equivalent of a blank slate? If such functional associations could arise by spontaneous mechanism, then antecedent design would be redundant.
This turned the classic "Chain of Being" logic on its head. As Gregory Bateson has described it:

Before Lamarck, the organic world, the living world, was believed to be hierarchic in structure, with Mind at the top. The chain, or ladder, went down through the angels, through men, through the apes, down to the infusoria or protozoa, and below that to the plants and stones. What Lamarck did was to turn that chain upside down. When he turned the ladder upside down, what had been the explanation, namely: the Mind at the top, now became that which had to be explained.4

Previously, starting with the assumption of the infinite intelligence of a designer God, organisms-including those capable of flexible intelligent behavior-could be seen as progressive subtractions and simplifications from Godlike perfection. In the "Great Chain of Being," mental phenomena were primary. The mind of God was the engine of creation, the designer of living forms, and the ultimate source of value. Mind was not in need of explanation. It was a given. The living world was derived from it and originally folded within it, preformed, as a definite potential. The evolutionary reconception proposed by the elder and younger Darwins, Lamarck, and others therefore posed a much more counterintuitive possibility. Mind and its teleological mode of interacting with the world could be described as an end product-not an initiating cause-of the history of life. Lamarck's vision of evolution was in this sense ultimately emergent. He conceived of life in purely materialistic terms, and his evolutionary argument proposed a means by which mind might have emerged in a continuous process from otherwise mindless processes. Mill's analysis opened the door to thinking of this association process as potentially resulting in something quite unlike the "atoms" (whether experiential or chemical) from which the process began.
The event that probably played the key role in precipitating the articulation of an emergentist approach to life and mind was the publication in 1859 of Darwin's On the Origin of Species. Although Darwin was not particularly interested in the more metaphysical sorts of questions surrounding the origins of mind, and did not think of his theory in emergence terms, he was intent on addressing the teleological issue. He, like Lamarck, was seeking a mechanistic solution to the problem of the apparent functional design of organisms. But where Lamarck assumed that the active role of an organism striving to adapt to its world was necessary to acquire "instruction" from the environment-a cryptic homunculus-Darwin's theory of natural selection required no such assumption. He reasoned that the same consequence could be reached due to the differential reproduction of blindly generated variant forms of organisms in competition for limited environmental resources. This eliminated even the tacit teleological assumption of a goal-seeking organism. Even this attribute could be achieved mechanistically. Thus, it appeared that teleology could be dispensed with altogether.
The theory of natural selection is not exactly a mechanistic theory, however. It can best be described as a form of statistical inference that is largely agnostic about the mechanisms it depends on. As is well known, Darwin didn't understand the mechanism of heredity or the mechanisms of reproduction. He didn't have a way to explain how forms are produced during development. And he didn't have an account of the origins of spontaneous variations of organism form. But he didn't need them. As a keen observer of nature, he saw the regular consequences of these mechanisms (e.g., inheritance of traits, competition for the resources needed to develop and reproduce, and individual variation) and drew the inevitable statistical implications. Fittedness of organisms to their environment was a logical consequence of these conditions.
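To make the statistical character of this inference concrete, here is a minimal toy simulation (my own illustration, not Darwin's formalism; the population size, target value, and mutation scale are arbitrary assumptions). Blind variation plus fitness-weighted reproduction under a fixed resource cap is enough to move a population's mean trait toward its "environmental" optimum, with no goal-seeking anywhere in the mechanism.

```python
import random

# Toy sketch of selection as statistical inference (illustrative parameters).
POP_SIZE = 200     # resource cap: population size held constant
TARGET = 0.8       # the "environmental demand" a trait can fit
MUT_SD = 0.05      # standard deviation of blind offspring variation

def fitness(trait):
    # Closer to TARGET means better fitted; no foresight is involved.
    return 1.0 - abs(trait - TARGET)

population = [random.random() for _ in range(POP_SIZE)]
for generation in range(50):
    weights = [fitness(t) for t in population]
    # Reproduction is fitness-weighted under the fixed population cap.
    parents = random.choices(population, weights=weights, k=POP_SIZE)
    # Offspring vary blindly around the parental trait value.
    population = [min(1.0, max(0.0, p + random.gauss(0, MUT_SD)))
                  for p in parents]

# The mean drifts toward TARGET purely from variation plus differential
# reproduction: a statistical consequence, not a mechanism with a goal.
print(f"mean trait after selection: {sum(population) / POP_SIZE:.2f}")
```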
What Galileo and Newton had done for physics, Lavoisier and Mendeleyev had done for chemistry (and alchemy), and Carnot and Clausius had done for heat, Darwin had now done for the functional design of organisms. Functional design could be subsumed under a lawlike spontaneous process, without need of spirits, miracles, or extrinsic guidance.
Of course, there were many (including Alfred Russel Wallace, the co-discoverer of natural selection) who felt that the argument could not be extended to the domain of mental agency, at least as it is implicit in human experience. Even though Wallace agreed that the teleological influence of a divine intelligence was unnecessary to explain organism design in general, there were features of human intelligence that seemed to be discordantly different from what would have been merely advantageous for survival and reproduction. Despite these misgivings, as the nineteenth century waned and the twentieth century began, it became easier to accept the possibility that the mechanistic view of life could also be imagined to hold for mind. Teleology appeared to be tameable, so long as it was evolvable. If functional design could arise as an aftereffect of accidental variation, reproduction, and resource competition, then why not mental function as well?
Yet it wasn't quite this simple. Mental processes are defined in teleological terms. Although the evolutionary paradigm offered a powerful unifying framework that promised to answer vast numbers of puzzling questions about biological function and design, it also drew attention to a previously irrelevant question.
If a teleological account of organism design is actually superfluous, then couldn't it also be superfluous to an account of the teleological features of thought as well? To put it more enigmatically, couldn't Darwinian logic allow science to dispense with teleological accounts at all levels and in all processes? This is of course the motivation behind the various "golem" arguments we critiqued in chapter 3. If something as complex as the fitted functional organization of a body and brain can be generated without the assistance of teleology, then why should we assume that complex adaptive behavior, including even human cognition, requires a teleological account to make sense of it? The rise of emergentism in the late nineteenth and early twentieth century can be seen as an effort to rescue teleological phenomena from this ultimate elimination by conceiving of them as derived from a sort of cosmic evolution. With the Great Chain of Being inverted, teleology must be constructed.
REDUCTIONISM.
Emergentism was also a response to another achievement of nineteenth-century science: a new methodology for analyzing nature known as reductionism. If complicated phenomena can be analyzed into component parts, and the properties of the parts analyzed separately, then it is often possible to understand the properties of the whole in terms of the properties and the interactions of these parts. This approach was highly successful. The atomic theory of chemistry had made it possible to understand the properties of different substances and their interconvertibility in terms of combinatorial interactions between more basic elemental units, atoms, and molecules. The understanding of heat and its convertibility into motive power was explained in terms of the movements of molecules. Even organisms could be understood in terms of cellular interactions. Reductionistic analysis was a natural extension of the atomism that began with ancient scholars like Empedocles and Democritus. Atomism was just the extreme form of a tried-and-true explanatory strategy: break complex problems into simpler parts, then if necessary break these into even smaller parts, and so on, until you can go no further or encounter ultimately simple and indivisible parts. Such indivisible parts-called atoms (literally, "not cut-able")-should, by assumption, exhibit less complicated properties and thereby reduce complicated collective properties to combinations of simpler properties. Of course, the quest to find nature's smallest units has led to the discovery that the atoms of matter are not indivisible, and even to the realization that the particles that constitute them may be further divisible as well.
All macroscopic objects are indeed composed of smaller components, and these of yet smaller components. So the assumption that the properties of any material object can be understood in terms of the properties of its component parts is quite reasonable. Thomas Hobbes, for example, argued that all phenomena, including human activity, could ultimately be reduced to bodies in motion and their interactions. This was an assumption that birthed most of modern science and set it on a quest to dissect the world, and to favor explanations framed at the lowest possible level of scale.
Reflecting on this assumption that smaller is more fundamental, the Canadian philosopher Robert Wilson dubbed it "smallism."5 It is not obvious, however, that things do get simpler with descent in scale, or that there is some ultimate smallest unit of matter, rather than merely a level of scale below which it is not possible to discern differences. Nevertheless, it is often the case that it is possible to cleanly distinguish the contributions of component objects from their interactions in explaining the properties of composite entities. Unfortunately, there are many cases where this neat segregation of objects and relationships is not possible. This does not mean that complex things lack discernible parts, only that what exactly constitutes a part is not always clear. Still, phenomena susceptible of simple decomposition led to many of the greatest success stories of Western science.
By the middle of the nineteenth century, it was becoming obvious that the chemistry of life was continuous with the chemistry that applied to all matter. Living metabolism is in this sense just a special case of linked chemical reactions. The discovery of the structure of DNA at the middle of the twentieth century marked the culmination of a century-long effort to identify something like the philosopher's stone of living processes, and was widely heralded as the "secret of life." In a parallel science, it was becoming clear that the smallest units of chemical reactions-atoms-were themselves composed of even smaller components-electrons, protons, and neutrons-and these were eventually found to be further dissectible. The study of brain function likewise progressed from bumps on skulls to the ultrastructure of neurons and synapses. Contemporary neuroscience can boast a remarkably detailed and nearly complete theory of synaptic function, and only a slightly less complete understanding of neurons. In all these sciences, the push to understand the properties of ever smaller and presumably more basic components has led to a profoundly thorough map of the microverse of elementary physical particles, and yet has made comparatively less dramatic progress toward a theory of ordinary-scale compositional dynamics and systemic functions, especially when it comes to ententional phenomena.
A critical shortcoming of methodological smallism, despite its obvious successes, is that it implicitly focuses attention away from the contributions of interaction complexity. This gives the false impression that investigating the organizational features of things is less informative than investigating component properties. This bias has only quite recently been counterbalanced by intense efforts to study the properties of systems in their own right. The sciences of the twenty-first century have at last begun to turn their attention away from micro properties and toward problems of complex interaction dynamics. As the astrophysicist Stephen Hawking has said: "I think the next century will be the century of complexity."6
THE EMERGENTISTS.
In many ways, the first hints of a scientific reaction to the seductive influence of reductionism, and its minimization of the special ententional characteristics of life and mind, date to a group of late nineteenth- and early twentieth-century philosophers of science. They struggled to articulate an alternative middle path between reductionism and vitalism, believing that it must be possible to avoid the reductionist tendency to explain away the special features of ententional phenomena, and yet also avoid invoking enigmatic non-material causes or special forms of substance to account for them. It took some decades for the subtle implications and complications of this emergence perspective to become sufficiently explored to expose both its promise and its weaknesses. Indeed, its promise has yet to be fully realized. The first systematic efforts to explore these implications are associated with a small circle of theorists who came to be known as "the British emergentists." Perhaps the most prominent were Samuel Alexander, C. D. Broad, and Conwy Lloyd Morgan, each of whom was influenced by the ideas of Lewes and Mill.
Mill argued that the principles governing the properties of higher-level entities are often quite distinct and unrelated to those of the components that constitute them. This is particularly the case with organisms, about which he says: "To whatever degree we might imagine our knowledge of the properties of the several ingredients of a living body to be extended and perfected, it is certain that no mere summing up of the separate actions of those elements will ever amount to the action of the living body itself."7 By "summing up," Mill is invoking an analogy to Newtonian dynamics, where different force vectors literally sum their component dimensional values. In contrast, he draws attention to chemical composition, where the properties of elements like sodium or chlorine gas are quite distinct from those of simple table salt that is composed by the ionic bonding of these two elements. His notion of levels is based on this distinction. Within a level, he says, interactions are compositional and summative, while adjacent levels are distinguished by the appearance of radically different properties resulting from a non-summative interaction among components. Although subsequent emergence theorists differ in the way levels are defined and how properties are related to one another across levels, this conception of higher-level properties distinguished from lower-level properties by virtue of non-additive forms of interaction has been a consistent characteristic of the concept of emergence invoked by all subsequent theorists-a discontinuity of properties despite compositional continuity.
Mill distinguished laws for interactions taking place within a level from those determining relationships across levels, by calling the former homopathic laws and the latter heteropathic laws. Homopathic laws were expected to have an additive character, producing highly predictable patterns of causal interactions. Heteropathic laws, however, were presumed to be somewhat idiosyncratic bridging laws, linking quite different classes of homopathic properties across levels. Thus, the advent of organisms and minds involved the appearance of novel causal laws but not any new laws of chemistry or physics. These higher-level properties were not discontinuous from lower-level properties, just not predictable using the lower-level laws alone.
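A toy numerical contrast may help fix Mill's distinction (my illustration; the vectors and the interaction term are invented, not Mill's examples). A homopathic resultant is literally the sum of its homogeneous components, whereas a heteropathic composite includes an interaction term that no summing of separately measured contributions can recover:

```python
# Toy contrast between Mill's two kinds of composition (arbitrary numbers).

def resultant_force(forces):
    """'Homopathic' composition: the resultant is literally the vector
    sum of homogeneous, commensurable components."""
    return tuple(sum(axis) for axis in zip(*forces))

def composite_property(a, b):
    """'Heteropathic' composition (toy): a nonlinear interaction term
    dominates, so the composite value cannot be recovered from any
    weighted sum of the components measured in isolation."""
    return a + b + 10.0 * a * b

print(resultant_force([(3.0, 0.0), (0.0, 4.0), (-1.0, 1.0)]))  # (2.0, 5.0)

print(composite_property(1.0, 0.0))  # 1.0: each component alone is tame...
print(composite_property(0.0, 1.0))  # 1.0
print(composite_property(1.0, 1.0))  # 12.0: ...their combination is not 2.0
```

The only point of the interaction term is that it is invisible to any component-by-component accounting, which is the formal shadow of Mill's salt example.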
Though written before the appearance of Darwin's On the Origin of Species, Mill's argument for the appearance of unprecedented heteropathic laws, which were due to special combinatorial processes occurring at a lower level, left open the possibility that the evolution of novel functions might not merely involve simple mechanical combination. Mill's analysis only went so far as to provide a framework for dealing with the apparent discontinuities between causal laws and properties at different compositional levels; but the general logic of this approach made it reasonable to consider the possibility of the evolution of progressively more complex causal domains from simple physics to social psychology, an idea that Herbert Spencer was later to advocate (though largely remaining within a mechanistic paradigm).
An important bridge between evolutionism and emergentism was provided by the British comparative psychologist and evolutionary theorist Conwy Lloyd Morgan. In the 1890s, he co-discovered an evolutionary mechanism (independently with James Mark Baldwin, for whom the effect is now named) by which strict natural selection could produce Lamarckian-like effects. He argued that behavioral (and thus goal-driven) responses to environmental challenges, including those that could be learned, might become assimilated into inherited adaptations in future generations because of the way they would alter the conditions of natural selection. Where the strict neo-Darwinians of the period argued that functionality could only arise in a post hoc fashion from blind chance variations,8 Morgan suggested that prior functional and even goal-directed behavior (whether or not arising post hoc in evolution) could create conditions whereby these tendencies were more likely to become innate within a lineage. It could do so because it would partially spare those lineages able to acquire this adaptation experientially; and if by chance members of such a spared lineage inherited a slightly more efficient means to acquire this adaptation, this variant would likely replace any less efficient means.
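The logic of this sparing-then-replacement dynamic can be seen in a toy simulation (my illustration; the fitness values, mutation rate, and population size are invented parameters, not Morgan's or Baldwin's). Learners persist where non-learners die out, and once a chance innate variant of the same response arises, it spreads because it skips the cost of learning:

```python
import random

# Toy Baldwin-effect sketch (all parameters are illustrative inventions).
POP_SIZE, GENERATIONS, MUTATION_RATE = 500, 300, 0.001

def fitness(individual):
    innate, plastic = individual
    if innate:
        return 1.0   # response available at no cost
    if plastic:
        return 0.7   # response acquired by costly learning
    return 0.1       # response unavailable: lineage rarely persists

# Start with no innate variants; half the population can learn the response.
population = [(False, random.random() < 0.5) for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    weights = [fitness(ind) for ind in population]
    parents = random.choices(population, weights=weights, k=POP_SIZE)
    # Each gene occasionally flips, blindly, in the offspring.
    population = [tuple((not gene) if random.random() < MUTATION_RATE
                        else gene for gene in parent)
                  for parent in parents]

innate_frequency = sum(1 for innate, _ in population if innate) / POP_SIZE
print(f"innate response frequency: {innate_frequency:.2f}")
# Typically rises toward 1.0: learning first spares the lineage, then
# selection favors variants for which the response is inherited outright.
```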
In more general terms, this scenario implied that evolutionary change might not be limited to the generation of merely incremental adjustments of past mechanisms, but could at times generate truly unprecedented transitions that took advantage of what Mill would have called heteropathic effects at the organism level. There is no evidence that Morgan developed this idea following Mill's notion, but decades later Morgan enlarged upon the idea in a book titled Emergent Evolution,9 where he argued that the evolutionary process regularly produced emergent properties. Unfortunately, partly because of the theological implications he drew from this insight, his prescient anticipation of the challenge of explaining major evolutionary transitions was not widely influential. The problem posed to natural selection theory by such major evolutionary transitions as the origins of multicellular organisms was not recognized again until the late 1980s and 1990s,10 and remains a topic of debate.
In many respects, the motivation behind the emergence theories of the early twentieth century was to bridge an explanatory gap between the physical sciences and the so-called special sciences of psychology and the social sciences (to which I would add biology). As we have seen, the critical fault line between these sciences can be traced to their approach to ententional processes and properties. The natural sciences must exclude ententional explanations, whereas the so-called special sciences cannot. Biology is in the awkward position of at the same time excluding ententional accounts and yet requiring ententional properties such as representation and function. The question at issue was whether some new kind of causality was at work in the phenomena considered by these special sciences (i.e., the life sciences), or whether their apparently unusual forms of causal relationships could be entirely reduced to forms of causal relationship that are found in the physical sciences more generally.
Two prominent British philosophers brought the topic of emergence into mainstream discourse during the first part of the twentieth century. Samuel Alexander and C. D. Broad each had slightly different ideas about how to frame the emergence concept. Broad's view was close to that of Mill in arguing that emergent properties changed the causal landscape by introducing properties that were fundamentally discontinuous from any that characterized the component interactions from which they emerged. Alexander was closer to Lewes' argument, proposing that the principal characteristic of emergent properties is an intrinsic inability to predict these properties from the properties and laws at the lower level. This distinction in approach to the characterization of emergent phenomena has been made more explicit over the intervening decades and is generally identified with ontological versus epistemological conceptions of emergence, respectively. Despite this difference in characterizing the nature of the discontinuity and novelty of higher-level emergent phenomena, both men echoed the common theme that low-level component mechanistic processes could be the ground for non-mechanistic emergent properties at a higher level.
Broad, following closely on Mill's ideas, conceived of processes at different levels as obeying distinct level-specific laws that, although incompatible from level to level, were nevertheless related to one another by bridging laws (which he calls "trans-ordinal laws"). Broad further claimed that it is, in principle, impossible to deduce the higher-level properties even from complete knowledge of lower-level laws. Only after observing an actual instance of the emergence of a higher-level process can we retrospectively reconstruct the lawful relationships that hold between these levels. In other words, even given a complete description of all the intrinsic properties of the parts plus their arrangements and interactions, one could not predict certain properties of the whole higher-level phenomenon.11 Invoking predictability is tricky in this account, since it is not clear whether Broad means that this is primarily a problem of not being able to extrapolate from knowledge of these relationships, or something stronger: there not being any definite physical determination of the higher- from the lower-level laws.
Broad clarifies this somewhat by arguing that this unpredictability is the result of the ultimate irreducibility of the higher-level phenomena to properties of their lower-level components. For him, the unpredictability is a symptom of a deep incompatibility of causal properties distinguishing levels. By this he must mean that the trans-ordinal laws do not directly map the higher-level properties onto more basic level properties, but only describe and itemize the lower-level conditions from which the higher-level property emerged in the past and will likely do so in the future. The incompatibility between levels, and the non-deducibility it entails, are for Broad evidence that there may be no fixed and definite unity to the fabric of causal laws.
Alexander, who was influenced by Morgan, also argued that higher-level emergent entities and their properties cannot be predicted from component properties and interactions at a lower level.12 But for him this is not because of some metaphysical incompatibility of the entities and laws at different levels. It is the result of problems intrinsic to the way this transitional relation must be represented. We are forced to recognize emergent transitions merely as matters of "brute empirical fact," which cannot be anticipated or subsumed under a determinate explanation because of our limitations, not any fundamental incompatibility. He admits that a Laplacian demon with complete knowledge of the universe's laws and its state prior to life could in fact predict all future distributions of matter and energy in full detail, but that it would still fail to be able to describe such qualities as life or consciousness. In this respect, Alexander is adamant that higher levels exhibit fundamentally new qualities that are not predictable even from a full knowledge of all relevant lower-level facts. Living and mental processes are fundamentally new, created afresh, and not merely the "resultant" of chemical and neural interactions. But this doesn't mean that they aren't determined from the previous state of things, only that the logic of this determination is somehow intrinsically beyond computation. Here Alexander's proposal parallels contemporary theories based on notions of determinate chaos (see below), which argue that the non-linear features of certain composite systems make prediction of their future behaviors incalculable beyond a short period. But this fact could justify his claim that despite emerging from lower-level roots, emergent laws could be special to the higher level.
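A minimal sketch of this predictability failure uses the textbook logistic map (my choice of example, not Alexander's). The update rule is fully deterministic, yet two trajectories launched from states differing by one part in a million become uncorrelated within a few dozen steps, so any finite-precision representation loses predictive grip almost immediately:

```python
# Deterministic chaos in the logistic map: determinism without
# predictability beyond a short horizon.

r = 3.9                        # parameter in the map's chaotic regime
x, y = 0.500000, 0.500001      # initial states differing by one in 10^6

for step in range(1, 41):
    x = r * x * (1 - x)        # the same deterministic rule for both
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.6f}")
# The gap grows from 1e-6 to order 1 within a few dozen iterations.
```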
Unfortunately, these early attempts to base the notion of emergence on a solid philosophical foundation raised more questions than they answered. In their effort to make sense of the relationship between the physical sciences and the special sciences, these philosophers mixed issues of predictability and incompatibility, novelty and discontinuity. Like Mill, they worked more toward justifying apparent descriptive discontinuities observed in nature than toward deriving the emergence concept from first principles. This left both future emergentists and their critics to argue over the candidate principles. The result has been a wide divergence of emergentists committed either to a mostly epistemological definition (i.e., based on predictability and representational issues) or to a mostly ontological definition (i.e., based on assuming a fundamental discontinuity of physical laws).
Many other definitions of emergence have grown up around the different ways that levels, discontinuity, novelty, and predictability are invoked. So, for example, theorists are often distinguished as either "weak" or "strong" emergentists, referring to their stance on the question of causal discontinuity and whether emergence is compatible or incompatible with reductionism. Strong emergentism argues that emergent transitions involve a fundamental discontinuity of physical laws; weak emergentism argues that although there may be a superficially radical reorganization, the properties of the higher and lower levels form a continuum, with no new laws of causality emerging. However, this distinction does not capture many more subtle differences, and the perspective developed in this book is not easily categorized in these terms.13

The conceptions of emergence offered by the British philosophers were subsequently treated to harsh criticism in the early part of the twentieth century, though the concept of emergence was also often invoked by prominent scientists to explain otherwise surprising natural phenomena. For example, the developmental biologist Paul Weiss, in his explorations of spontaneous pattern formation in cellular, molecular, and embryological contexts, regularly pointed to the spontaneous emergence of higher-order organization as a critical contributor to biological form. And Ludwig von Bertalanffy's effort to establish a general systems theory repeatedly stressed that complex systems almost always exhibited properties that could not be understood solely in terms of their components and their interactive properties, but also were characterized by novel system-level properties.
The Nobel Prize-winning neuroscientist Roger Sperry also updated classic emergentist arguments about the nature of consciousness. In an article published in 1980, Sperry argued that although there is no change in basic physics with the evolution of consciousness, the property of the whole system of interacting molecules constituting brains that we call "consciousness" is fundamentally different from any collective property they would exhibit outside of brains. In this way, he offers a configurational view of emergence. He illustrates this with the example of a wheel. Although the component particles, atoms, and molecules forming the substance of the wheel are not changed individually or interactively by being in a wheel, because of the constraints on their relative mobility with respect to one another, they collectively have the property of being able to move across the ground in a very different pattern and subject to very different conditions than would be exhibited in any other configuration. The capacity to roll is only exhibited as a macroscopic collective property. It nevertheless has consequences for the component parts. It provides a means of displacement in space that would be unavailable otherwise. In this sense, Sperry argues that being part of this whole indirectly changes some of the properties of the parts. Specifically, it creates some new possibilities by restricting others. This trade-off between restriction and constraint on the one hand and unprecedented collective properties on the other will be explored more fully in the next few chapters.
More generally, Sperry uses this example to exemplify another common theme that recurs in discussions of emergence, and increasingly so in more recent discussions. This is the concept of downward causation. For Sperry, the wheel example provides a case of downward causation, because being a part in a particular kind of whole alters the movement options available to the parts. In Sperry's example, we would not describe this as exactly changing any specific properties intrinsic to the parts, but rather as altering the probability that certain of these properties (e.g., the possibility of translation in space) will be realized. Outside of their inclusion in a wheel, individual atoms are exceedingly unlikely to move by spiraling in the plane of forward movement; but inclusion in a wheel makes this highly likely. As we will discuss below, this configurational effect that the whole has on its parts might more accurately be described in terms of constraints.
Molecules forming the solid structure of a wheel are constrained in their ability to move in space by their tight attachment to neighbors. The key factor is that this constraint on movement is additionally influenced by the geometric structure of the whole. Although still constrained by being within a solid aggregate, if the aggregate structure is round, this constraint is potentially overcome at the macroscopic level. Thus, if one wants to move a large mass of some substance, say a ton of hay, it is easier if the whole aggregate is shaped into a large cylinder and rolled. Sperry uses this analogy to argue that the configuration of brain processes similarly changes what can be done with the parts-the brain's neurons, molecules, and ionic potentials. But it is a bit of a misnomer to call this a form of causation, at least in modern parlance. The downward (in levels) causation (from whole to part) is in this sense not causation in the sense of being induced to change (e.g., due to colliding or chemically interacting with neighboring molecules), but is rather an alteration in causal probabilities.
This "downward" sort of causality might better be framed in Aristotelean terms as a species of formal cause (see chapter 1), whereas the notion of being induced to change (e.g., in position or configuration) might be a.n.a.logized to Aristotle's efficient cause. The constraint of being incorporated into a wheel makes an atom's mobility more subject to formal (geometric) features of the whole, and mostly irrespective of its individual properties. It's not that a given atom of the wheel could not be moved in some other way, if, say, the wheel was broken and its parts scattered; it's just that rotational movement has become far more probable.
The notion of downward or top-down causation has, however, been subject to considerable debate. There are almost as many interpretations of this concept as there are emergence theories. It is considered by many to be the most important determinant of emergence, and it is also one of the most criticized concepts. To some degree, these confusions reflect an unfortunately simplified conception of causality inherited from our Enlightenment forebears, but the notion also implicitly incorporates assumptions of smallism, in the sense that it is treated as the reciprocal of a bottom-up notion of causal influence. These assumptions about the nature of causality will be extensively reexamined in the following chapters, but for now it is sufficient just to recognize that emergence theories are generally defined in terms that counter bottom-upism. They differ in how they do this.
It is not essential, however, to argue for a form of top-down causality in order to define emergence. For example, an alternative clarification of the part/whole issue has been provided by the American philosopher Paul Humphreys. Rather than arguing that the interactions of parts of an emergent whole produce new properties, inherit new properties by virtue of their involvement in the whole, or exhibit new properties imposed by the whole configuration, he argues that in many cases parts are significantly transformed as a result of being merged with one another in some larger configuration. Humphreys maintains that in some cases the very constitution of parts is changed by inclusion in some larger unity. He calls this modification fusion. By virtue of their systemic involvement with each other, they are no longer entirely distinguishable. As a result, reductionist decomposition cannot be completed, because what were once independently identifiable parts no longer exist.14 Humphreys' argument has its roots in quantum physics, where the individuation of events and objects is ambiguous (as in quantum entanglement, etc.). It is intended to circumvent the problem of double-counting causal influences at two levels. It solves this problem by arguing that fusion results in a loss of some constituent properties. They literally cease to exist as parts are incorporated into a larger configuration. To slightly oversimplify an example from quantum physics, consider the way that quantum properties such as quantum uncertainty (not being able to simultaneously specify the position and momentum of an elementary particle) effectively disappear in interaction with macroscopic instruments, resulting in retrospective certainty. This is the so-called quantum-classical transition, often described as the "collapse" of the Schrödinger wave function, which defines a "probability wave." These strange quantum properties are effectively swallowed up in the transition to an event at a higher scale.
An argument that is loosely related to both Sperry's and Humphreys' accounts of higher-order properties can be applied to the presumed part/whole analysis of organisms. Because organism components (e.g., macromolecules) are reciprocally produced across time, the very nature of what constitutes a part can only be determined with respect to its involvement in the whole organism. Indeed, the vast majority of molecules constituting an organism are enmeshed in a continual process of reciprocal synthesis, in which each is the product of the interactions among many others in the system. They exist and their specific properties are created by one another as a result of this higher-order systemic synergy. With cessation of the life of the organism-i.e., catastrophic dissolution of critical reciprocal interrelationships-the components rapidly degrade as well. Thus their structures and resultant properties were in large part derived from this systemically organized dynamic. Even though these macromolecular properties were also constrained by possibilities offered by the atomic properties of the constituent atoms, these combinatorial possibilities are effectively infinite for molecules composed of hundreds or thousands of atoms. As part of a functioning organism, however, the range of their possible interactions and combinatorial configurations is vastly constrained, and the molecules that they form are themselves highly restricted in their interactions and distributions.
Notice that this is related to Sperry's notion of altering certain constraints affecting the parts by virtue of being incorporated into the emergent configuration. But while Sperry's analogy of being affected by the geometric configuration of the whole does not fundamentally affect any local properties of the parts, Humphreys' notion of fusion (interpreted more broadly) does. In both the wheel and the organism, there is a change in the constraints affecting lower-level interactions; but in the wheel this affects relational properties, and in the organism it additionally affects intrinsic properties. In the wheel, independent mobility of the contained atoms is lost, but intrinsic properties, like mass and charge, are unaffected. In the organism, the properties of the molecules are a consequence of their incorporation into this system. This apparent difference may in part reflect a complexity difference. The closer analogy is between atoms in a wheel and atoms in a molecule.
As we will see in the following two chapters, this distinction exemplifies two hierarchically distinct orders of emergent transition. The organism example assumes the emergence of ententional organization; the wheel shape does not. There is also a clear sense in which the wheel provides us with an emergence notion that is straightforwardly compatible with a reductionistic account of the higher-order property. The wheel can be dissected into its parts and reconstructed without loss, but a living organism taken apart suffers the Humpty-Dumpty problem. Most of its parts are themselves unstable entities outside of the context of a living organism, so it becomes a problem to decide what exactly are the "proper parts" of an organism that in interaction determine its emergent character. One of the crucial differences is that the emergent relationship in the wheel example is synchronic; whereas, although a synchronic analysis is possible for the organism, the property that is crucial to the constitution of its proper parts is the dynamics of reciprocal synthetic processes, which is intrinsically diachronic, as is the larger diachrony of the evolutionary process that resulted in this metabolic system. Thus, at least for higher-order forms of emergence, the part/whole distinction and the synchrony/diachrony distinction are intertwined.15
A HOUSE OF CARDS?
The most influential critiques of ontological emergence theories target these notions of downward causality and the role that the emergent whole plays with respect to its parts. To the extent that the emergence of a supposedly novel higher-level phenomenon is thought to exert causal influence on the component processes that gave rise to it, we might worry that we risk double-counting the same causal influence, or even falling into a vicious regress error-with properties of parts explaining properties of wholes explaining properties of parts. Probably the most devastating critique of the emergentist enterprise explores these logical problems. This critique was provided by the contemporary American philosopher Jaegwon Kim in a series of articles and monographs in the 1980s and 1990s, and is often considered to be a refutation of ontological (or strong) emergence theories in general, that is, theories that argue that the causal properties of higher-order phenomena cannot be attributed to lower-level components and their interactions. However, as Kim himself points out, it is rather only a challenge to emergence theories that are based on the particular metaphysical assumptions of substance metaphysics (roughly, that the properties of things inhere in their material constitution), and as such it forces us to find another footing for a coherent conception of emergence.
The critique is subtle and complicated, and I would agree that it is devastating for the conception of emergence that it targets. It can be simplified and boiled down to something like this: assuming that we live in a world without magic (i.e., the causal closure principle, discussed in chapter 1), that all composite entities like organisms are made of simpler components without residue, down to some ultimate elementary particles, and that physical interactions ultimately require that these constituents and their causal powers (i.e., physical properties) are the necessary substrate for any physical interaction, then whatever causal powers we ascribe to higher-order composite entities must ultimately be realized by these most basic physical interactions. If this is true, then to claim that the cause of some state or event arises at an emergent higher-order level is redundant. If all higher-order causal interactions are between objects constituted by relationships among these ultimate building blocks of matter, then assigning causal power to various higher-order relations is to do redundant bookkeeping. It's all just quarks and gluons-or pick your favorite ultimate smallest unit-and everything else is a gloss or descriptive simplification of what goes on at that level. As Jerry Fodor describes it, Kim's challenge to emergentists is: "why is there anything except physics?"16

The concept at the center of this critique has been a core issue for emergentism since the British emergentists' first efforts to precisely articulate it. This is the concept of supervenience. Supervenience is in many respects the defining property of emergence, but also the source of many of its conceptual problems. The term was first used philosophically by Lloyd Morgan to describe the relationship that emergent properties have to the base properties that give rise to them.17 A more precise technical definition was provided by the contemporary philosopher Donald Davidson, who defines it in the context of the mind/body problem as follows: "there cannot be two events exactly alike in all physical respects but differing in some mental respects, or that an object cannot alter in some mental respects without altering in some physical respects."18 This defines an asymmetric dependency in this hierarchic relationship, which is sometimes stated in aphoristic form as: there cannot be changes in mental (aka emergent) properties without a change in neurophysiological (aka physical substrate) properties. It is an emergentist no-free-lunch restriction. The fundamental challenge of classical emergentism is to make good on the claim that higher-order (supervenient) properties can in some critical sense not be reduced to the properties of their component lower-level (subvenient) base, while at the same time being entirely dependent on them. So, if one agrees that there can be no difference in the whole without a difference in the parts, how can it be possible that there is something about the whole that is not reducible to combinations of properties of the parts?
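Davidson's clause compresses into a standard indiscernibility schema (a common textbook rendering; the symbols are my gloss, not Davidson's notation): physical twins must be mental twins, though nothing in the conditional licenses deducing the mental from the physical.

```latex
\[
\forall x \,\forall y \;
\Bigl[\, \bigl(\forall P \in \mathrm{Phys} :\; P(x) \leftrightarrow P(y)\bigr)
\;\rightarrow\;
\bigl(\forall M \in \mathrm{Ment} :\; M(x) \leftrightarrow M(y)\bigr) \,\Bigr]
\]
```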
Looking at the world in part/whole terms seems like an unimpeachable methodology. It is as old as the pre-Socratic Greek thinkers, figures prominently in the thinking of Plato and Aristotle, and remains almost an axiom of modern science. Philosophically, the study of compositionality relationships and their related hierarchic properties is called mereology. The term quite literally means "the study of partness." The concept of emergence, which has its roots in efforts to challenge the notion that the whole is just the sum of its parts, is thereby also predicated on mereological assumptions. Kim's critique, if coherent, suggests that this foundational assumption of emergentism renders it internally inconsistent. But mereological assumptions may prejudice the case.
Effectively, Kim's critique utilizes one of the principal guidelines for mereological analysis: defining parts and wholes in such a way as to exclude the possibility of double-counting. Carefully mapping all causal powers to distinct, non-overlapping parts of things leaves no room to find them uniquely emergent in aggregates of these parts, no matter how those parts are organized. Humphreys' concept of fusion appears on the surface to undermine this stricture by challenging the basis for whole/part decomposition, and the example of macromolecules in an organism likewise suggests that at least simple decomposition is problematic. But this does not entirely escape the critique, if we recognize that these system-modified parts can still be analyzed into smaller ultimate parts. Shuffling the part/whole relationship at one level does not alter it all the way down.
Kim's critique is most troublesome for emergence theories that have been developed to support functionalism (see chapter 1). Functionalism claims that the same form of operation can be physically embodied in different media (e.g., the same program run on two different computers) and that the different physical implementations will nonetheless have the same causal powers, despite having entirely different ultimate components embodying this operation. As we saw in chapter 1, this claim for multiple realizability is also a feature of natural-kind concepts like solid, liquid, and gas. Thus water and alcohol at room temperature both exhibit surface tension, viscosity, the capacity to propagate transverse waves, and so on. It is only when we ask for an accounting of all the causal powers that different liquids can be discriminated. Indeed, there is in each case a subset of behaviors that are expressed despite the difference in physical substrates (sometimes grouped into so-called universality classes); but because there is some level of detail at which differences can be discriminated, we must conclude that these differences, as well as the similarities, are attributable to the different physical constituents of each.
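The multiple-realizability claim can be made concrete with a deliberately toy illustration (my example, not the author's): two implementations of one and the same operation that share no internal structure yet are indistinguishable at the functional level of input/output behavior. As with the liquids above, differences reappear only when we probe below that level, at the grain of the individual operations performed. A minimal sketch in Python:

    # Two "realizations" of the same function: sorting a list.
    # At the functional level (input/output behavior) they are identical;
    # at a finer grain (the sequence of internal operations) they differ
    # completely.

    def insertion_sort(xs):
        """Builds the result by repeated ordered insertion."""
        out = []
        for x in xs:
            i = 0
            while i < len(out) and out[i] < x:
                i += 1
            out.insert(i, x)
        return out

    def merge_sort(xs):
        """Builds the result by recursive splitting and merging."""
        if len(xs) <= 1:
            return list(xs)
        mid = len(xs) // 2
        left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
        merged = []
        while left and right:
            merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
        return merged + left + right

    # Functionally indistinguishable on any input:
    assert insertion_sort([5, 3, 8, 1]) == merge_sort([5, 3, 8, 1]) == [1, 3, 5, 8]

The point, of course, is not about sorting but about where the "same causal powers" claim holds and where an exhaustive accounting would still tell the two realizations apart.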
There have been many challenges and responses to this line of criticism, which traces back to the time of the British emergentists,19 but here I will focus on one that will serve as the starting point for considering a very different sort of alternative. This is the fact that the substance metaphysics supporting this mereological analysis does not accurately reflect what we currently know of the physics of the very small: quantum physics.
The scientific problem is that there are no ultimate particles, no simple "atoms" devoid of lower-level compositional organization, on which to ground unambiguous higher-level distinctions of causal power. Quantum theory has dissolved this base at the bottom, and the dissolution of this foundation ramifies upward to undermine any simple bottom-up view of causal power. At the lowest level of scale there are only quantum fields, not indivisible point particles or distinguishable stable extended configurations. Quantum fields have ambiguous spatiotemporal origins, have extended properties that are only statistically and dynamically definable, and are characterized by a dynamical quality, a wave function, rather than by any discrete extensional boundary. At this level, the distinction between dynamics and the substrate of dynamics dissolves. Quantum interactions become "classical" in the Newtonian sense only in interactions involving macroscopic "measurement," and only then do they exhibit mereologically identifiable properties (recall Humphreys' notion of fusion). At this presumed lowest level, discrete parts simply do not exist. The particulate features of matter are statistical regularities of this dynamical instability, owing to quasi-stable, resonantlike properties of quantum field processes. This is why there can be such strange, unparticlelike properties at the quantum level. Only with the smoothing of statistical scale effects do we get well-behaved mereological features.
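To make the "only statistical" character of this base concrete, recall a standard textbook fact of ordinary quantum mechanics (nothing specific to this argument): a system is described by a wave function $\psi$, and what $\psi$ fixes is not a boundaried object but a probability density over possible measurement outcomes, e.g., for position,

    P(x \in [a, b]) = \int_a^b |\psi(x)|^2 \, dx

Nothing in $\psi$ itself picks out a discrete, spatially bounded part; only the statistics of potential interactions are determined.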
This is not meant to suggest that we should appeal to quantum strangeness in order to explain emergent properties, nor would I suggest that we draw quantum implications for processes at human scales. It does, however, reflect a problem with simple mereological accounts of matter and causality that is relevant to the problem of emergence. A straightforward framing of this challenge to a mereological conception of emergence is provided by the cognitive scientist and philosopher Mark Bickhard. His response to this critique of emergence is that the substance metaphysics assumption requires that, at base, "particles participate in organization, but do not themselves have organization." But, he argues, point particles without organization do not exist (and in any case would lead to other absurd consequences), because real particles are the somewhat indeterminate loci of inherently oscillatory quantum fields. These are irreducibly processlike, and thus are by definition organized. But if process organization is the irreducible source of the causal properties at this level, then it "cannot be delegitimated as a potential locus of causal power without eliminating causality from the world."20 It follows that if the organization of a process is the fundamental source of its causal power, then fundamental reorganizations of process, at whatever level they occur, should be associated with a reorganization of causal power as well.
This shift in emphasis away from mereological interpretations of emergence evades Kim's critique, but it requires significant rethinking of the concept of emergence as well. In many respects, supervenience was one of the defining features of classic emergence theories. But it should not surprise us that a synchronic understanding of this relationship is an insufficient basis for the concept of emergence, for the simple reason that emergence itself is a temporal conception. Most if not all of the higher-order properties considered to be emergent were not always present in the universe, and were not always present in the local contexts in which they currently exist. Certainly in the case of life and mind, their emergent characteristics are relatively recent phenomena in the history of the universe and of this tiny sector of our galaxy and solar system. These emergent phenomena and their ententional properties emerged as new forms of physical-chemical process organization developed among existing atoms and their energetic interactions, and as they did so, new kinds of components also came into existence. Static notions of part and whole are for this reason suspect, since the wholes we are interested in are dynamical and the parts are constantly in flux, being continually synthesized, damaged, and replaced, while the whole persists.