Why assume that an organism’s variation is random? Genetic variation is random with respect to the selective conditions in the environment, and variation that contributes to an animal’s fitness can be propagated. But variation in the organism necessarily changes only what already exists, and what already exists is not random.
The Theory of Facilitated Variation shows, in specific biological terms, the role of cellular life in life’s evolution. Kirschner and Gerhart argue that evolution is not the totally random process we have assumed. Though a perturbation (mutation) may be random, what is perturbed is not random, nor is the response to the perturbation. Kirschner and Gerhart show how random mutation is used by organisms in a nonrandom, biased way. On their view, one key role of mutations is to change how, where, and when genes are expressed during the development of an embryo into an adult.
Kirschner and Gerhart’s grand synthesis identifies clues at the developmental level of the cell that bear on the macro-level questions of evolutionary biology. Their Theory of Facilitated Variation offers a process-focused view of evolution that could not be achieved solely by analyzing evolution’s products as objects competing for the survival of their fittest genes.
Their Theory of Facilitated Variation complements traditional random mutation, describing non-random mechanisms that evolution uses to generate novelty through adaptation. Evolution accelerated with the invention of eukaryotic cells and the possibility of multicellularity.
Behavioral modification as a survival strategy
The traditional story of evolution focused on evolution’s objects, which would survive and replicate (or not) based on their fitness as objects. An object fixation might lead us to assume that evolution progresses as environmental selection operates on evolution’s objects. But if it’s not evolution’s objects, but their capacities to evolve effectively that evolution selects for, how does evolvability evolve? Who selects for evolvability? Would an environment be able to “see” this useful trait?
This question led Marc Kirschner and John Gerhart to focus on evolution as a set of interrelated processes. They describe how evolution relies on adaptive cell behaviors, later conserved, and how evolutionary innovation is accomplished by adapting core processes for metabolism, information retrieval, signaling, and development.
Compartmentation
The role context plays is illustrated via the mechanism of compartmentation — the responsiveness of cells to “where they are” in the big picture. Compartmentation is a control mechanism, restricting where particular genes are activated. Compartmentation, separating different regions of the embryo, allows genes to have different functions in different parts of the animal, affording the system great creative freedom. A gene is turned on differently in the brain than in the kidney. Initial work on Drosophila compartments and selector genes in the 1980s expanded to explore how this principle is widely deployed to give genetic mechanisms variability in different contexts.
Cells that make bones or ribs or vertebrae look alike, but they know how to respond to their context. They reside in different compartments and so, although an observer sees no obvious differences in these cells, the cells respond differently to signals in different environments. The same cellular mechanisms generate ribs in one part of the body, vertebrae in another.
Compartmentation is one of life’s strategies for design development. Once the compartment was understood as the embryo’s way of regulating cell differentiation, the earlier focus on the evolution of anatomical traits shifted to how evolution evolved the processes capable of generating those traits.
Compartmentation operates in four distinct ways:
1) Spatial compartmentation uses core processes differently in different spatial compartments, generating very different forelimbs and hindlimbs in the kangaroo, for example.
2) Temporal compartmentation uses core processes differently at different stages of development, as in the larva and adult.
3) Sexual compartmentation uses core processes differently in males and females.
4) Cell type compartmentation uses core processes differently in different subsets of the gene expression space that map stably to cell types.
Compartmentation thus organizes the embryo into domains to control cell type, making it possible for organism anatomy and physiology to evolve toward increasing complexity, as the sketch below illustrates.
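To make the idea concrete, here is a minimal sketch (mine, not Kirschner and Gerhart’s) of compartmentation as context-dependent reuse: a single shared core process is run in several compartments, and the compartment alone determines how its output is interpreted. The compartment names, selector genes, and outputs are illustrative placeholders.

```python
# Toy model of compartmentation: the same core process ("condense, then
# ossify") is reused everywhere, but the compartment a cell sits in
# determines how the output is interpreted. All names are illustrative.

CORE_PROCESS = "condense mesenchyme, then ossify"  # one conserved mechanism

# Each compartment supplies its own context: which selector gene is active
# and how the shared core process should be read there.
COMPARTMENTS = {
    "thoracic": {"selector": "HoxA", "interpretation": "rib"},
    "lumbar":   {"selector": "HoxB", "interpretation": "vertebra"},
    "forelimb": {"selector": "Tbx5", "interpretation": "forelimb bone"},
    "hindlimb": {"selector": "Pitx1", "interpretation": "hindlimb bone"},
}

def run_core_process(compartment: str) -> str:
    """Run the single shared core process; the outcome depends only on context."""
    ctx = COMPARTMENTS[compartment]
    return (f"[{compartment}] selector {ctx['selector']} active: "
            f"{CORE_PROCESS} -> {ctx['interpretation']}")

for name in COMPARTMENTS:
    print(run_core_process(name))
```

The point of the sketch is that nothing about the core process changes from compartment to compartment; only the context of interpretation does.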
Complexity, emergence and evolvability
If evolvability is the capacity to evolve, this circular definition gains meaning by being decomposed into two components.
First, the selection component of evolvability lies in the fortuitous relationship of an organism to its selective conditions. An organism with tolerance to temperature increase or acidity, if it happens to live in a lake that becomes hotter or more acidic, may survive, though these conditions were not predicted. In contrast to the conventional view that selection is non-random, this aspect of environmental selection has a random component, since we cannot precisely predict how the organism will relate to its environment.
Second, and most important, the variation component of evolvability measures the intrinsic capacity of an organism to vary its phenotype, its physical self, based on its genetic makeup and other properties. This capacity to vary contributes enormously to the organism’s ability to respond to its environment. In contrast to the conventional view that variation is random, this aspect of variation has a non-random component, explained by the Theory of Facilitated Variation.
In decomposing that circular definition, Kirschner topples two traditional assumptions:
The first misconception: Many take for granted that survival of the fittest through natural selection is non-random. Instead, Kirschner emphasizes that the relationship of the organism to selective conditions is fortuitous, i.e. random.
The second misconception: We assumed that the chance element in evolution was random variation. Instead, Kirschner questions what we took for granted, arguing that Facilitated Variation is non-random.
Exploratory behavior & evolutionary dynamics
Exploratory behavior on the micro scale and evolutionary dynamics on the macro scale may be parallel processes characterized by similar attributes.
Exploratory behavior allows novelty to arise in evolution as organisms adapt to their life circumstances and environmental conditions, the non-random aspect of evolution. Scientists are fascinated by the many exploratory processes in living systems because exploratory behavior can generate a large number of outcome states from which the organism selects, retains, and stabilizes the most useful state. This is a clear instance where the organism itself is the selective agent, acting as its own regulator and stabilizer, exhibiting autonomy. This role of the organism as selective agent is non-random.
Darwin wondered how new structures could be made when so many components need to be coordinated to create a new structure. So, for example, a new skull shape entails creating not just the bones, but also the nerves, vascular system, and muscles. Harvard geneticist Clifford J. Tabin studied how finch beaks vary by examining cellular change. He conducted genetic experiments with chicken embryos to generate larger beaks. Genetically altering a beak to make it wider did not require genetically altering the head size or other related components. Instead, the head self-adjusted so that the beak was attached to an appropriate head. Broadening the beak automatically broadened the head, changed the skeletal structure, altered eye placement, and so on.
If evolution were purely random, the chance of coordinating all these systems that need to work together would be very remote. Since these systems are interrelated, they cannot be specified in isolation. The muscles must be connected to the bones, the nervous system and vascular system so that all can function together. But once the bones are in place, the muscles must take their place relative to the bones.
Exploratory processes allow muscles, vasculature, and nerves to adapt without requiring simultaneous genetic change; they are highly adaptive to whatever change happens to occur. Because these processes are coordinated, they do not develop randomly.
How does our nervous system, with its trillions of cells and connections between cells, define those connections? Since there aren’t enough genes to specify every nerve connection (if there were, we’d have non-adaptive nervous systems), there must be an effective developmental process. In brain development, nerve axons grow in many random directions in superfluous numbers. When they meet a target, they are stabilized; only those that contact targets are retained. When a target is moved, the system still works. When axons grow in a direction where there are no targets, this superfluous, non-useful production is followed by pruning, a common creative strategy in both evolution and development. Apoptosis (programmed cell death) and synapse elimination retain only those nerve cells and connections that happen to reach a useful target.
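The overproduce-and-prune strategy in that paragraph can be sketched as a toy simulation, assuming made-up target positions, axon counts, and a contact radius: many axons grow in random directions, and only those that happen to land near a target are stabilized.

```python
import random

# Overproduce-then-prune, as in axon growth: grow many axons in random
# directions, keep only the ones that happen to reach a target.
# Coordinates, counts, and the "reach" radius are illustrative.

random.seed(0)
targets = [(2.0, 3.0), (-4.0, 1.0), (0.0, -5.0)]   # hypothetical target cells
REACH = 1.5                                        # how close counts as contact

def grow_axon():
    """An axon tip ends up somewhere random in a 2-D patch of tissue."""
    return (random.uniform(-6, 6), random.uniform(-6, 6))

def contacts_target(tip):
    return any((tip[0] - tx) ** 2 + (tip[1] - ty) ** 2 <= REACH ** 2
               for tx, ty in targets)

axons = [grow_axon() for _ in range(200)]              # superfluous production
stabilized = [a for a in axons if contacts_target(a)]  # pruning removes the rest

print(f"grew {len(axons)} axons, stabilized {len(stabilized)}")
# Move a target and rerun: connections still form. No genetic "wiring
# diagram" is needed, only overproduction plus pruning.
```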
The vascular system (blood vessels) and the nervous system are prime examples of exploratory processes. There is no predetermined genetic map for the distribution of blood vessels in the body, and “useful” can be specified differently as needs change. Because the system isn’t hardwired, it’s highly adaptive. The vascular system looks for low oxygen levels, which explains how blood supply to a foetus or a tumor increases without any planning: tissues send out signals when they need oxygen, and the vascular system responds to those signals, continually expanding into regions with insufficient oxygen supply. Tumors are invaded by blood vessels and fully vascularized. Because all these processes are coordinated, they aren’t developed randomly.
Ant foraging also illustrates exploratory behavior, which starts out random and gradually becomes constrained toward useful paths. At the beginning, ants do not know where food is; they migrate out from the anthill in random directions. When ants find food, they secrete more pheromone on the trail, signaling other ants to prefer that trail over others. The exploratory process starts random and becomes progressively less random.
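A hedged toy simulation of that dynamic, with invented trail names, deposit and evaporation rates: ants choose trails in proportion to pheromone, only the trail that reaches food is reinforced, and the initially uniform choice becomes progressively biased toward the useful path.

```python
import random

# Ant foraging as exploration that becomes progressively less random:
# ants pick trails in proportion to pheromone; only the trail that reaches
# food gets reinforced. Trail names, rates, and counts are illustrative.

random.seed(1)
pheromone = {"north": 1.0, "east": 1.0, "south": 1.0}  # start: no preference
FOOD_TRAIL = "east"
DEPOSIT, EVAPORATION = 0.5, 0.02

def choose_trail():
    """Pick a trail with probability proportional to its pheromone level."""
    total = sum(pheromone.values())
    r = random.uniform(0, total)
    for trail, level in pheromone.items():
        r -= level
        if r <= 0:
            return trail
    return trail

for step in range(500):
    trail = choose_trail()
    if trail == FOOD_TRAIL:
        pheromone[trail] += DEPOSIT                  # successful ants reinforce
    for t in pheromone:
        pheromone[t] = max(0.1, pheromone[t] * (1 - EVAPORATION))  # evaporation

print({t: round(v, 1) for t, v in pheromone.items()})
# The food-bearing trail ends up with far more pheromone: exploration has
# been constrained toward the useful path.
```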
The growth of microtubules (components of the cytoskeleton of the cell) is an exploratory process investigated by Kirschner and Gerhart. Kirschner said, “My interest in exploratory behavior has its roots in our discovery of the dynamic instability of microtubules. That discovery had a large impact on cell biology in general. The fact that microtubules consumed energy to explore space randomly illustrated behavior that generated useful patterns through variation and selection (or pruning). The actual mechanism was unusual and interesting from a biophysical perspective. But it also struck me that this mechanism could be adapted for use in many different cell types in many different circumstances.”
“The general usefulness of this exploratory behavior may explain why it was conserved in all eukaryotic organisms. Even in prokaryotes it occurs, not with tubulin, but with another protein.
“Deconstrained exploratory processes can change the amount, time, kind, and place of gene expression, generating many different phenotypical outcomes, or states, from a limited number of genes. Some of these states are selected for physiological adaptability to their environmental conditions. Exploratory systems don’t need to change or adapt to make irreducible complexity more reducible. Because exploratory processes can adapt to unpredicted changes, the Theory of Facilitated Variation refutes the argument that irreducible complexity would make stepwise mutation incapable of inventing certain complex mechanisms.”
Evolutionary dynamics manifest similar attributes to exploratory processes on a macro scale. If evolution were merely random, one might expect several consequences. First, random mutation would probably keep reinventing the wheel. Although some good inventions were generated multiple times, in general, once evolution had a good invention to pass on, that good invention was retained and reused. Evolution then moved on to invent other things.
Second, evolution based solely on random mutation would logically generate novelty at a uniform rate. This did not occur, suggesting that evolution may not be as random as we once assumed. As Eldredge and Gould noted in their Theory of Punctuated Equilibrium, evolution came in waves, followed by periods of relative stasis.
The first wave of evolutionary invention came with the origin of life, which is estimated to have occurred more than 3.8 billion years ago, producing cell membranes, metabolism, DNA, RNA, proteins, and ribosomes. These great inventions persisted, and remain with us today. Simple, single-celled prokaryotes were the only life forms on the planet for nearly two billion years. If evolution were merely random, why didn’t it produce some other genetic codes over the next several billion years? But, with the universal genetic code problem solved, evolution moved on.
The second wave of evolutionary invention on the path toward the human species came more than two billion years ago, when more complex eukaryotic cells were invented, with a nucleus, organelles, a cytoskeleton, chromosomes, sexual reproduction, and signaling molecules. Again these inventions persisted and remain with us today. Once these more advanced eukaryotic cells had invented new cellular processes for metabolism, information retrieval, and development, these conserved core processes became part of evolution’s toolkit, enabling evolution to accelerate.
The third wave of evolutionary invention came roughly a billion years ago with the invention of multi-celled organisms, bringing new core processes of cell signaling and cell differentiation that were likewise conserved to the present.
The fourth wave came about 600 million years ago, when evolution invented a range of body plans for worms, insects, and vertebrates: bilateral body plans with head and tail ends. Ongoing discoveries are revising our view of evolution’s fourth wave, the so-called Cambrian explosion. Although whether it was really an “explosion” is now debated, the point Kirschner and Gerhart make remains valid: there hasn’t been a proliferation of new body plans in the last 600 million years.
Evolution proceeded to repeat this clever structural strategy throughout the living world. Having invented a range of body plans around 600 million years ago, evolution stopped inventing them. Kirschner muses, “If evolution could make all these new body plans 600 million years ago, why didn’t evolution make some new body plans 200 million years ago?” But evolution didn’t, again suggesting that evolution was not merely random. Body plan problem solved. Evolution moved on.
Facilitated Variation
Facilitated Variation serves as a bridge from the micro-level of cell biology to the macro-level of more than three billion years of evolution, translating implications of recent discoveries in cellular and developmental biology to evolutionary biology.
The Theory of Facilitated Variation builds a process bridge between genetic variation and organism-species variation, questioning the nature of variation itself. Facilitated Variation explains the variation component of evolvability by examining the underlying embryology, cell biology, and biochemistry of organisms. Three key properties of Facilitated Variation are: first, the capacity to maximize effective phenotypic variation for given genotypic variation, amplifying effective variations; second, the capacity to minimize lethality of variation; and finally, the capacity to adapt phenotypic variation to environmental conditions, even when those conditions have never been experienced before by the organism’s ancestors.
This theory proposes a series of mechanisms through which genotypic variation is interpreted and extended by phenotypic variation, as life designs itself in the context of its environment. They examine how seemingly complex biological systems can arise from limited numbers of genes and variation mechanisms — the relation between genes and the organisms they produce.
Biologists Marc Kirschner (chair of Systems Biology at Harvard) and John Gerhart (UC Berkeley) provide evidence that our traditional view of evolution is incomplete in having overlooked the creative role played by evolutionary mechanisms. They set out to show that phenotypic variation is facilitated by the phenotype itself, that life is its own intelligent designer, and that random changes in the genotype produce non-random changes in the phenotype.
Why would evolution harness Facilitated Variation? And why would an organism need to possess these mechanisms, both for its own survival and to contribute effectively toward evolution’s next steps?
The Theory of Facilitated Variation exposes our common view of evolution as far too simplistic. The traditional view of selection as non-random is turned on its head when Kirschner notes the fortuitous relationship of the phenotype to its selective conditions. And the traditional view of variation as random is tempered by recognizing that how an organism accepts and works with variation, and that organisms’ evolvability (capacity to evolve), are both non-random.
They asked a basic question: How does an organism interpret and express changes in its genes?
Or, in other words, how does an organism generate its phenotype from its genotype? The phenotype is what you see, an animal’s characteristics. It’s not easy to make a direct translation from genetic differences to physical differences. The organism we see, with its anatomy, physiology, and behavior, is what comes under selection. But how do genes determine the organism?
While all biologists admit that this translation is complex, few ask the obvious question. If it’s so complex, could this indicate that we’ve misunderstood something?
Facilitated Variation is life’s way of designing itself. Increasingly, scientists are questioning whether the Darwinian model is adequate to explain evolution. I draw on the work of biologists Marc Kirschner and John Gerhart, Stuart Kauffman, and many others to make this argument.
State selection
Physiological systems can be stabilized, both environmentally and genetically, in different states, and selecting among such states is called state selection. State selection is a key feature of Facilitated Variation, enabling the organism itself to select its state. States are well buffered by other physiological circuits, so evolutionary (genetic) state selection tends to be non-lethal and adaptive. State selection is directed not by the outside environment but by internal developmental processes, by life designing itself, self-organizing, driving its own evolutionary arrow toward complexity. Although generating the needed variation in states may be complicated, it makes selection comparatively simple.
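One way to picture state selection is as a multistable circuit. The toy model below (my illustration, not Kirschner and Gerhart’s) treats a physiological variable as bistable: small perturbations are buffered back to the current state, while a sufficiently large push, from a developmental signal or a genetic change, selects the other stable state, which is then buffered in turn. All values and thresholds are invented.

```python
# Toy bistable "physiological circuit": two buffered states (low, high).
# Small perturbations relax back to the current state; a sufficiently large
# push selects the other state, which is then buffered in turn.
# States, thresholds, and step sizes are illustrative.

STATES = {"low": 0.2, "high": 0.8}
THRESHOLD = 0.5        # the unstable point separating the two basins

def relax(x: float) -> str:
    """Return the stable state whose basin the value x falls into."""
    return "low" if x < THRESHOLD else "high"

state = "low"
for label, push in [("noise", 0.1), ("noise", 0.2), ("developmental signal", 0.5)]:
    x = STATES[state] + push
    new_state = relax(x)
    print(f"{label:>20}: {state} + {push:.1f} -> settles in '{new_state}'")
    state = new_state
```

In this sketch the small pushes leave the system in its original state (buffering), while the large push flips it into the other stable, buffered state (selection).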
Tolerance of a system to adapt its inputs
Animal breeding inspired Darwin and continues to shed light on evolution. Mating two different breeds of dog generally produces a functional dog, showing how flexibly the developmental process assembles new combinations and makes them interoperable. This developmental capacity to crossbreed animals illustrates principles that are applied more broadly in evolution.
Computer scientists worry about interoperability when they build open-source systems whose components must plug and play well together. The term interoperability, common in computer science but neglected in biology, is captured by Kirschner and Gerhart’s concept of weak linkage, which underpins their theory.
Weak Linkage
“Weak” is a misleading term because, from a design perspective, the great strength of weak linkages arises from the design principle described in the chapter, “Tolerance: precisely using imprecision.” Weak specification implies a high level of tolerance, offering greater potential to innovate.
Weak linkage, the foundation for all aspects of Facilitated Variation, is not instructional. It simply flips a switch to trigger a response, allowing the receiver to determine, in context, what the appropriate response should be. Each organism is a poised, rapid-responder system. The more viable the organism, the greater its tolerance range and its capacity to respond and to adapt its responses to changing environmental conditions.
Weak linkage facilitates evolutionary change that is responsive in context. Weak linkage allows the coupling of processes to each other, and to various inputs, such that a minimally informative signal can produce a response that is maximally adaptive from the perspective of the responder. The response can be shaped by conditions in the environmental context, which determine how the weak linkage is interpreted in context.
What Kirschner and Gerhart call weak linkage might have been called tolerant linkage. Weak linkage affords tolerance for varied interpretations of a given linkage, depending on the context in which a weak linkage is interpreted. A system with weak linkages can use flexible signals with low information content to control complex processes. Weak linkage is a mechanism through which evolution establishes “tolerance ranges within which interpretation can occur.” Larger tolerances enable evolution to advance more rapidly. Each message is adaptable, interpreted as appropriate in context.
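The switch-like character of weak linkage can be sketched in code, under the assumption (mine, for illustration) that the signal carries no instructions at all and each receiver already holds its own context-appropriate response.

```python
# Weak (tolerant) linkage: the sender's signal is minimally informative,
# just "switch on"; what happens next is decided entirely by the receiver,
# in its own context. Receiver types and responses are illustrative.

class Receiver:
    """A responder that already 'knows' what to do; the signal only triggers it."""
    def __init__(self, context: str, response: str):
        self.context = context
        self.response = response        # determined by the receiver, not the signal

    def on_signal(self) -> str:
        # The incoming signal carries no instructions, only "go".
        return f"{self.context} -> {self.response}"

receivers = [
    Receiver("muscle cell", "contract"),
    Receiver("gland cell", "secrete hormone"),
    Receiver("neuron", "fire"),
]

# One minimally informative signal; maximally different, context-appropriate responses.
for r in receivers:
    print(r.on_signal())
```

Because the signal is contentless, new receivers with new responses can be added, or existing responses changed, without touching the sender; the low information content of the linkage is what makes the system tolerant.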
"Some scientists think that how evolution proceeds is explained by the nature of the transcriptional apparatus and gene regulatory networks. True. Many changes occur in regulating gene expression. But these scientists neglect the fact that reordering, making different combinations of components, is not the really hard problem. The hard problem is to make components so that, when you recombine them, they’ll function together.”
Core processes with high adaptability have a high capacity for weak linkage. Weak regulatory linkages provide loose mechanisms for coupling signals to gene expression, enabling genes to be expressed in new ways and producing variations that can be selected in the phenotype. Changes in, and combinations of, these weak permissive signals give rise to novel functions at the structural and molecular level. Weak linkage occurs in signal transduction (when cells receive signals at their surface and relay them through the cytoplasm by means of controlled internal chemical changes) and in transcriptional regulation (when a signal changes which genes are expressed). Weak linkage can modify bone morphogenetic protein signaling at many levels (receptors, secreted inhibitors, ligands, and cellular regulation).
Weak linkages have several special properties, which support all three of the core processes already described above:
A first special property of weak linkages is that they enable biological systems to be constructed to allow for facile evolution of new input-output relations, adapting on the fly to environmental conditions.
A second special property of weak linkages is their capacity to support exploratory processes at differing levels of complexity, again as a way to adapt to current demands.
A third special property of weak linkages enables what Kirschner and Gerhart refer to as state selection in physiology — the capacity of living organisms to alter their own states (because of their autonomy).
A fourth special property of weak linkages supports compartmentation, through which genetic information is specified in context.