The second law of thermodynamics takes science into the realm of the nonlinear. The trajectory of a falling apple or the arc of a thrown ball is a constant, predictable, linear event. Such events are predictable precisely because there are a limited number of measurable inputs: the strength and direction of the throw and the force of gravity.
The chaotic, the complex, the catastrophic are all nonlinear. They include systems like the weather, business cycles, the heart, and the brain. These systems have vast sets of inputs, many of which cannot be accurately known. One can only model them using computers and a new kind of math, and they require a new set of expectations. The mathematics predicting how all these systems will behave produces not pinpoint answers but a range, something between zero predictability (complete chaos) and dynamic systems of probability. The dynamic systems, when mapped out by computers, look like three-dimensional blobs. But they have some predictive power—they tend to orient around a distinct region or point. Those regions or points are called attractors.
For example, a pendulum will swing in imperfect circles, but will tend to fall toward a point of rest, called a point attractor.* If you design a more complicated system with more inputs, the computer generates a three-dimensional picture that looks something like a kidney bean, with the system tending toward two specific regions. The graph will look like a tangle of strings, but the strings will circle or swirl around two more or less circular regions. Real life looks this way: business cycles look rough and jagged and unpredictable, but they tend to repeat themselves. The weather changes and we are caught without an umbrella, but less often in some seasons than others. We know that change happens, but in a vaguely regular fashion.
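The pull of a point attractor can be shown in a few lines of code. This is a minimal sketch, not anything from the text: a damped pendulum with made-up parameters, advanced by semi-implicit Euler steps, spirals in toward its resting state at angle zero, velocity zero.

```python
# A damped pendulum drawn toward its point attractor at (0, 0).
# All parameters (damping, g/l, step size) are illustrative choices.
import math

def simulate_pendulum(theta=1.0, omega=0.0, damping=0.5,
                      g_over_l=9.8, dt=0.01, steps=5000):
    """Return the final (angle, angular velocity) after `steps` updates."""
    for _ in range(steps):
        alpha = -damping * omega - g_over_l * math.sin(theta)
        omega += alpha * dt   # update velocity from acceleration
        theta += omega * dt   # then position from the new velocity
    return theta, omega

theta, omega = simulate_pendulum()
print(theta, omega)  # both end up vanishingly close to zero
```

Start the swing anywhere reasonable and the trajectory falls toward the same point of rest; that shared destination is what makes it an attractor.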
Back to thermodynamics. All this gets back to fields or gradients—one end of the system has a lot of something (heat, entropy) and the other end has only a little. The abundant, as the second law tells us, tends to diffuse toward the nonabundant; the system strains toward an equilibrium state of microscopic mixing and macroscopic uniformity.
That sorting out can be completely chaotic, but often it is not. Instead it is complex, vaguely predictable. For example, put a pot of water on a hot stove. As the heat enters the system, it creates a gradient, or field. Water molecules at the bottom, close to the heat, become active and rise, more or less willy-nilly, to the surface, where there is no heat. As the heat increases, the gradient becomes greater. But this is a complex system—water molecules have to deal with the heat and with forces of mutual attraction and viscosity. With added inputs come added constraints. With all of these influences, water molecules simply can't rise fast enough if they remain completely unorganized.

* Another example of a self-organized structure is the stormy red spot on Jupiter. It perfectly matches Earth-bound models for dynamic gaseous fields organized around a point attractor.
As a result, convection currents form. As the water begins to boil, the bubbles come up in a regular honeycomb pattern. The currents look like elongated donuts, the water molecules rising and falling back down following essentially circular paths. This semi-organized behavior allows for more efficient dissipation of heat energy.
These sorts of self-organizing, dissipative structures occur throughout nature: hurricanes, tornadoes, and the vortex of water running down your tub drain are all dissipative structures. Maybe living beings are just elaborate entropy-dissipating structures.
The elucidation of these structures led Ilya Prigogine, a Russian-born physical chemist, to assert that the second law of thermodynamics is not just consistent with evolution, it helps explain it.
In his second year of medical school at the University of California, San Francisco, Stuart Kauffman was bored. He was interested in embryology, but mainly because he wanted to work out how natural selection might act on developmental steps he was studying. He had read theories of how regulatory genes might function in parallel rather than sequentially, and he was intrigued with the idea.
Kauffman had a background in philosophy and understood systems of logic. He decided to try to fit gene function into a Boolean network. Boolean networks harken back to George Boole, an English mathematician of the nineteenth century. In a Boolean network, elements can be in one of two states: on or off. Which state they are in depends on the activities of various numbers of modifiers. For example, an element may be on if one modifier is on, or it may be on only if all modifiers are also on. Sets of rules determine these contingencies, and all of them can be programmed into simple computer systems. For Kauffman, the elements were genes, which were either expressed or not expressed. He set up his systems to have varying numbers of genes and of modifiers for the genes.
Once a network was programmed, its permutations could be tested and graphed. Depending on the number of modifiers, Kauffman's models yielded two distinct, if complex, sets of results.
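A toy version of such a network can be sketched in code. This is only in the spirit of Kauffman's models; the gene count, number of modifiers, and random seeds here are illustrative choices, not values from the text. Each "gene" is on or off, and its next state is a random Boolean function of a handful of modifier genes; because the state space is finite and the rules deterministic, every trajectory eventually falls into a repeating cycle—an attractor.

```python
# A toy random Boolean network: genes are on/off, each driven by a
# random rule over k modifier genes. Sizes and seeds are illustrative.
import random

def make_network(n_genes, k, seed=0):
    rng = random.Random(seed)
    # Each gene is wired to k modifier genes chosen at random.
    inputs = [rng.sample(range(n_genes), k) for _ in range(n_genes)]
    # Each gene gets a random rule table mapping the 2**k possible
    # modifier patterns to on (1) or off (0).
    rules = [[rng.randint(0, 1) for _ in range(2 ** k)]
             for _ in range(n_genes)]
    return inputs, rules

def step(state, inputs, rules):
    new = []
    for ins, rule in zip(inputs, rules):
        index = 0
        for i in ins:  # pack the modifiers' states into a rule-table index
            index = (index << 1) | state[i]
        new.append(rule[index])
    return tuple(new)

def attractor_length(state, inputs, rules):
    """Iterate until a state repeats; the repeating loop is the attractor."""
    seen = {}
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state, inputs, rules)
        t += 1
    return t - seen[state]  # number of states in the cycle

inputs, rules = make_network(n_genes=8, k=2, seed=1)
start = tuple(random.Random(2).randint(0, 1) for _ in range(8))
print(attractor_length(start, inputs, rules))
```

Running many starting states through the same network and recording which cycles they fall into gives exactly the kind of attractor map described above.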
First, with very few modifiers, the results form a three-dimensional graph with lines circling around a few regions. These regions are the attractors; the outcomes tend to stay close, or attracted, to the states those regions represent. If he disturbed the system, the graphs would change but then settle back into cycling around the same attractors. From this Kauffman could see that genes subject to few modifiers would tend to produce organisms with a predetermined set of characteristics. This model suggested that constraints on evolution were built into the system—that evolution was determined more by the original form of the organism than by any external forces. This looked bad for selection as an important force in evolution.
Kauffman took the model a step farther. He added more modifiers, just enough to bring the model to the edge of chaos. If he pushed it all the way to chaotic activity, there would be no predictability at all, only chaos. But Kauffman stopped at the edge, and there he found a very different picture.
There were attractors at the edge, but they were not consistent. That is, if he disturbed the more complex system in any way, it would settle down again, around the same sort of region as in the first model. But the attracting region would not be in the same place at all. A disturbance in the system could result in a new place of equilibrium—a new form—and it was quite a different place from the original.
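This kind of disturbance experiment can itself be sketched in code. The sketch below is illustrative (the network size, number of modifiers, and seed are invented): it builds a random Boolean network, lets a starting state settle into its cycle of states, then flips a single gene and checks whether the perturbed trajectory settles onto the same cycle or a different one.

```python
# Disturbing a random Boolean network: flip one gene and see whether
# the system settles back onto the same attractor cycle or a new one.
# All sizes and seeds here are illustrative choices.
import random

N, K = 10, 3
rng = random.Random(7)
inputs = [rng.sample(range(N), K) for _ in range(N)]   # random wiring
rules = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    out = []
    for ins, rule in zip(inputs, rules):
        index = 0
        for i in ins:
            index = (index << 1) | state[i]
        out.append(rule[index])
    return tuple(out)

def attractor(state):
    """Follow the trajectory until it loops; return the cycle's states."""
    seen = {}
    while state not in seen:
        seen[state] = len(seen)
        state = step(state)
    first = seen[state]
    return frozenset(s for s, t in seen.items() if t >= first)

base = tuple(rng.randint(0, 1) for _ in range(N))
flipped = (1 - base[0],) + base[1:]   # the disturbance: flip one gene
same = attractor(base) == attractor(flipped)
print("same cycle" if same else "different cycle")
```

With few modifiers the flip usually leads back to the same cycle; pushed toward the edge of chaos, a single flip can land the system on a different attractor entirely.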
This meant that external change could have a strong effect on the system. Or, in a model of genes and organisms, that the environment could drive significant change as long as the system was on the edge of chaos. As he manipulated the model further, he found that it liked being on the edge. Indeed, Kauffman concluded that selection itself, acting as an outside disturbance, tended to keep the system on that edge, so selection essentially selected for itself. In this way Kauffman, who at first gathered evidence against the Darwinian theory of evolution, in the end found himself with a system much more compatible with Darwin's ideas.
If all this seems too abstract and rather far from reality, you are not alone. Kauffman and others are squeezing themselves into a new niche, making a place for completely theoretical biology, analogous to cosmology's place in physics. John Maynard Smith even goes so far as to disparage it as a "fact-free science," since it exists only within a virtual realm.
Still, it offers a provocative new paradigm that seems smoothly continuous with the past: Darwin started biology on its path away from the static mechanics and predictability of Newton; early population geneticists carried it into the realm of probability, as quantum mechanics carried physics into a world of uncertainty; thermodynamics moved it far from equilibrium; and complexity carries it even farther, to the very edge of chaos. There is no longer one answer, not even a range of close guesses. Instead there are constraints and propensities, and that is what we should expect; that is the way we should now perceive the world. Life is neither predictable nor regular—neither perfectly random nor perfectly designed. As David Depew and Bruce Weber put it in their book Darwinism Evolving, "we now recognize that, in spite of what Einstein believed, God not only plays with dice, but the dice are loaded."