Digital Complexity: What Computer Simulations Can’t Explain

A 2000 paper in PNAS by Adami, Ofria, and Collier, titled “Evolution of biological complexity,” uses a computer simulation called Avida to argue that genomic complexity is forced to increase over time. While the digital experiments are interesting, they demonstrate a pre-programmed optimization process, not the kind of creative power necessary to turn microbes into man.

What the Paper Claims

The authors claim to show that Darwinian evolution, acting like a filter, naturally drives an increase in complexity. They use digital “organisms”—self-replicating computer programs—and define complexity as the amount of information the program’s code stores about its fixed, artificial environment. In their simulation, programs that more efficiently perform pre-defined tasks (like simple logic operations) out-compete others. Over thousands of “generations,” the programs’ code becomes more refined for these tasks, which the authors measure as an increase in “physical complexity.”

Key Findings and Critical Analysis

The paper’s conclusions rest on a foundation of carefully chosen definitions and highly controlled, artificial conditions. A closer look at the authors’ own statements reveals why these results are not evidence for large-scale, creative evolution.

1. Sidestepping Real-World Complexity

Quote: “In this paper, we skirt the issue of structural and functional complexity by examining genomic complexity.”

What This Means: The researchers openly admit they are not studying the evolution of new structures (like limbs or eyes) or functions (like photosynthesis or flight). They are measuring a mathematical abstraction they call “genomic complexity,” which is essentially how much of a computer program’s code is non-random and useful for a task the programmers designed.

Analysis: This is the core issue. The story of microbes-to-man evolution is about the origin of new structures and functions, not the optimization of existing information. It’s one thing to show a computer program getting better at solving a math problem it was designed to solve; it’s another thing entirely to claim this process explains how a program could write itself and then build its own computer. The paper studies adaptation within a closed system, which is a real phenomenon, but it tells us nothing about the origin of the system itself or the creation of fundamentally new components.
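To make the abstraction concrete: Adami's "physical complexity" is commonly described as sequence length minus the summed per-site Shannon entropy of an adapted population. The sketch below is an illustrative toy implementation of that idea (binary alphabet, tiny population), not the paper's actual Avida code; the function name and example populations are invented for illustration.

```python
from collections import Counter
from math import log2

def physical_complexity(population):
    """Toy Adami-style measure: sequence length minus the summed
    per-site Shannon entropy (bits) across an aligned population.
    Sites fixed by selection have zero entropy and so count as
    'information about the environment'."""
    length = len(population[0])
    n = len(population)
    entropy = 0.0
    for i in range(length):
        counts = Counter(seq[i] for seq in population)
        entropy += -sum((c / n) * log2(c / n) for c in counts.values())
    return length - entropy

# A random population stores no information about any environment...
random_pop = ["0101", "1010", "0011", "1100"]
# ...while a selection-refined population is nearly uniform at every site.
refined_pop = ["0110", "0110", "0110", "0111"]

print(physical_complexity(random_pop))   # 0.0: every site maximally random
print(physical_complexity(refined_pop))  # close to the full sequence length
```

Note what the measure rewards: uniformity of the population at each site. Nothing in it distinguishes a new function from a more thoroughly fixed old one, which is precisely the gap the analysis above identifies.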

2. A Filter, Not a Creator

Quote: “As a consequence, only mutations that reduce the entropy are kept while mutations that increase it are purged. Because the mutations can be viewed as measurements, this is the classical behavior of the Maxwell Demon.”

What This Means: The paper describes natural selection as a “Maxwell’s Demon”—a theoretical sorting agent that separates fast and slow molecules to create order. Here, the algorithm “sorts” mutations. Beneficial mutations (those that make the program better at its pre-defined task) are kept, and harmful ones are discarded. This process reduces randomness (“entropy”) and makes the code more specified for its environment.

Analysis: A filter, or a sorting mechanism, can only work with what is already there. Selection here acts to fine-tune a program for a single, unchanging goal. It doesn’t create new goals or new abilities. Think of it like a spell-checker. It can fix typos in a manuscript and make it a “better,” more readable version of itself (reducing entropy), but it cannot write a new paragraph or a new chapter. Real-world evolution would require a mechanism that can write new chapters, not just edit existing sentences.
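The sorting behavior the paper attributes to selection can be sketched as a greedy filter over random mutations. This hypothetical toy model (the function name, bit-string genome, and fixed target are all invented for illustration, and the acceptance rule is a simple hill climb rather than Avida's actual population dynamics) shows why such a filter can only converge on a goal the programmer already wrote in:

```python
import random

def demon_filter(genome, target, steps=2000, seed=0):
    """Greedy 'Maxwell's Demon' selection: propose random one-bit
    mutations and keep only those that do not reduce the match to a
    FIXED, pre-defined target. Entropy-reducing mutations are kept;
    entropy-increasing ones are purged."""
    rng = random.Random(seed)
    fitness = lambda g: sum(a == b for a, b in zip(g, target))
    genome = list(genome)
    for _ in range(steps):
        i = rng.randrange(len(genome))
        mutant = genome[:]
        mutant[i] = '1' if mutant[i] == '0' else '0'
        if fitness(mutant) >= fitness(genome):  # the "sorting" step
            genome = mutant
    return ''.join(genome)

target = "1101001110101101"
print(demon_filter("0" * 16, target) == target)  # True: converges on the goal
```

Every bit of "order" in the final genome was already specified by `target`; the filter supplies no information of its own, it only transfers what the fitness function encodes.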

3. The Limits of a Fixed World

Quote: “In particular, we show that, in fixed environments, for organisms whose fitness depends only on their own sequence information, physical complexity must always increase.”

What This Means: The authors’ “law of increasing complexity” only works under very specific and artificial conditions: the environment must not change, and an organism’s success must depend only on its own code.

Analysis: This is the opposite of the real world. Organisms live in dynamic, changing environments where survival depends on complex interactions with other organisms and the ecosystem. Furthermore, as Michael Behe points out in his work, most adaptations observed in the lab, such as those in Richard Lenski's long-term E. coli experiments, result from breaking or degrading existing genes to gain a short-term advantage in a simplified environment. This study's model assumes a simple path to improvement, whereas real biological adaptation is often a trade-off that involves loss of function. The authors themselves note that their mechanism can fail in changing environments or with sexual reproduction—conditions that define the majority of life on Earth.
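The fragility of the fixed-environment assumption is easy to illustrate. In this hypothetical toy model (the bit-string "genomes" and targets are invented for illustration, not drawn from the paper), a genome perfectly refined for one fixed goal carries no guaranteed value once the goal shifts:

```python
def matches(genome, target):
    """Count positions where the genome agrees with the environment's target."""
    return sum(a == b for a, b in zip(genome, target))

old_target = "1111000011110000"
new_target = "0000111100001111"  # the "environment" changes

# A genome perfectly adapted to the old, fixed environment...
adapted = old_target
print(matches(adapted, old_target))  # 16: maximal fit under the old rules
print(matches(adapted, new_target))  # 0: the stored information is now worthless
```

The paper's "must always increase" result holds only while the target stands still; the moment it moves, previously accumulated "complexity" can become useless or harmful, which is why the authors restrict their claim to fixed environments.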

Why This Isn’t Evidence for Macroevolution

This study demonstrates computational optimization, not biological creation. The increase in “complexity” is simply the refinement of a program to better match a static, predefined fitness landscape. It does not show the origin of new information required for fundamentally new biological structures. For microbes to become people, evolution would need to invent new body plans, organs, and integrated biochemical systems from scratch. This paper offers no plausible mechanism for such creative power. Instead, it showcases a process of adaptation within narrow, pre-programmed limits.

Scientific Context

The claims made in this paper are part of a broader attempt to explain the origin of biological complexity through purely naturalistic mechanisms. However, critics like Steve Talbott have argued that digital evolution simulations are untethered from reality. The programmer defines the rules, the goals, and the very “physics” of the digital world, making it a poor analogy for the unguided processes it’s meant to simulate. Furthermore, as detailed by critics of Darwinian theory, a key challenge is providing a step-by-step, causally specific account of how complex systems could arise. Merely showing that an abstract value called “complexity” can increase in a computer model does not meet this challenge.

Bottom Line

This paper does not demonstrate that undirected mutation and natural selection can create the new information required to build complex life forms from simple ones. It shows that a sorting algorithm can optimize a computer program for a task in an artificial world, a finding that offers no support for the grand narrative of microbes-to-man evolution.


Referenced Paper

Evolution of biological complexity

Christoph Adami, Charles Ofria, and Travis C. Collier

Abstract:
To make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that, because natural selection forces genomes to behave as a natural “Maxwell Demon,” within a fixed environment, genomic complexity is forced to increase.