Computer Model Confirms Need for Foresight in Network Evolution

The quest to explain how complex biological systems could arise from simpler predecessors is a central theme of evolutionary theory. A 2017 paper in Nature Communications by Tamar Friedlander and colleagues, “Evolution of new regulatory functions on biophysically realistic fitness landscapes,” is often presented as a significant step in this quest. The study uses a sophisticated computer model to simulate how a duplicated gene can specialize to perform a new function. The conclusion drawn by many is that this paper demonstrates a plausible, unguided pathway for the emergence of new information and complexity in biology. However, a careful analysis of the study’s methodology reveals the opposite. The model is not a testament to the creative power of unguided processes, but rather a powerful illustration of how goal-directed optimization works, inadvertently highlighting the necessity of foresight for building complex systems.

Modeling a Path to Specialization

The authors’ stated goal is to bridge the gap between abstract evolutionary models and the biophysical reality of molecular interactions. They set out to investigate one of the most fundamental proposed mechanisms for network growth: gene duplication followed by “subfunctionalization.” In this scenario, a gene that codes for a regulatory protein—a transcription factor (TF)—is duplicated. Initially, the two copies are identical. The evolutionary “problem” is to explain how one copy can diverge and specialize to perform a new, distinct regulatory task without losing the original, essential function along the way.

To explore this, the researchers constructed a detailed computational model. The key components were:

  • A System: Two identical TFs regulating two target genes, with two different external signals that can activate the TFs.
  • A Goal: The simulation’s “fittest” or optimal state was pre-defined by the researchers. This target state is a fully specialized system where TF1 responds only to signal 1 to regulate gene 1, and TF2 responds only to signal 2 to regulate gene 2.
  • A “Fitness Landscape”: The simulation was programmed to penalize any state that deviated from this pre-determined optimal outcome. The closer the simulated system got to the target, the higher its fitness score.
  • “Mutations”: The model introduced random changes to the TFs and their DNA binding sites at set rates, mimicking mutation.

The study’s contribution was to run this simulation under various conditions to see which “evolutionary pathways” were the most efficient and what timescales were involved in reaching the pre-programmed finish line.
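The structure of such a simulation can be sketched in a few lines of code. The following is a hypothetical toy abstraction for illustration only, not the authors' actual model: the interaction strengths, the quadratic penalty, and the greedy acceptance rule are all assumptions. What it shares with the study's setup is the essential point at issue: the target state and the fitness function rewarding proximity to it are written in before the first "mutation" ever occurs.

```python
import random

# Toy abstraction (hypothetical, not the paper's code): response[i][j] is
# the strength with which TF i responds to signal j / regulates gene j.
TARGET = [[1.0, 0.0],
          [0.0, 1.0]]  # fully specialized: TF1<->signal1/gene1, TF2<->signal2/gene2

def fitness(state):
    """Pre-programmed fitness: higher the closer the state sits to the
    researcher-defined target (squared-error penalty for any deviation)."""
    return -sum((state[i][j] - TARGET[i][j]) ** 2
                for i in range(2) for j in range(2))

def mutate(state, rng, step=0.1):
    """Randomly perturb one interaction strength, mimicking mutation."""
    new = [row[:] for row in state]
    i, j = rng.randrange(2), rng.randrange(2)
    new[i][j] = min(1.0, max(0.0, new[i][j] + rng.uniform(-step, step)))
    return new

def evolve(generations=20000, seed=0):
    rng = random.Random(seed)
    state = [[1.0, 1.0], [1.0, 1.0]]  # two identical, redundant TF copies
    for _ in range(generations):
        candidate = mutate(state, rng)
        if fitness(candidate) >= fitness(state):  # selection pulls toward TARGET
            state = candidate
    return state

final = evolve()
print(fitness(final))  # approaches 0 because the target was supplied in advance
```

Note that the "evolution" here cannot fail to trend toward specialization: every accepted step is, by construction, a step toward the destination encoded in `TARGET`.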

The Ghost in the Machine: Why the Model Isn’t Unguided

While the paper is an interesting exercise in computational modeling, it fails as a demonstration of unguided evolution for one critical reason: the entire process is guided by a predefined goal. The researchers built foresight directly into the machine.

First, the “fitness landscape” is an artificial construct that presupposes a target. In the real world, natural selection has no foresight. It cannot “select for” a complex, integrated system that will only become advantageous many mutations down the road. It can only act on the immediate function (or lack thereof) of a given state. In this model, however, the fitness function creates a constant pull towards a specific, complex, and functional endpoint that the researchers defined from the start. This is not an analogue for unguided evolution; it is an analogue for a GPS navigator guiding a car to a pre-programmed destination. The simulation’s success in reaching the target is a product of its design, not a discovery about nature.

Second, the model does not account for the origin of the necessary information. It begins with a fully functional, if redundant, regulatory system. It does not explain how the first TF and its ability to bind to specific DNA sequences and respond to signals came to be. Furthermore, it does not generate fundamentally new information. It simply modifies existing components to match a supplied template (the “optimal” state). This is akin to being given two identical copies of a key and a new lock, and then filing one key down until it fits the new lock. The process demonstrates modification, but it does not explain the origin of locks or keys.

Finally, the study powerfully illustrates how difficult the path to specialization is, even when it is guided. The authors find that the process is “usually slow” and can easily get stuck in dead ends, such as losing one copy of the gene entirely. The “fast” pathways they identify depend on fortuitous conditions, such as “promiscuity-promoting mutations” that temporarily relax specificity. In an unguided world without a target, these “pathways” would be nothing more than a random walk in a vast, non-functional sequence space, making the probability of stumbling upon a new, integrated function infinitesimally small.


A Blueprint for Engineering, Not Evolution

Viewed from a design perspective, the paper’s findings are not surprising; they are a perfect illustration of sound engineering principles. The model is a classic example of a goal-directed search algorithm. Engineers routinely use such computational methods to solve optimization problems: define a goal, set the constraints, and run a simulation to find the most efficient path to the solution.
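The recipe described above (define a goal, set the constraints, keep whatever moves closer) is the skeleton of every goal-directed search. A minimal sketch, using an arbitrary target string purely as illustration (the target, alphabet, and acceptance rule are all assumptions, not anything from the paper):

```python
import random

def goal_directed_search(target, alphabet, seed=0, max_steps=100000):
    """Generic goal-directed optimizer: random variation filtered by a
    score that is defined entirely in terms of a pre-specified target."""
    rng = random.Random(seed)
    def score(s):
        return sum(a == b for a, b in zip(s, target))  # the goal, baked in
    current = [rng.choice(alphabet) for _ in target]
    for step in range(max_steps):
        if score(current) == len(target):
            return "".join(current), step          # destination reached
        candidate = current[:]
        candidate[rng.randrange(len(target))] = rng.choice(alphabet)  # "mutation"
        if score(candidate) >= score(current):     # keep only non-regressive steps
            current = candidate
    return "".join(current), max_steps

result, steps = goal_directed_search("SPECIALIZED", "ABCDEFGHIJKLMNOPQRSTUVWXYZ")
```

The search succeeds quickly, but only because `score` measures distance to a target the programmer chose. Remove the target and the same loop is a blind random walk through sequence space, which is precisely the distinction the design perspective draws.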

The concept of “duplication and divergence” is a well-established engineering strategy. A designer will often copy a proven, working module and then customize it for a new, specific application. This is an efficient way to innovate because it preserves the core functionality of the original design while allowing for new features to be added. The foresight to see the end-goal—a new, specialized function—is what makes this strategy viable.

The “fast” and “slow” pathways identified in the paper can be understood as different engineering trade-offs. The “fast” pathway, which relies on temporary functional promiscuity, is like using a general-purpose, adaptable component as an intermediate step before locking in the final, high-precision design. This is not a random process, but a strategic choice that requires looking ahead.

Conclusion: Simulating Design is Not Demonstrating Unguided Evolution

Friedlander and colleagues have produced a valuable piece of computational modeling that explores the logical pathways of system modification. However, it is a profound misinterpretation to claim this work provides evidence for how unguided evolution could create new regulatory functions. By defining the optimal, specialized state in advance and programming the fitness landscape to reward any step toward that goal, the authors hard-coded foresight and intelligence into their simulation.

The model does not solve the problem of how biological complexity arises; it assumes the most difficult parts. It starts with a working system and provides it with a map to a more complex one. Ultimately, the paper shows that transforming a redundant system into a highly specialized, integrated one is a daunting challenge that becomes tractable only when a functional goal is specified from the outset. This does not point to the creative power of blind mutations and selection, but to the indispensable role of a guiding intelligence.
