I had a blog entry to post last week involving the 50th anniversary of Dr. Martin Luther King's "I Have a Dream" speech.
But I couldn't get it right. Sigh. I'll get it one of these days. Moving on.
One of the hardest concepts to grasp in evolution is its parallelism and interactivity. A mutation in one organism doesn't necessarily just affect the organism itself. It can affect its neighbors, predators, prey and its descendants.
A good example is feathers.
Last year a team of Canadian, Japanese and American paleontologists announced the discovery of feathers on a newly discovered Ornithomimus specimen. (See here.) The discovery pushed the appearance of feathers back quite a ways, long before birds appeared and certainly long before feathers were used in any sort of flight. O. edmontonicus was flightless and weighed about 350 pounds. It had no flying ancestors to speak of. Consequently, the evolution of feathers had to have pre-dated flight and served other purposes. Two proposed uses for feathers are thermoregulation and social displays.
That is for the organism itself. Anybody who works with birds knows a few other uses. Birds use feathers to protect themselves from the elements-- especially aquatic birds. They use them for brooding eggs. In addition, birds have lice that love the protection and insulation of feathers.
Feathers affect predation by changing the physical appearance of an animal-- a feathered animal can appear much larger and more massive than it is. Predators have to adapt to the tactile difference between feathers, skin and (for mammals) fur. If feathers evolved in conjunction with warm-bloodedness, the resulting organism scales differently in size, both maximum and minimum, and in speed as well. All of which has to be adapted to by predators or exploited by prey. Nothing happens in a vacuum. This branch of biology is called evolutionary ecology.
If you consider a population of animals, each given a unique combination of genes and developmental environment, each plays out a single thread of possibilities unique to that organism. The possibilities play out in real time and yield a statistical outcome: differential reproductive success for a given subset of the original population.
This is essentially a computational problem. If you take a set of different starting conditions and apply a computational algorithm to each of them, some will have a better solution set at the end than others. This is the basis for evolutionary computation, a subfield of computational intelligence.
Evolutionary computation operates by continuously optimizing the result using Darwinian selection methods. An evolutionary algorithm uses computational equivalents of reproduction, mutation, recombination and selection. "Fitness" is determined by how closely the outcome matches the solution criteria. Each "generation" is tested and the members that best fit the criteria are selected for the next.
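To make that concrete, here's a toy sketch of the loop an evolutionary algorithm runs. The bit-string target, the population size and the mutation rate are all made up for illustration-- real systems use far richer representations and fitness functions.

```python
# A minimal sketch of an evolutionary algorithm (no particular library's API):
# a toy population of bit strings evolves toward a made-up target.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]          # hypothetical "solution criteria"

def fitness(genome):
    # How closely does this genome match the target?
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]

for generation in range(50):
    # Selection: keep the half of the population that best fits the target.
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    # Reproduction with recombination and mutation fills the next generation.
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(10)]

best = max(population, key=fitness)
print(best, fitness(best))
```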
This can work both ways. Certainly there are algorithms that can be derived from evolution we might find useful. But can we view evolution itself as a computational process?
"Evolution" is an emergent property that derives from the lives of individual organisms-- how they cooperate, compete, eat and be eaten. We only see the process of evolution as it plays out over time. Each organism plays out the problem if its own survival. Evolution only emerges as a function of the reproduction of those individuals.
There is such a thing as DNA computing: using the chemistry of DNA to solve computational problems. Caltech researchers have managed to use DNA to implement a circuit that can compute square roots of numbers up to fifteen. This article talks about multicellular computation networks. This article talks about proteins as computational units within the cell. And this one talks about computation using biochemical reactions.
Lee Segel has written this on computing in a slime mold. He modeled it as a set of small automata that obey (relatively) simple rules. This looks to me like a step in the right direction. If a model of an organism is composed of computational units, can the model of the organism be considered a computational unit? And, by extension, can the organism itself be considered computational? That would make evolution an emergent computational property.
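For flavor, here's a much-simplified sketch in that spirit-- definitely not Segel's actual model-- where each agent is a tiny automaton following one rule: move toward the strongest nearby signal and deposit a little more of it.

```python
# Hedged toy version of "small automata obeying simple rules": agents on a
# one-dimensional ring follow a made-up chemical signal and reinforce it.
import random

WIDTH = 20
signal = [random.random() for _ in range(WIDTH)]       # hypothetical chemical field
agents = [random.randrange(WIDTH) for _ in range(10)]  # agent positions

def step(pos):
    # Simple rule: look left, here, and right; move toward the strongest signal.
    neighbors = [(pos - 1) % WIDTH, pos, (pos + 1) % WIDTH]
    return max(neighbors, key=lambda p: signal[p])

for _ in range(30):
    agents = [step(pos) for pos in agents]
    # Agents deposit a little signal where they sit, reinforcing aggregation.
    for pos in agents:
        signal[pos] += 0.1

print(sorted(agents))  # most agents end up clustered around signal peaks
```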
So what is computation, anyway? And why would this be important?
Computation is the process of following an algorithm and obtaining a result-- transcription of DNA and copying your homework are both acts of computation in the most general sense. Computation is a physical process. That is, it is the product of physics and happens in the physical world. (A good article on the physical limits of computation is here.) The computational machines we normally use are made of silicon and push around electrons. My favorite computational machine, the one between my ears, is made of neurons and runs largely on Twinkies. (Also called a wetware computer or, sometimes, a brain.)
One type of computational entity is an automaton, an abstract machine. These are mathematical objects that can solve computational problems. One kind is a finite state machine, where a given machine is always in exactly one of a finite set of possible internal states. A vending machine is a good example. It might be in a money-accepting state, a product-selection state or a product-delivery state. You put your money in, you select the product, and the product is delivered. The machine cannot be in more than one state at a time, and the capabilities of a given state are specific to that state.
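Here's how that vending machine might look as a few lines of code-- a sketch of the idea, not any real machine's firmware. Note that each method only does anything if the machine is in the right state.

```python
# A sketch of the vending machine as a finite state machine: it is always in
# exactly one state, and what it can do depends entirely on that state.
class VendingMachine:
    def __init__(self):
        self.state = "accepting_money"

    def insert_money(self):
        if self.state == "accepting_money":
            self.state = "selecting_product"

    def select_product(self):
        if self.state == "selecting_product":
            self.state = "delivering_product"

    def deliver(self):
        if self.state == "delivering_product":
            self.state = "accepting_money"   # reset for the next customer
            return "product"
        return None

machine = VendingMachine()
machine.insert_money()
machine.select_product()
print(machine.deliver())   # -> "product"
```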
There's been a fair amount of research applying automata theory to biology. (See here and here.) How, then, to apply it to evolution?
The problem is that evolution and biology are complex statistical systems: a single solution, or even a single set of solutions, is not the goal. In addition, any but the most trivial biological systems are massively parallel. There's even a branch of biology for this: complex systems biology. There have been a number of interesting outcomes from this area. Wojciech Borkowski has proposed using cellular automata to model macroevolution-- the macro processes that must be emergent and don't derive simply from genes and individual populations. There has even been some talk about another branch of automata theory, infinite automata theory, being applied to biology. (See here.) While the possible states of a biological system are very, very large, they are probably finite. But they may be large enough that they can be modeled as an infinite automaton.
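As an illustration of the cellular automaton idea (and only the idea-- this is not Borkowski's model), here is a one-dimensional automaton where each cell updates from purely local rules, yet global structure emerges:

```python
# A toy one-dimensional cellular automaton: each cell's next value depends
# only on itself and its two neighbors, applied via a classic elementary rule.
RULE = 110  # any value 0-255 defines a different local rule

def next_row(row):
    new = []
    for i in range(len(row)):
        left, center, right = row[i - 1], row[i], row[(i + 1) % len(row)]
        index = (left << 2) | (center << 1) | right
        new.append((RULE >> index) & 1)
    return new

row = [0] * 40 + [1] + [0] * 40   # start with a single live cell
for _ in range(20):
    print("".join("#" if c else "." for c in row))
    row = next_row(row)
```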
But I got to thinking. Hm. A computational entity that is incredibly complex, massively parallel and whose outcome is always statistical. That sounds familiar...
Oh, yeah. It's a quantum computer.
And, when I looked, sure enough the late I. C. Baianu was looking into quantum automata (and here) and evolution. (See here.)
Now, I am not saying biological systems are Bose-Einstein condensates or entangled. I am saying there are enough similarities between how the systems behave that the math from one might actually apply to the other. I think Baianu was onto something.
Quantum computers represent a problem as all possible states in such a way that, when the measurement event occurs, a set of possible answers to the problem (each with some probability of being correct) emerges.
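Here's a sketch of just the measurement part of that picture, with made-up amplitudes and no actual quantum hardware or library behind it: every candidate answer coexists in the state, and repeated measurements give a statistical spread of outcomes.

```python
# Classical simulation of the measurement idea only: a state holds amplitudes
# for every possible answer, and a measurement samples one answer with
# probability proportional to the squared amplitude. Amplitudes are made up.
import random

amplitudes = {"00": 0.1, "01": 0.2, "10": 0.9, "11": 0.39}   # hypothetical state

def measure(state):
    probs = {k: v * v for k, v in state.items()}
    total = sum(probs.values())
    r = random.uniform(0, total)
    for outcome, p in probs.items():
        r -= p
        if r <= 0:
            return outcome
    return outcome   # guard against floating-point leftovers

# Repeated measurements give a statistical picture: "10" dominates,
# but the other answers still turn up with some probability.
counts = {}
for _ in range(1000):
    outcome = measure(amplitudes)
    counts[outcome] = counts.get(outcome, 0) + 1
print(counts)
```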
Evolution is like that, too. Wherever a niche opens up, a population of organisms tries to take advantage of it-- consider that the initial problem state-- each trying its own unique approach. Approaches blend, compete and cooperate. At some later time, each path has reached a point of observation.
The difference is that while a quantum computer might function nearly instantaneously, evolution's solution is splayed out over millions of years.
Think of it as "real" time.
Sunday, September 8, 2013