In investigating the subject a bit further, I discovered some things you and the others here might also find quite interesting. Chief among these is that there is a formulation for functional complexity in complex systems, including biological systems. The evolutionary biologists, however, seem to question the applicability of the term to their domain and have offered yet another type of complexity: physical complexity. Heylighen's article (1996) did not use the term, which was evidently coined by Adami. What follows is an excerpt from an article by Adami introducing the concept and the results of modeling the theory:
Darwinian evolution is a simple yet powerful process that requires only a population of reproducing organisms in which each offspring has the potential for a heritable variation from its parent. This principle governs evolution in the natural world, and has gracefully produced organisms of vast complexity. Still, whether or not complexity increases through evolution has become a contentious issue. Gould (1), for example, argues that any recognizable trend can be explained by the "drunkard's walk" model, where "progress" is due simply to a fixed boundary condition. McShea (2) investigates trends in the evolution of certain types of structural and functional complexity, and finds some evidence of a trend but nothing conclusive. In fact, he concludes that "something may be increasing. But is it complexity?" Bennett (3), on the other hand, resolves the issue by fiat, defining complexity as "that which increases when self-organizing systems organize themselves." Of course, to address this issue, complexity needs to be both defined and measurable.
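Gould's "drunkard's walk" argument can be made concrete with a toy simulation (a minimal sketch, not Gould's own model): lineages take unbiased random steps in "complexity," but a reflecting lower boundary (a minimum viable complexity) makes the mean and the maximum drift upward even though no step is biased toward progress.

```python
import random

def drunkards_walk(n_lineages=1000, n_steps=500, floor=1.0, seed=42):
    """Unbiased random walk in 'complexity' with a reflecting lower boundary.

    Each lineage steps +1 or -1 with equal probability; steps that would
    cross the floor are clipped back to it. Any apparent 'trend' toward
    higher complexity comes from the boundary alone, not from a bias.
    """
    rng = random.Random(seed)
    lineages = [floor] * n_lineages
    for _ in range(n_steps):
        for i in range(n_lineages):
            step = rng.choice((-1, 1))
            lineages[i] = max(floor, lineages[i] + step)
    return lineages

walk = drunkards_walk()
mean_c = sum(walk) / len(walk)
# The mean ends well above the floor even though every step was unbiased.
```

This is exactly the "fixed boundary condition" point: remove the floor and the mean stays put.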
In this paper, we skirt the issue of structural and functional complexity by examining genomic complexity. It is tempting to believe that genomic complexity is mirrored in functional complexity and vice versa. Such an hypothesis, however, hinges upon both the aforementioned ambiguous definition of complexity and the obvious difficulty of matching genes with function. Several developments allow us to bring a new perspective to this old problem. On the one hand, genomic complexity can be defined in a consistent information-theoretic manner [the "physical" complexity (4)], which appears to encompass intuitive notions of complexity used in the analysis of genomic structure and organization (5). On the other hand, it has been shown that evolution can be observed in an artificial medium (6, 7), providing a unique glimpse at universal aspects of the evolutionary process in a computational world. In this system, the symbolic sequences subject to evolution are computer programs that have the ability to self-replicate via the execution of their own code. In this respect, they are computational analogs of catalytically active RNA sequences that serve as the templates of their own reproduction. In populations of such sequences that adapt to their world (inside of a computer's memory), noisy self-replication coupled with finite resources and an information-rich environment leads to a growth in sequence length as the digital organisms incorporate more and more information about their environment into their genome. Evolution in an information-poor landscape, on the contrary, leads to selection for replication only, and a shrinking genome size as in the experiments of Spiegelman and colleagues (8). These populations allow us to observe the growth of physical complexity explicitly, and also to distinguish distinct evolutionary pressures acting on the genome and analyze them in a mathematical framework.
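The contrast between growth in an information-rich environment and shrinkage in an information-poor one (the Spiegelman result) can be sketched in a toy simulation. This is a hypothetical illustration, not the Avida-style system the paper uses: genomes are bit lists, a "rich" environment rewards bits matching a long environmental target, and a "poor" environment rewards nothing but fast replication (short genomes).

```python
import random

def evolve(rich, pop_size=50, generations=300, start_len=20, seed=1):
    """Toy digital-evolution sketch (illustrative only).

    Rich environment: fitness rewards bits matching an 80-bit 'target',
    so genomes that lengthen to capture more of it are favored.
    Poor environment: fitness rewards short genomes (faster replication),
    as in the Spiegelman experiments cited in the text.
    """
    rng = random.Random(seed)
    target = [rng.randint(0, 1) for _ in range(80)]

    def fitness(g):
        if rich:
            return 1.0 + sum(1 for a, b in zip(g, target) if a == b)
        return 1.0 / (1.0 + len(g))  # shorter replicates faster

    def replicate(g):
        child = [b if rng.random() > 0.01 else 1 - b for b in g]  # point noise
        if rng.random() < 0.2:            # occasional insertion at the end
            child.append(rng.randint(0, 1))
        if child and rng.random() < 0.2:  # occasional deletion at the end
            child.pop()
        return child

    pop = [[rng.randint(0, 1) for _ in range(start_len)] for _ in range(pop_size)]
    for _ in range(generations):
        new = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)     # binary tournament selection
            new.append(replicate(a if fitness(a) >= fitness(b) else b))
        pop = new
    return sum(len(g) for g in pop) / pop_size

rich_len = evolve(rich=True)
poor_len = evolve(rich=False)
# Genomes grow only where there is environmental information worth
# incorporating; otherwise selection for replication speed shrinks them.
```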
If an organism's complexity is a reflection of the physical complexity of its genome (as we assume here), the latter is of prime importance in evolutionary theory. Physical complexity, roughly speaking, reflects the number of base pairs in a sequence that are functional. As is well known, equating genomic complexity with genome length in base pairs gives rise to a conundrum (known as the C-value paradox) because large variations in genomic complexity (in particular in eukaryotes) seem to bear little relation to the differences in organismic complexity (9). The C-value paradox is partly resolved by recognizing that not all of DNA is functional: that there is a neutral fraction that can vary from species to species. If we were able to monitor the non-neutral fraction, it is likely that a significant increase in this fraction could be observed throughout at least the early course of evolution. For the later period, in particular the later Phanerozoic Era, it is unlikely that the growth in complexity of genomes is due solely to innovations in which genes with novel functions arise de novo. Indeed, most of the enzyme activity classes in mammals, for example, are already present in prokaryotes (10). Rather, gene duplication events leading to repetitive DNA and subsequent diversification (11), as well as the evolution of gene regulation patterns, appear to be a more likely scenario for this stage. Still, we believe that the Maxwell Demon mechanism described below is at work during all phases of evolution and provides the driving force toward ever increasing complexity in the natural world.
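Adami's physical complexity of a sequence is, roughly, its length minus the summed per-site entropy measured across an adapted population: sites fixed by selection carry low entropy and count as functional information, while freely varying (neutral) sites count for little. A minimal sketch, assuming a population of aligned, equal-length sequences over a binary alphabet (so entropies are in bits):

```python
from collections import Counter
from math import log2

def physical_complexity(population):
    """Estimate Adami-style physical complexity: sequence length minus the
    sum of per-site entropies across an aligned population. Low-entropy
    (conserved) sites contribute nearly a full unit of complexity;
    high-entropy (neutral) sites contribute almost nothing.
    """
    length = len(population[0])
    n = len(population)
    total_entropy = 0.0
    for site in range(length):
        counts = Counter(seq[site] for seq in population)
        total_entropy += -sum((c / n) * log2(c / n) for c in counts.values())
    return length - total_entropy

# Four aligned binary 'genomes': the first three sites are fixed by
# selection, the last site varies freely (neutral).
pop = ["0010", "0011", "0010", "0011"]
c = physical_complexity(pop)   # 4 - (0 + 0 + 0 + 1) = 3.0 bits
```

For a DNA alphabet one would take logarithms base 4 instead, so the result comes out in mers rather than bits.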
Conclusions: The physical complexity of those regions that do not code for the structural changes increases independently of structural complexity. There is no correlation between the fitness of collectives and division of labor (possible causes will be discussed later). On the other hand, there is a correlation between the size of colonies and fitness. Thus, it seems that there is no additional increase of physical complexity outside the regions that code for structural changes.
Complexity is ... [the abstract notion of complexity has been captured in many different ways. Most, if not all, of these are related to each other, and they fall into two classes of definitions]:
1) ...the (minimal) amount of time it takes to create the system.
2) ...what a system does by virtue of its relationship to its environment that it would not do by itself: e.g. its function.
3) ...the act or process of becoming an emergent system.
According to this view, when we think about emergence we are, in our mind's eye, moving between different vantage points. We see the trees and the forest at the same time. We see the way the trees and the forest are related to each other. To see in both these views we have to be able to see details, but also ignore details. The trick is to know which of the many details we see in the trees are important to know when we see the forest.
In conventional views the observer considers either the trees or the forest. Those who consider the trees consider the details to be essential and do not see the patterns that arise when considering trees in the context of the forest. Those who consider the forest do not see the details. When one can shift back and forth between seeing the trees and the forest one also sees which aspects of the trees are relevant to the description of the forest. Understanding this relationship in general is the study of emergence.
Given a system whose function we want to specify, for which the environmental (input) variables have a complexity of C(e), and the actions of the system have a complexity of C(a), then the complexity of specification of the function of the system is C(f) = C(a) × 2^C(e), since an action of complexity C(a) must be specified for each of the 2^C(e) distinguishable states of the environment.
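The counting behind this (Bar-Yam's function-complexity argument, as I read it) is simple: an environment of complexity C(e) bits has 2^C(e) distinguishable states, and the function must assign an action of complexity C(a) to each one. A quick numeric check:

```python
def function_complexity(c_env_bits, c_action_bits):
    """Bits needed to specify a system's function: one action
    specification (c_action_bits) for each of the 2**c_env_bits
    distinguishable environmental states.
    """
    return c_action_bits * 2 ** c_env_bits

# A system that reads a 10-bit environment and chooses one of 256
# actions (8 bits) needs up to 8 * 2**10 = 8192 bits of specification.
cf = function_complexity(10, 8)
```

Note how the environment term enters exponentially: modest environmental complexity makes exhaustive specification of function astronomically expensive.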
The term "irreducible complexity" is defined by Behe as: "...a single system composed of several well-matched, interacting parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning."
The key to understanding all the dialogue on the Chowder Society is communication. In ordinary conversation, and indeed in many of the articles from the science community at large, information and message are treated as pretty much the same thing. A database, a letter, a diagram, or DNA would all be considered information, using various symbolizations or languages for comprehension.
But to understand the Shannon definition, one must view information as a reduction of uncertainty in a receiver, the communication itself having been successfully completed. In molecular machines this is vital, and it can best be visualized by comparing a dead skin cell to a live one. The DNA, or message, is as good dead as alive. But the live skin cell successfully communicates with itself and with its environment.
The state changes that occur in the molecular machinery are evidence of the reduction of uncertainty in the receiver. This is akin to the state changes which Rocha indicates would be required in an RNA world (abiogenesis) for self-organizing complexity to begin, i.e., toggling between autonomous communication and communication with the environment.
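Shannon's sense of information as reduction of uncertainty in the receiver can be made concrete: the information gained is H(before) minus H(after), where H is the entropy of the receiver's uncertainty about the message. A minimal sketch:

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# The receiver is initially uncertain among 8 equally likely states: 3 bits.
before = entropy([1/8] * 8)
# A successful communication narrows it to 2 equally likely states: 1 bit.
after = entropy([1/2] * 2)
info_gained = before - after   # 3 - 1 = 2 bits of uncertainty removed
```

On this view a message that changes no state in the receiver, however elaborate its symbols, has conveyed no information at all.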
To the contrary, IMHO, the whole notion of happenstance is dying with a whimper because of the work of general mathematicians, information theorists, and physicists. Precious few of them have a perceptible metaphysical bias, but they keep coming back to theories which make an unintended theological statement: that biological systems did not arise by happenstance alone. This is much like the determination that there was a beginning, the most theological statement to come out of science.
The biosemiotics approach looks very interesting! Here's a link to Friedrich Salomon Rothschild, "protosemiotician":
Just found it, and it looks fascinating. I'll need some time to digest it. Maybe you might find it of interest, too?
Wonderful essay, Alamo-Girl! Thank you so very much!