
To: FredZarguna; Alamo-Girl; TXnMA; Ha Ha Thats Very Logical; MHGinTN; marron; YHAOS; hosepipe; ...
I would point to my own brain, which like the trees and stars, is a part of nature.

Why do you believe it is "objectively false" that there is (as I suggested) "no known 'natural' cause of information?" If all of nature — the natural world — supposedly "supervenes on the physical" and/or material, how does such an intangible, immaterial, yet utterly necessary thing as information come about in the first place?

Is my inference correct that you regard your brain as a "natural" cause of information? Which would seem to entail the idea (if you are a materialist or physicalist) that mental processes — and mind itself — are caused by atomic activity in the brain? That mind is merely the epiphenomenon of physico-chemical brain activity, to which it reduces, and nothing more?

And yet as British biologist J. B. S. Haldane remarked (1927): “If my mental processes are determined wholly by the motions of atoms in my brain, I have no reason to suppose my beliefs are true . . . and hence I have no reason for supposing my brain to be composed of atoms.”

You wrote:

In information systems to which entropic proofs apply, computer networks (the network proper) is a closed system, whereas computers themselves generally are not (because they are connected to both sinks and sources of information.)

While your observation may be true, it completely overlooks a highly inconvenient question: Where did the information come from that was needed to build the computer and program its operations in the first place? Did the atoms of which it is composed do all this?

Such questions might strike you as simply dumb or tendentious. But really, all I'm doing by asking them is trying to figure out what your beliefs are....

Here's the situation as I see it:

No evolutionary theory can succeed without confronting the cell and the word. In each of the some 300 trillion cells in every human body, the words of life churn almost flawlessly through our flesh and nervous system at a speed that utterly dwarfs the data rates of all the world’s supercomputers. For example, just to assemble some 500 amino-acid units into each of the trillions of complex hemoglobin molecules that transfer oxygen from the lungs to bodily tissues takes a total of some 250 peta operations per second. (The word “peta” refers to the number ten to the 15th power — so this tiny process requires 250 × 10^15 operations.) — George Gilder, "Evolution and Me," 2006.

And that's just one process. An astronomical number (from our perspective) of other processes are also occurring simultaneously, all of which must act together to preserve an organic system in a living state. Are we to suppose that "clever atoms" are responsible for the organizational information indispensable for achieving this result?
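
To see how a number of that magnitude can arise, here is a rough back-of-envelope sketch in Python. The inputs (red-cell production rate, hemoglobin molecules per cell, residues per hemoglobin tetramer) are commonly cited physiological estimates, not figures taken from Gilder's article, so treat the output only as an order-of-magnitude check on the quoted 250 × 10^15 figure.

```python
# Rough order-of-magnitude check on the "250 peta operations per second" figure.
# The inputs are commonly cited physiological estimates, NOT values taken from
# Gilder's article; only the order of magnitude matters here.

rbc_per_second = 2.0e6         # red blood cells produced per second (approx.)
hemoglobin_per_rbc = 2.7e8     # hemoglobin molecules per red blood cell (approx.)
residues_per_hemoglobin = 574  # amino-acid residues in one hemoglobin tetramer
                               # (Gilder rounds this to "some 500")

# Count each amino-acid addition as one "operation".
ops_per_second = rbc_per_second * hemoglobin_per_rbc * residues_per_hemoglobin

print(f"~{ops_per_second:.1e} amino-acid assembly operations per second")
# Prints roughly 3.1e17, i.e. a few hundred peta-operations per second,
# the same order of magnitude as the quoted 250 x 10^15.
```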

We started out quibbling about whether Kahre's Law applies to "open" systems in nature, or only to "closed" ones. [Please allow me to correct a mistake: on reviewing an old article I wrote on this subject back in 2005, I realized that the "law" in question here isn't Kahre's, it's Ashby's.] You suggested this "law" of information theory applies only to closed systems, which I tend to associate with non-living systems in nature, though that may not be technically accurate.

It seems evident (to me anyway) that biology cannot be reduced simply to physics ("matter in its motions" as described by the physicochemical laws, given initial and boundary conditions), leading to "random" mutations whose fitness value will be "rewarded" or "punished" by the environment — since the genetic, algorithmic, and symbolic information content of living organisms is much greater than the information content of the physical laws. Biology "uses" physics and chemistry, but does not reduce to physics and chemistry. "More" is needed; and that "more" is information.

Chaitin pointed out that the laws of physics have very low information content, since their algorithmic complexity can be characterized by a computer program fewer than a thousand characters in length. In 2004, in a private communication to a colleague, Chaitin wrote: “My paper on physics was never published, only as an IBM report. In it I took: Newton’s laws, Maxwell’s laws, the Schrödinger equation, and Einstein’s field equations for curved spacetime near a black hole, and solved them numerically, giving ‘motion-picture’ solutions. The programs, which were written in an obsolete computer programming language APL2 at roughly the level of Mathematica, were all about half a page long, which is amazingly simple.”
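
To make Chaitin's observation concrete, here is a minimal sketch of the same kind of exercise: a numerical, "motion-picture" solution of Newton's inverse-square law for a body orbiting a fixed mass. This is a toy of my own in Python, not Chaitin's APL2 program, but it shows how a physical law plus an integrator fits comfortably in a dozen lines, only a few hundred characters.

```python
# A minimal "motion-picture" solution of Newton's inverse-square law for a
# body orbiting a fixed mass, illustrating how compactly a physical law plus
# an integrator can be written. (This is a toy sketch, not Chaitin's APL2.)

GM = 1.0                 # gravitational parameter, arbitrary units
x, y = 1.0, 0.0          # initial position
vx, vy = 0.0, 1.0        # initial velocity (a circular orbit when GM = 1)
dt = 0.001               # time step

for step in range(10001):
    r3 = (x * x + y * y) ** 1.5
    ax, ay = -GM * x / r3, -GM * y / r3     # Newton's a = -GM r / |r|^3
    vx, vy = vx + ax * dt, vy + ay * dt     # semi-implicit Euler update
    x, y = x + vx * dt, y + vy * dt
    if step % 2000 == 0:                    # print a few "frames"
        print(f"t = {step * dt:5.2f}   x = {x:+.3f}   y = {y:+.3f}")
```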

How does the complexity of living organisms increase if its main driver is the physicochemical laws, estimated to have an algorithmic complexity of only 10^3 bits? Certainly, the observed flow of environmental information is enormous, and tellingly, it is morphological information. But what is the source of the enormous environmental information flow?

Now Ashby’s Law (Ashby, 1962) states that “The variety of outputs of any deterministic physical system cannot be greater than the variety of inputs; the information of output cannot exceed the information already present in the input.” In accordance, Kahre’s “Law of Diminishing Information” reads: Compared to direct reception, an intermediary can only decrease the amount of information (Kahre, 2002, 14). Moreover, it is a widely held view nowadays that the chain of physical causes forms a closed circle. The hypothesis of the causal closure of the physical (Cameron, 2000, 244) maintains (roughly) “that for any event E that has a cause we can cite a physical cause, P, for its happening, and that citing P explains why E happened”. Therefore, not only Ashby’s and Kahre’s laws but the causal closure thesis is in conflict with the complexity measures found in physics and in biology. Now if the algorithmic complexity of one human brain is already around I_1 ~ 10^15–10^17 bits, the information paradox consists in the fact that the information content of physics is I(physics) ~ 10^3 bits while that of the whole living kingdom is ... I(biology) ~ 10^15–10^17 bits. Taking into account also that physics is hopelessly far from being able to cope with the task to govern even one human person’s biological activity of ~2 × 10^21 bits per second, it becomes clear that at present, modern cosmological models’ algorithmic complexity is much less than the above obtained complexity measures characterizing life. — A. Grandpierre, "Complexity Measures and Life’s Algorithmic Complexity," 2005.
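
For what it's worth, Ashby's and Kahre's statements have a formal cousin in what information theorists call the data processing inequality: in a chain X → Y → Z, the mutual information I(X;Z) can never exceed I(X;Y), so an intermediary can only lose information about the source. The Python sketch below is my own toy illustration of that inequality on a noisy binary channel; it is not a calculation from Grandpierre's paper.

```python
# Toy illustration of the data processing inequality, the formal cousin of
# Ashby's and Kahre's statements: a second noisy stage (the "intermediary")
# cannot increase the information a signal carries about its source.
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Estimate I(A;B) in bits from a list of (a, b) samples."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * math.log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

def flip(bit, p):
    """Binary symmetric channel: flip the bit with probability p."""
    return bit ^ (random.random() < p)

random.seed(0)
xs = [random.randint(0, 1) for _ in range(200_000)]   # source X
ys = [flip(x, 0.1) for x in xs]                        # first noisy stage:  X -> Y
zs = [flip(y, 0.1) for y in ys]                        # intermediary stage: Y -> Z

print(f"I(X;Y) = {mutual_information(list(zip(xs, ys))):.3f} bits")
print(f"I(X;Z) = {mutual_information(list(zip(xs, zs))):.3f} bits (never larger)")
```

Analytically, I(X;Y) = 1 - H(0.1) ≈ 0.53 bits and I(X;Z) = 1 - H(0.18) ≈ 0.32 bits; the empirical estimates land close to those values, and never the other way around.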

Lots of questions, my friend. Your thoughts?
152 posted on 08/05/2013 2:01:58 PM PDT by betty boop


To: betty boop

INFORMATION... where does it come from.. looking for the ultimate source..
The infinite question... a very sticky question like fly paper..

Even parsed with lawyerly tricks, it still sticks its head up to say “HI!”..
It's poison to hip-shot judgements and mealy-mouthed diversions..
It's like a toddler that says WHY?... and then WHY? and then WHY?...

It even defeats long screeds that make you forget what the original question WAS...
***


153 posted on 08/05/2013 7:33:45 PM PDT by hosepipe (This propaganda has been edited to include some fully orbed hyperbole..)

To: betty boop
Thank you so much for your outstanding essay-post, dearest sister in Christ!

The excerpts are also very informative!

There is no known origin for information [Shannon, successful communication] in the universe. Ditto for space/time, inertia, autonomy, etc.

Indeed, discussions of abiogenesis often arrive at the observation that a successful answer to the origin of information also answers the origin of life. (Pattee, Yockey, Rocha, et al.)

155 posted on 08/07/2013 8:41:13 PM PDT by Alamo-Girl
