In the Beginning Was Information: Information in Living Organisms (Ch 6)
Posted on 04/02/2009 7:05:41 PM PDT by GodGunsGuts
Information in Living Organisms
Theorem 28: There is no known law of nature, no known process, and no known sequence of events which can cause information to originate by itself in matter...
(for remainder, click link below)
(Excerpt) Read more at answersingenesis.org ...
(stay tuned for Chapter 6)
That is, stay tuned for Chapter 7!
Where's your proof? If you don't have any proof, it's not a theorm.
==Where’s your proof? If you don’t have any proof, it’s not a theorm.
The proof is there is no *known* law of nature, and no *known* process, and no *known* sequence of events which can cause information to originate by itself in matter. If you are aware of information originating by itself in matter, please share it now.
Bottom line: It's a matter of faith. You can believe that God created the universe(s?) or you can have faith that the big bang happened by chance.
It's a crap shoot. Einstein was wrong, God does play dice.
Please define what you mean by information.
The word is theorem, which only has meaning in logic and mathematics. In particular, it requires mathematical/logical proof. Without the logic, or mathematical operations, there is no theorem.
The failure, or inability, to note or know something does not even amount to evidence, in and of itself.
Something always had to be there. It is either matter/energy or God. And since it is pretty clear that the Universe had a beginning, and that no known law of nature, process, or sequence of events can explain information originating by itself in matter, then isn’t it fair to say that information as the product of an eternal intelligence is a distinct possibility given the available evidence?
I only posted the final theorem. Did you read the logic that preceded it?
Paradoxically, there being no *known* law of nature, no *known* process, and no *known* sequence of events which can cause uranium decay to accelerate by many orders of magnitude with no apparent increase in heat output not only doesn't constitute proof, it's never even considered sufficient reason to assume probability.
What a pile of Bachelor Science.
Quantum is not some “magic” that defies the second law of thermodynamics.
Quantum, if anything, makes us EVEN LESS SURE about what we OBSERVE in the universe, due to the Heisenberg Uncertainty Principle and the apparent effect of mere observation on “quantum events”.
When observing quantum phenomena, as in the classic double-slit experiments for example, it is clear that simply observing an event has an “instant” effect on the object(s) being observed.
Since SCIENCE is based on observation of events (and assuming they operate the same way when not observed), this causes even more “whistling in the dark” from physicists.
Throw into that morass of bad juju the additional impact of Chaos Theory (which basically states there really ARE no Random events in the universe) and you begin to realize our current level of understanding of the Universe is just as flawed as when “science” did not believe that germs were the cause of disease, or that worms spontaneously generated from horsehair. Noted scientists of the time (read about Pasteur) had their reputations ruined by the Academy of Sciences for the crime of believing in pathogens and disbelieving in the formation of life from inanimate objects.
Information and order do not “grow” or “propagate” or “reproduce”. If so, we would be “growing” cars, books, houses, etc. Order does NOT come from Chaos or disorder, because there can be no step-wise process that “preserves” information that is partial, but not useful.
Take a look at DNA - it preserves information. Why don’t you - given EVERYTHING you have at technology’s disposal - nanotechnology, molecular engineering, robotics, etc - why don’t you design a simple life form that can reproduce itself forever, find its own food, and affect its own environment.
Quantum and Chaos Theory (albeit Chaos theory is poorly named) are very very interesting. It seems the universe cares what we think about it.
Name a single instance where it is untrue.
You posted no such thing. There are no theorms on that site. The statements are clearly not theorems, because they not only are illogical statements, they have no possibility of ever being proved. The use of the word theorm on htat site is fraud. "Did you read the logic that proceeded the same?"
There's no logic. There's simply strings of unsupported, illogical, or otherwise useless statements.
Learn what the word theorem means. The use of the word, as it's presented here and is being used on that site is fraud.
Nonsense. Quantum simply means discrete.
"Since SCIENCE is based on observation of events (and assuming they operate the same way when not observed), this causes even more whistling in the dark from physicists."
Ridiculous. This isn't even a complete thought. Also, there's no assumption that things operate the same way when they're not observed. It's a requirement that they do so. Otherwise A≠A !
"Chaos Theory (which basically states there really ARE no Random events in the universe)"
"you begin to realize our current level of understanding of the Universe is just as flawed as when science did not believe that germs were the cause of disease, or that worms spontaneously generated from horsehair"
More of the same.
False. The formation of stars and planets is one such process. A planetary system has more information (in a technical sense) than the dust particles that went into making it.
If there is no logic, go back to the logic behind each theorem, starting with theorem 1, and show me how each successive theorem does not follow from the logic presented therein.
Please define what you mean by information.
What exactly do you mean when you speak of information?
Theorem 1: The fundamental quantity information is a non-material (mental) entity. It is not a property of matter, so that purely material processes are fundamentally precluded as sources of information.
Theorem 2: Information only arises through an intentional, volitional act.
Theorem 3: Information comprises the nonmaterial foundation for all technological systems and for all works of art.
Etc, etc...read Chapters 1-6!
The definition used in the article you cited is adequate: basically, the information in a system is the number of bits needed to describe it.
I am asking how you define it.
The article says that information is carried by a material medium, but the information itself is non-material.
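For what it's worth, the bits-needed-to-describe-it definition can be sketched numerically. A minimal illustration using Shannon entropy (the function name entropy_bits is mine, not from the article):

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Average Shannon information, in bits per symbol, of a sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A 50/50 source needs a full bit per symbol; a lopsided one needs less.
print(entropy_bits("HTHTHTHT"))  # 1.0
print(entropy_bits("HHHHHHHT"))  # about 0.54
```

The point being: "bits" here measures statistical surprise only, which is exactly the limitation of Shannon's definition discussed downthread.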
Learn what theorem means. Here's a link.
"show me how each successive theorem does not follow from the logic presented therein."
Theorems require proof, not strings of illogical, or unsubstantiated, or unrelated statements.
Let's grab the first pile of text and look at it.
"It should now be clear that information, being a fundamental entity, cannot be a property of matter, and its origin cannot be explained in terms of material processes."
This erroneous statement follows from a string of similar claims. It is neither a conclusion, nor is it axiomatic. It is simply a false assertion.
"We therefore formulate the following fundamental theorem:"
Ridiculous! The word therefore is unwarranted, because no logical operations whatsoever were performed.
" Theorem 1: The fundamental quantity information is a non-material (mental) entity. It is not a property of matter, so that purely material processes are fundamentally precluded as sources of information."
The first sentence is a statement attaching a quantitative attribute to the word information. In other words, it's part of the definition of the word information. The second sentence fails on both counts: no complete definition of the word information has been given, so its first claim is indeterminate, and its conclusion is false because it relies on that indeterminate claim.
Thanks for the ping!
Have you by any chance been following this series? It touches on Shannon’s theory, and I think explains why Alex Williams feels like Shannon’s theory is relatively minor to the overall concept of information.
Information gets into tree rings and ice cores by natural means. Information also gets into DNA by natural means.
Argue with Wikipedia, then. See the article at http://en.wikipedia.org/wiki/Quantum_mechanics, which states (in the section on quantum mechanics and classical physics):
“Quantum mechanics provides probabilistic results because the physical universe is itself probabilistic rather than deterministic”.
No, Virginia, at the deepest level at which the universe operates there are no discrete, small measurable states . . . kind of makes it seem we are all living in a seamless version of “the Matrix”, right?
A pseudoscientific paper authored by an engineer.
Has he submitted this for peer-review?
How would you falsify his theorem?
In order for any hypothesis to have any type of scientific merit it must stand up to peer-review.
That is how science works.
Also, abiogenesis has nothing to do with the theory of evolution.
Evolutionary theory deals mainly with how life changed after its origin. Science does try to investigate how life started (e.g., whether or not it happened near a deep-sea vent, which organic molecules came first, etc.), but these considerations are not the central focus of evolutionary theory. Regardless of how life started, afterwards it branched and diversified, and most studies of evolution are focused on those processes.
Symbolic code is used to store and transmit information. In your model, what symbols are used? And how are those symbols generated?
The trail-blazing discoveries about the nature of energy in the 19th century caused the first technological revolution, when manual labor was replaced on a large scale by technological appliances: machines which could convert energy. In the same way, knowledge concerning the nature of information in our time initiated the second technological revolution, where mental labor is saved through the use of technological appliances, namely, data processing machines. The concept of information is not only of prime importance for informatics theories and communication techniques, but it is a fundamental quantity in such wide-ranging sciences as cybernetics, linguistics, biology, history, and theology. Many scientists, therefore, justly regard information as the third fundamental entity alongside matter and energy.
Claude E. Shannon was the first researcher who tried to define information mathematically. The theory based on his findings had the advantages that different methods of communication could be compared and that their performance could be evaluated. In addition, the introduction of the bit as a unit of information made it possible to describe the storage requirements of information quantitatively. The main disadvantage of Shannon's definition of information is that the actual contents and impact of messages were not investigated. Shannon's theory of information, which describes information from a statistical viewpoint only, is discussed fully in the appendix (chapter A1).
The true nature of information will be discussed in detail in the following chapters, and statements will be made about information and the laws of nature. After a thorough analysis of the information concept, it will be shown that the fundamental theorems can be applied to all technological and biological systems and also to all communication systems, including such diverse forms as the gyrations of bees and the message of the Bible. There is only one prerequisite, namely, that the information must be in coded form.
Since the concept of information is so complex that it cannot be defined in one statement (see Figure 12), we will proceed as follows: We will formulate various special theorems which will gradually reveal more about the nature of information, until we eventually arrive at a precise definition (compare chapter 5). Any repetitions found in the contents of some theorems (redundancy) are intentional, and the possibility of having various different formulations according to theorem N8 (paragraph 2.3) is also employed.
We have indicated that Shannon's definition of information encompasses only a very minor aspect of information. Several authors have repeatedly pointed out this defect, as the following quotations show:
Karl Steinbuch, a German information scientist [S11]: The classical theory of information can be compared to the statement that one kilogram of gold has the same value as one kilogram of sand.
Warren Weaver, an American information scientist [S7]: Two messages, one of which is heavily loaded with meaning and the other which is pure nonsense, can be exactly equivalent . . . as regards information.
Ernst von Weizsäcker [W3]: The reason for the uselessness of Shannon's theory in the different sciences is frankly that no science can limit itself to its syntactic level.
The essential aspect of each and every piece of information is its mental content, and not the number of letters used. If one disregards the contents, then Jean Cocteau's facetious remark is relevant: The greatest literary work of art is basically nothing but a scrambled alphabet.
At this stage we want to point out a fundamental fallacy that has already caused many misunderstandings and has led to seriously erroneous conclusions, namely the assumption that information is a material phenomenon. The philosophy of materialism is fundamentally predisposed to relegate information to the material domain, as is apparent from philosophical articles emanating from the former DDR (East Germany) [S8 for example]. Even so, the former East German scientist J. Peil [P2] writes: Even the biology based on a materialistic philosophy, which discarded all vitalistic and metaphysical components, did not readily accept the reduction of biology to physics. . . . Information is neither a physical nor a chemical principle like energy and matter, even though the latter are required as carriers.
Also, according to a frequently quoted statement by the American mathematician Norbert Wiener (1894–1964), information cannot be a physical entity [W5]: Information is information, neither matter nor energy. Any materialism which disregards this will not survive one day.
Werner Strombach, a German information scientist of Dortmund [S12], emphasizes the nonmaterial nature of information by defining it as an enfolding of order at the level of contemplative cognition.
The German biologist G. Osche [O3] sketches the unsuitability of Shannon's theory from a biological viewpoint, and also emphasizes the nonmaterial nature of information: While matter and energy are the concerns of physics, the description of biological phenomena typically involves information in a functional capacity. In cybernetics, the general information concept quantitatively expresses the information content of a given set of symbols by employing the probability distribution of all possible permutations of the symbols. But the information content of biological systems (genetic information) is concerned with its value and its functional meaning, and thus with the semantic aspect of information, with its quality.
Hans-Joachim Flechtner, a German cyberneticist, referred to the fact that information is of a mental nature, both because of its contents and because of the encoding process. This aspect is, however, frequently underrated [F3]: When a message is composed, it involves the coding of its mental content, but the message itself is not concerned about whether the contents are important or unimportant, valuable, useful, or meaningless. Only the recipient can evaluate the message after decoding it.
It should now be clear that information, being a fundamental entity, cannot be a property of matter, and its origin cannot be explained in terms of material processes. We therefore formulate the following fundamental theorem:
Theorem 1: The fundamental quantity information is a non-material (mental) entity. It is not a property of matter, so that purely material processes are fundamentally precluded as sources of information.
Please, I mean no disrespect, but I have an observation. You know so much about theorem but seem to have no idea how to correctly spell the word... “theorm, theorems, theorm” one out of three.. I can see the word “that” being spelled “htat” because of a missed keystroke, but to spell the single word that your point refers to wrong not once, but twice is strange... odd.
Are you informed on the subject you are attempting to instruct others about?
Information “got” into tree rings by a natural, rhythmic, cyclical process (the change of seasons). Random events did not “create” the information.
There is always a non-random process behind the creation of information.
You might be right, if you would say that the weather is a completely random event - but we know it's not. So therefore the “information” is not created; rather, the tree is just recording the environment - no more than a book “writes” or “creates” the words recorded in its pages.
Moreover, tree rings will “never” be decoded to provide proofs, or poems, or record any information or knowledge not related directly to the growth of the tree. Compare this to a book, that can contain any idea, thought, formula, or whatever. Clearly it was *not* just a process that produced the information content in the book.
Exactly, natural, and that's the same way information gets into DNA.
Evidence cannot render any statement proved. Conclusions follow from evidence; theorems do not. The use of the word theorem in these instances is fraud.
I love an evolutionist with a sense of humor...
Hmmm. I didn’t realize that this was such a hot issue for you. I never really knew what Alex Williams was referring to until I started reading Werner Gitt’s “In the Beginning was Information.” His thesis does not exclude Shannon’s mathematical theory of information. Rather, it merely stipulates that Shannon’s theory occupies the lowest level of information, namely statistics (as opposed to the highest level, meaning).
No. A theorem is a statement that is proven from a set of given axioms in a logical system. The proof is constructed by starting with the axioms of the system and using the permitted logical rules to derive “true” statements. Think of how you derived the mean value theorem in elementary calculus, or established the equality of alternate interior angles in geometry.
Now, back to your post, it’s not a theorem if it has no proof, as above. Without proof, it’s a conjecture. Unless you are assuming it to be axiomatic, which wouldn’t surprise me.
To improve - one must *know* or be *rewarded*. But small, random changes (like transposition errors or knockouts in DNA) do not by themselves produce any changes in the macroscopic organism that may or may not influence progeny.
This is why slow, gradual change has been discarded as theory for the somewhat more plausible punctuated equilibrium.
I like the classic million monkeys typing for a million years example some people throw out for the usefulness of random processes....except for one problem - suppose you get to nearly the end of the complete works of Shakespeare with all but the last word spelled correctly, one letter remaining.
What stops the first monkey from changing his letter? Nothing. Because no monkey “knows” he needs to keep the letter he got lucky with.
This shows the classic problem with statistical theory with regard to randomness. That is, ALL variables are INDEPENDENT. To hold some variables static - even for a moment in time - implies *something* is holding onto that state for a reason....
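The monkeys point can even be simulated. A minimal sketch (the target phrase and function names here are illustrative, not from any post above): with fully independent draws, a previously correct letter is re-typed like any other, so an almost-complete match is not preserved.

```python
import random

random.seed(1)
ALPHABET = "abcdefghijklmnopqrstuvwxyz "
TARGET = "methinks it is like a weasel"  # illustrative target phrase

def redraw(text):
    """One round of fully independent typing: every position is re-typed."""
    return "".join(random.choice(ALPHABET) for _ in text)

def matches(a, b):
    """Number of positions where the two strings agree."""
    return sum(x == y for x, y in zip(a, b))

# Start from an almost-correct string; nothing "holds" the correct letters.
almost = TARGET[:-1] + "x"
after = redraw(almost)
print(matches(almost, TARGET))  # 27 of 28 positions correct
print(matches(after, TARGET))   # falls back toward chance level
```

Without some mechanism that retains matched positions, every round starts from scratch - which is exactly the independence point being made.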
There is a recent TED talk (google TED talks) about the hundreds of people wrongly convicted on DNA evidence because of a lack of understanding about which variables are independent vs. dependent - essentially, they were quoting that it was “one in a billion” that the match was not correct, when in fact it was more like “one in one thousand.” People were convicted unjustly because reasonable doubt was destroyed.
All due to a lack of understanding of just the basic “what-ifs” in simple combinations of genetic markers.
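The independent-vs-dependent point can be illustrated with hypothetical numbers (the marker frequencies below are made up for the sketch, not taken from the TED talk): when two markers tend to co-occur, multiplying their individual frequencies wildly overstates the rarity of the combination.

```python
import random

random.seed(0)

# Hypothetical markers: B almost always co-occurs with A (same subgroup),
# so the two markers are NOT independent.
N = 1_000_000
n_a = n_b = n_both = 0
for _ in range(N):
    a = random.random() < 0.001            # marker A: about 1 in 1,000
    b = a or (random.random() < 0.0005)    # marker B rides along with A
    n_a += a
    n_b += b
    n_both += a and b

p_a, p_b, p_both = n_a / N, n_b / N, n_both / N
print(p_a * p_b)   # naive "independent" estimate: on the order of 1 in a million
print(p_both)      # actual joint frequency: about 1 in a thousand
```

The naive product understates the true joint frequency by a factor of several hundred here - the same shape of error as the one-in-a-billion vs. one-in-a-thousand discrepancy described above.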
In the strictest sense of theorem you are right. In the more colloquial use which you point out, it is a "theorem". But unless you can name one of the objects which does provide a means for "information" to arise spontaneously and can rigorously prove it, it seems entirely reasonable to accept the "theorem".
I said nothing about randomness, just natural. That’s all.
The theory is mathematics, plain and simple. Meaning of the message has no bearing on the communication of it. That is where the Shannon theory ends.
Meaning in the biological message goes to complex systems theory, another subject altogether, bringing in issues such as self-organizing complexity, cellular automata, algorithmic complexity, Kolmogorov complexity, etc. Ditto for autonomy and semiosis.
The Shannon theory is a powerful argument in the intelligent design debate - indeed, in many theological and philosophical debates as well.
If the correspondent ignores it, minimizes it or mixes other issues into it, he is hurting his own argument.
Because the mathematical theory is universal as it is, it is portable between many disciplines. It is well established.
It is like a Caterpillar in these debates, why would anyone want to use it like a little red wagon?
My comment and challenge to the use of the term was meant to shed some light on the paucity of rigor and absence of logical process in the original document.
Apparently, the word theorem has a slightly different definition with respect to science:
“There are also “theorems” in science, particularly physics, and in engineering, but they often have statements and proofs in which physical assumptions and intuition play an important role; the physical axioms on which such “theorems” are based are themselves falsifiable.”
When considering a book B, a computer program C, or the human genome (the totality of genes), we first discuss the following questions:
How many letters, numbers, and words make up the entire text?
How many single letters does the employed alphabet contain (e.g., a, b, c, . . . z, or G, C, A, T)?
How frequently do certain letters and words occur?
To answer these questions, it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Such investigations are not concerned with the contents, but only with statistical aspects. These topics all belong to the first and lowest level of information, namely the level of statistics.
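These statistical-level questions can be answered mechanically, with no reference to meaning whatsoever. A minimal sketch (letter_stats is an illustrative name, not from the book):

```python
from collections import Counter

def letter_stats(text):
    """Statistical-level questions only: letter count, alphabet size, frequencies."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    return len(letters), len(counts), counts.most_common(3)

meaningful = "to be or not to be that is the question"
nonsense = "qx zj vq wk pzt xq jzw kq vtp zqx wjk"

# Both strings are analyzed identically; meaning plays no role at this level.
print(letter_stats(meaningful))
print(letter_stats(nonsense))
```

Meaningful text and gibberish go through the same pipeline, which is the sense in which this level is "only statistics."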
As explained fully in appendix A1, Shannon's theory of information is suitable for describing the statistical aspects of information, e.g., those quantitative properties of languages which depend on frequencies. Nothing can be said about whether any given sequence of symbols is meaningful or not. The question of grammatical correctness is also completely excluded at this level. Conclusions:
Definition 1: According to Shannon's theory, any random sequence of symbols is regarded as information, without regard to its origin or whether it is meaningful or not.
Definition 2: The statistical information content of a sequence of symbols is a quantitative concept, measured in bits (binary digits).
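Definitions 1 and 2 can be illustrated with the standard formula for Shannon's measure: a symbol with probability p carries -log2(p) bits, and for independent symbols those bit counts add. (A textbook sketch, not code from the book.)

```python
import math

def info_content_bits(p):
    """Shannon information content, in bits, of a symbol with probability p."""
    return -math.log2(p)

# Rare symbols carry more bits than common ones.
print(info_content_bits(0.5))    # 1.0 bit
print(info_content_bits(0.125))  # 3.0 bits

# For independent symbols, the information contents add.
print(sum(info_content_bits(p) for p in [0.5, 0.5, 0.125]))  # 5.0 bits
```

Note that nothing in the formula looks at what the symbols say - only at how probable they are.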
According to Shannon's definition, the information content of a single message (which could be one symbol, one sign, one syllable, or a single word) is a measure of the probability of its being received correctly. Probabilities range from 0 to 1, so that this measure is always positive. The information content of a number of messages (signs, for example) is found by adding the individual information values, as required by the summability condition. An important property of information according to Shannon is:
Theorem 4: A message which has been subject to interference or noise, in general comprises more information than an error-free message.
This theorem follows from the larger number of possible alternatives in a distorted message, and Shannon states that the information content of a message increases with the number of symbols (see equation 6 in appendix A1). It is obvious that the actual information content cannot at all be described in such terms, as should be clear from the following example: When somebody uses many words to say practically nothing, this message is accorded a large information content because of the large number of letters used. If somebody else, who is really knowledgeable, concisely expresses the essentials, his message has a much lower information content.
Figure 12: The five aspects of information. A complete characterization of the information concept requires all five aspects: statistics, syntax, semantics, pragmatics, and apobetics, which are essential for both the sender and the recipient. Information originates as a language; it is first formulated, and then transmitted or stored. An agreed-upon alphabet comprising individual symbols (code) is used to compose words. Then the (meaningful) words are arranged in sentences according to the rules of the relevant grammar (syntax), to convey the intended meaning (semantics). It is obvious that the information concept also includes the expected/implemented action (pragmatics), and the intended/achieved purpose (apobetics).
Some quotations concerning this aspect of information are as follows: French President Charles De Gaulle (1890–1970): The Ten Commandments are so concise and plainly intelligible because they were compiled without first having a commission of inquiry. Another philosopher said: There are about 35 million laws on earth to validate the ten commandments. A certain representative in the American Congress concluded: The Lord's Prayer consists of 56 words, and the Ten Commandments contain 297 words. The Declaration of Independence contains 300 words, but the recently published ordinance about the price of coal comprises no fewer than 26,911 words.
Theorem 5: Shannon's definition of information exclusively concerns the statistical properties of sequences of symbols; meaning is completely ignored.
It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon's information theory is required to significantly evaluate information and information processing in both living and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the following greatly extended description and definition, where real information is concerned, Shannon's theory is only useful for describing the statistical level (see chapter 5).
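Theorem 5 is easy to demonstrate numerically: two strings with identical letter statistics receive exactly the same Shannon measure, whether or not one of them means anything. (A minimal sketch; total_bits is an illustrative name, not from the book.)

```python
import math
from collections import Counter

def total_bits(text):
    """Total Shannon information of a string: bits per symbol times length."""
    n = len(text)
    counts = Counter(text)
    per_symbol = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return per_symbol * n

meaningful = "the cat sat"
scrambled = "tta hs tace"  # the same letters shuffled into nonsense

# Shannon's measure cannot tell them apart.
print(total_bits(meaningful))
print(total_bits(scrambled))
```

This is the sense in which the statistical level, by itself, says nothing about semantics.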