Free Republic

In the Beginning Was Information: Information in Living Organisms (Ch 6)
AiG ^ | April 2, 2009 | Dr. Werner Gitt

Posted on 04/02/2009 7:05:41 PM PDT by GodGunsGuts

Information in Living Organisms

Theorem 28: There is no known law of nature, no known process, and no known sequence of events which can cause information to originate by itself in matter...

(for remainder, click link below)

(Excerpt) Read more at answersingenesis.org ...


TOPICS: Constitution/Conservatism; Culture/Society; News/Current Events; Philosophy
KEYWORDS: aminoacids; code; creation; dna; evolution; genetic; genome; goodgodimnutz; information; intelligentdesign; proteins
To: spunkets

Touché.

I love an evolutionist with a sense of humor...


41 posted on 04/02/2009 8:51:46 PM PDT by Gordon Greene (www.fracturedrepublic.com - Jesus said, "I am THE way, THE truth and THE life." Any questions?)
[ Post Reply | Private Reply | To 37 | View Replies]

To: Alamo-Girl; betty boop; CottShop; AndrewC

Hmmm. I didn’t realize that this was such a hot issue for you. I never really knew what Alex Williams was referring to until I started reading Werner Gitt’s “In the Beginning was Information.” His thesis does not exclude Shannon’s mathematical theory of information. Rather, it merely stipulates that Shannon’s theory occupies the lowest level of information, namely statistics (as opposed to the highest level, meaning).


42 posted on 04/02/2009 8:52:34 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 39 | View Replies]

To: GodGunsGuts

No. A theorem is a statement that is proven from a set of given axioms in a logical system. The proof is constructed by starting with the axioms of the system and using the permitted logical rules to derive “true” statements. Think of how you derived the mean value theorem in elementary calculus, or established the equality of alternate interior angles in geometry.

Now, back to your post, it’s not a theorem if it has no proof, as above. Without proof, it’s a conjecture. Unless you are assuming it to be axiomatic, which wouldn’t surprise me.


43 posted on 04/02/2009 8:58:07 PM PDT by Buck W. (The President of the United States IS named Schickelgruber...)
[ Post Reply | Private Reply | To 5 | View Replies]

To: Moonman62
Weather is NOT random, so how does a *random* event create *new*, useful information? Again, there is a term for a method of converging on a mathematical approximation - it's called stepwise refinement.

To improve, one must *know* or *reward*. But small, random changes (like transposition errors or knockouts in DNA) do not by themselves produce changes in the macroscopic organism that may or may not influence progeny.

This is why slow, gradual change has been discarded as a theory in favor of the somewhat more plausible punctuated equilibrium.

I like the classic example some people throw out for the usefulness of random processes - a million monkeys typing for a million years....except for one problem: suppose you get nearly to the end of the complete works of Shakespeare with everything spelled correctly except for one letter in the last word.
What stops the first monkey from changing his letter? Nothing. Because no monkey “knows” he needs to keep the letter he got lucky with.

This shows the classic problem with statistical theory with regard to randomness: ALL variables are INDEPENDENT. To hold some variables static, even for a moment in time, implies *something* is holding onto that state for a reason....
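
A rough back-of-the-envelope sketch of that independence point, assuming a 27-character alphabet and roughly five million characters for the complete works (both numbers are assumptions, for illustration only):

# Sketch: probability that N independent, uniformly random keystrokes all
# match a target text at the same time. Nothing "holds" a correct letter,
# so every attempt starts from scratch.
import math

ALPHABET_SIZE = 27        # assumed: 26 letters plus a space
TEXT_LENGTH = 5_000_000   # assumed: rough character count of the complete works

# log10 of the per-attempt success probability, (1/27)^N
log10_p = TEXT_LENGTH * math.log10(1.0 / ALPHABET_SIZE)
print(f"P(all {TEXT_LENGTH:,} keystrokes correct at once) = 10^{log10_p:,.0f}")

# A text that is correct except for one letter does not help on the next
# attempt: with independent keystrokes, every position is re-randomized.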

There is a recent TED talk (google TED talks) about the hundreds of people wrongly convicted on DNA evidence because of a lack of understanding about which variables are independent versus dependent - essentially, they were quoting odds of “one in a billion” that the match was not correct, when in fact it was more like “one in a thousand.” People were convicted unjustly because reasonable doubt was destroyed.

All due to a lack of understanding of just the basic “what ifs” in simple combinations of genetic markers.
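
A minimal sketch with made-up marker frequencies, just to show how a bad independence assumption can turn something like “one in a thousand” into “one in a billion” (all numbers here are assumptions, for illustration only):

# Illustrative only: compare the "naive" match probability obtained by
# multiplying marker frequencies (assuming independence) with an assumed
# joint frequency in which the markers are correlated.
naive_marker_freqs = [0.1] * 9   # assumed: nine markers, each found in 10% of people

p_naive = 1.0
for f in naive_marker_freqs:
    p_naive *= f                 # independence => multiply: 0.1^9 = 1e-9

# If markers tend to co-occur (e.g., within related subpopulations), the true
# joint frequency can be far higher than the product of the marginals.
p_joint_assumed = 1e-3           # assumed joint frequency for the same profile

print(f"naive (independent) estimate : 1 in {1 / p_naive:,.0f}")
print(f"correlated (assumed) estimate: 1 in {1 / p_joint_assumed:,.0f}")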

44 posted on 04/02/2009 9:00:51 PM PDT by BereanBrain
[ Post Reply | Private Reply | To 38 | View Replies]

To: Buck W.; GodGunsGuts
Now, back to your post, it’s not a theorem if it has no proof, as above. Without proof, it’s a conjecture. Unless you are assuming it to be axiomatic, which wouldn’t surprise me.

In the strictest sense of theorem you are right. In the more colloquial use which you point out, it is a "theorem". But unless you can name one of the objects which does provide a means for "information" to arise spontaneously and can rigorously prove it, it seems entirely reasonable to accept the "theorem".

45 posted on 04/02/2009 9:12:04 PM PDT by AndrewC
[ Post Reply | Private Reply | To 43 | View Replies]

To: BereanBrain

I said nothing about randomness, just natural. That’s all.


46 posted on 04/02/2009 9:20:43 PM PDT by Moonman62 (The issue of whether cheap labor makes America great should have been settled by the Civil War.)
[ Post Reply | Private Reply | To 44 | View Replies]

To: GodGunsGuts; betty boop; CottShop; AndrewC
It is a hot issue with me as it would be with Yockey, Schneider and the others who have developed the field of "information theory and molecular biology" for real medical benefit in pharmaceutical and cancer research et al.

The theory is mathematics, plain and simple. Meaning of the message has no bearing on the communication of it. That is where the Shannon theory ends.

Meaning in the biological message goes to complex systems theory, another subject altogether, bringing in issues such as self-organizing complexity, cellular automata, algorithmic complexity, Kolmogorov complexity, etc. Ditto for autonomy and semiosis.

The Shannon theory is a powerful argument in the intelligent design debate - indeed, in many theological and philosophical debates as well.

If the correspondent ignores it, minimizes it or mixes other issues into it, he is hurting his own argument.

Because the mathematical theory is universal as it is, it is portable between many disciplines. It is well established.

It is like a Caterpillar in these debates; why would anyone want to use it like a little red wagon?

47 posted on 04/02/2009 9:22:58 PM PDT by Alamo-Girl
[ Post Reply | Private Reply | To 42 | View Replies]

To: AndrewC

My comment and challenge to the use of the term was meant to shed some light on the paucity of rigor and absence of logical process in the original document.


48 posted on 04/02/2009 9:24:15 PM PDT by Buck W. (The President of the United States IS named Schickelgruber...)
[ Post Reply | Private Reply | To 45 | View Replies]

To: AndrewC; Buck W.

Apparently, the word theorem has a slightly different definition with respect to science:

“There are also “theorems” in science, particularly physics, and in engineering, but they often have statements and proofs in which physical assumptions and intuition play an important role; the physical axioms on which such “theorems” are based are themselves falsifiable.”

http://en.wikipedia.org/wiki/Theorem#Theorems_in_logic


49 posted on 04/02/2009 9:29:14 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 45 | View Replies]

To: Alamo-Girl; betty boop; CottShop; AndrewC
Yes, I realize that now. Nor am I in any way disparaging the practical benefits of Shannon's theory. I'm merely pointing out that according to Dr. Gitt, and apparently Williams, the statistics of the message is actually the lowest level of information. From the article:

4.1 The Lowest Level of Information: Statistics

When considering a book B, a computer program C, or the human genome (the totality of genes), we first discuss the following questions:

–How many letters, numbers, and words make up the entire text?
–How many single letters does the employed alphabet contain (e.g., a, b, c . . . z, or G, C, A, T)?
–How frequently do certain letters and words occur?

To answer these questions, it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Such investigations are not concerned with the contents, but only with statistical aspects. These topics all belong to the first and lowest level of information, namely the level of statistics.

As explained fully in appendix A1, Shannon’s theory of information is suitable for describing the statistical aspects of information, e.g., those quantitative properties of languages which depend on frequencies. Nothing can be said about the meaningfulness or not of any given sequence of symbols. The question of grammatical correctness is also completely excluded at this level. Conclusions:

Definition 1: According to Shannon’s theory, any random sequence of symbols is regarded as information, without regard to its origin or whether it is meaningful or not.
Definition 2: The statistical information content of a sequence of symbols is a quantitative concept, measured in bits (binary digits).

According to Shannon’s definition, the information content of a single message (which could be one symbol, one sign, one syllable, or a single word) is a measure of the probability of its being received correctly. Probabilities range from 0 to 1, so that this measure is always positive. The information content of a number of messages (signs for example) is found by adding the individual probabilities as required by the condition of summability. An important property of information according to Shannon is:

Theorem 4: A message which has been subject to interference or “noise,” in general comprises more information than an error-free message.

This theorem follows from the larger number of possible alternatives in a distorted message, and Shannon states that the information content of a message increases with the number of symbols (see equation 6 in appendix A1). It is obvious that the actual information content cannot at all be described in such terms, as should be clear from the following example: When somebody uses many words to say practically nothing, this message is accorded a large information content because of the large number of letters used. If somebody else, who is really knowledgeable, concisely expresses the essentials, his message has a much lower information content.

Figure 12: The five aspects of information. A complete characterization of the information concept requires all five aspects—statistics, syntax, semantics, pragmatics, and apobetics, which are essential for both the sender and the recipient. Information originates as a language; it is first formulated, and then transmitted or stored. An agreed-upon alphabet comprising individual symbols (code), is used to compose words. Then the (meaningful) words are arranged in sentences according to the rules of the relevant grammar (syntax), to convey the intended meaning (semantics). It is obvious that the information concept also includes the expected/implemented action (pragmatics), and the intended/achieved purpose (apobetics).

Some quotations concerning this aspect of information are as follows: French President Charles De Gaulle (1890–1970), “The Ten Commandments are so concise and plainly intelligible because they were compiled without first having a commission of inquiry.” Another philosopher said, “There are about 35 million laws on earth to validate the ten commandments.” A certain representative in the American Congress concluded, “The Lord’s Prayer consists of 56 words, and the Ten Commandments contain 297 words. The Declaration of Independence contains 300 words, but the recently published ordinance about the price of coal comprises no fewer than 26,911 words.”

Theorem 5: Shannon’s definition of information exclusively concerns the statistical properties of sequences of symbols; meaning is completely ignored.

It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon’s information theory is required to significantly evaluate information and information processing in both living and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the following greatly extended description and definition, where real information is concerned, Shannon’s theory is only useful for describing the statistical level (see chapter 5).
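
A minimal sketch of that statistical level (not part of Gitt's text): it estimates symbol probabilities from the string itself, which is an assumption of convenience - Shannon's theory takes the probabilities as given - and shows why a long random string scores more bits than a short, meaningful one:

import math
import random
from collections import Counter

def shannon_bits(text: str) -> float:
    # Total Shannon self-information of `text`, in bits: each symbol s with
    # estimated probability p(s) contributes -log2 p(s).
    counts = Counter(text)
    total = len(text)
    return sum(-math.log2(counts[s] / total) for s in text)

meaningful = "to be or not to be"
random.seed(0)
noisy = "".join(random.choice("abcdefghijklmnopqrstuvwxyz ") for _ in range(40))

print(f"meaningful line : {shannon_bits(meaningful):.1f} bits")
print(f"40 random chars : {shannon_bits(noisy):.1f} bits")
# The random string carries more "information" in Shannon's purely statistical
# sense even though it means nothing - which is the point of Theorem 5 above.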

50 posted on 04/02/2009 9:35:19 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 47 | View Replies]

To: Buck W.; GodGunsGuts
My comment and challenge to the use of the term was meant to shed some light on the paucity of rigor and absence of logical process in the original document.

I believe the target audience of the book is the semi-technical populace, not a purely technical one. And it is not written as a textbook. People do have opinions. You seem to be chafed that it was not written as a proof. Okay, so ignore it.

51 posted on 04/02/2009 9:42:36 PM PDT by AndrewC
[ Post Reply | Private Reply | To 48 | View Replies]

To: AndrewC; Buck W.

Personally, I find the book fascinating. And as I probe a little deeper into the meaning of the word theorem, it is becoming clear that there are multiple definitions, depending on the discipline involved. And given the Wikipedia definition of a theorem with respect to science, it seems to me that Dr. Gitt’s use of the word is appropriate to the field of knowledge he is pursuing.


52 posted on 04/02/2009 9:49:41 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 51 | View Replies]

To: GodGunsGuts
"Apparently, the word theorem has a slightly different definition with respect to science:

"“There are also “theorems” in science, particularly physics, and in engineering, but they often have statements and proofs in which physical assumptions and intuition play an important role; the physical axioms on which such “theorems” are based are themselves falsifiable.”"

This is wrong. Where did it come from? You gave a link to Wikipedia, but the quote is not contained there. The following quote is given there, and it's correct.

"Theorems in mathematics and theories in science are fundamentally different in their epistemology. A scientific theory cannot be proven; its key attribute is that it is falsifiable, that is, it makes predictions about the natural world that are testable by experiments."

Notice that the words theory and theorem are two different words.

53 posted on 04/02/2009 10:55:41 PM PDT by spunkets
[ Post Reply | Private Reply | To 49 | View Replies]

To: spunkets

It’s there, you need to read a little further down.


54 posted on 04/02/2009 11:05:37 PM PDT by GodGunsGuts
[ Post Reply | Private Reply | To 53 | View Replies]

To: GodGunsGuts

That describes bits pretty well.


55 posted on 04/02/2009 11:15:19 PM PDT by AZLiberty (I hope Obama changes.)
[ Post Reply | Private Reply | To 26 | View Replies]

To: GodGunsGuts
OK, I found it. The statement is still wrong. There are no theorems in science that are based on evidence, or on physical axioms, as the person who penned that indicated. Noether's theorems are two examples of theorems that are held in science (physics), but the important thing to note is that they are theorems because the statement of each is proved with mathematics. No evidence or physical axioms are required, or ever used, to prove any theorem relevant to physics. The evidence is used to indicate applicability.

It's the applicability of any theorem to modeling reality that requires evidence. That evidence is never part of determining whether or not some statement is a theorem.

Note that the "no hair theorem" is still a theorem, but it was shown not to be an accurate model of reality and is thus only an element of mathematics, not science (physics). The no hair theorem said, in summary, that black holes mask their contents.

56 posted on 04/02/2009 11:36:46 PM PDT by spunkets
[ Post Reply | Private Reply | To 54 | View Replies]

To: spunkets
Bad link... Noether's theorems.
57 posted on 04/02/2009 11:39:07 PM PDT by spunkets
[ Post Reply | Private Reply | To 56 | View Replies]

To: spunkets

If you are so adamant then you should easily be able to prove it untrue. You can’t.


58 posted on 04/03/2009 5:20:01 AM PDT by Blood of Tyrants (Socialism is the belief that most people are better off if everyone was equally poor and miserable.)
[ Post Reply | Private Reply | To 16 | View Replies]

To: GodGunsGuts
Looks like the evos “successfully” shifted the discussion to the definition of theorem, rather than the essential point of whether information can originate by itself in matter.

Oh, and it only took 32 posts for someone to attack the author's “worthiness”.

GGG, count it as a victory.

59 posted on 04/03/2009 5:34:02 AM PDT by Cedric
[ Post Reply | Private Reply | To 54 | View Replies]

To: spunkets

“Also, there’s no assumption that things operate the same way when they’re not observed. It’s a requirement that they do so. Otherwise A ≠ A!”

Not going to get into the theology, and I generally agree with your assessment of the site, but your above statement is factually incorrect. I’m guessing you haven’t studied QM much. The reality is that it is provable (and has been proven) that at the QM level (and, in some cases, if set up properly, at the macro level), A(observed) does NOT equal A(unobserved). The double slit experiment is the classical (pun intended) experiment that shows this. It’s very counter-intuitive stuff. Also google for the double slit experiment using starlight - a great example of the distemporal nature of this QM effect. PS It has been a while since I took my graduate level QM classes and I don’t use this stuff much anymore, so apologies for any inaccuracies in terminology.


60 posted on 04/03/2009 6:13:04 AM PDT by piytar (Obama = Mugabe wannabe. Wake up America.)
[ Post Reply | Private Reply | To 17 | View Replies]


