
Is Randomness Really Random?
Dr. Robert A. Herrmann, Professor of Mathematics, U.S. Naval Academy | 2002 | Robert A. Herrmann, Ph.D.

Posted on 01/31/2003 11:43:00 AM PST by CalConservative

The Abuses of Randomness: My Almost Final Thoughts on this Subject

Robert A. Herrmann, Ph.D.


Physical science assumes that a string of symbols or images taken from a "language" is an accurate description of an actual physical event. Such correspondences between physical behaviors and the "strings of symbols or images" that represent them, in one form or another, are the absolute foundation of modern physical science. In all that follows, it is assumed that such correspondences are being used.

In [1, pp. 48-54] and under limited conditions, Bohm gives a descriptive explanation for Brownian motion and similar molecular behavior within liquids and gases. These explanations are also predictions that, under specific general conditions, such behavior is certain. [In what follows, I will use a few terms as they are defined or discussed in reference [3].] It is formally established in [4], and informally discussed in [3], that such descriptions must rationally follow from a science-community's logic-system and from the mathematical operator, called a consequence operator, that characterizes such logic-systems. As defined in [3], this means that all such behavior is intelligently designed, at least on the level of human intelligence.

Throughout these descriptions, Bohm continually utilizes another concept. He states that, within this limited context, individual molecular motion is "random or irregular." Bohm points out that this has led to the assertion that such individual behavior is lawless; it is neither sustained nor guided in any manner. Within the limited context employed by a specific science-community, the term "random," when used to discuss physical behavior, is rather subjective in character. It seems to mean that the behavior is "unpredictable," that it is not guided in any way describable using certain science-community logic-systems. Further, the behavior may seem to have no "purpose," where the notion of "purpose" is considerably philosophic in character.

There is a significant philosophic type of "randomness." This general or absolute randomness asserts that such behavior

. . . is not considered as being arbitrary and lawless relative to a certain limited and definite context, but rather as something that is so in all possible contexts [1, p. 63].

Since the word "all" is used here, any absolute verification of this statement is not possible except by induction, a method that cannot lead to absolute fact. Indeed, as shown in [3], this philosophic assertion is false if science-communities allow the language and logic-systems they use to be extended so that the feature termed "randomness" can be further investigated. Relative to human comprehension, and operationally, general randomness certainly means that there is no language, no theory, that will "ever" be able to predict the exact occurrence of an event. Thus, if this holds for one event, it applies to any other event in a finite sequence of events. Further, there could not be an exact relation that requires two or more events to be in a specific order, for then, in the context of "order," the behavior is neither arbitrary nor lawless. This lack of order leads to the union consequence operator notion, and this one general randomness requirement leads specifically to behavior that is guided by an intelligent agent. Further, using this weak consequence operator, single events can always be considered as produced by an intelligent agent. Of course, depending upon the theory used by a science-community, other stronger intelligent agents can also guide this "unordered" behavior. [For the definition of this weak consequence operator, see "Further Explanations Page 153" at www.serve.com/herrman/thebooks.htm]

As to the claim that randomness is an absolute and final feature of a theory, Bohm states:

. . . the assumption of the absolute and final character of any feature of our theories contradicts the basic spirit of the scientific method, which requires that every feature be subject to continual probing. . . . [1, p.132]

Thus, if a recognized scientific method is used, such as mathematical modeling, then adjoining to a science-community's logic-system further explanations for apparently lawless behavior, as it relates to a particular theory, does not violate the scientific method, according to Bohm and many others. The form of randomness that is restricted to a particular scientific language and theory I term theory-randomness. Theory-randomness can be subdivided into other categories as well.

This lawless theory-randomness is in direct conflict with the notion of determinism.

Mathematical determinism is defined as follows: Assume that the behavior of a natural-system is defined by a set of parameters expressed in a specific mathematical form. If you are given the exact expressions for a specific set of these parameters, then there exists a relation, implicit or explicit, between these expressions that allows one to predict the mathematical expressions for all of the remaining parameters.

But we also have a more general notion of determinism.

General determinism is defined as follows: Assume that the behavior of a natural-system is defined by a finite set A of characteristics taken from a language L. If you are given a second finite set B of characteristics contained in L, then there exists an implicit or explicit logic-system that allows one to predict the set B from A.

Technically, mathematical determinism is an example of general determinism. Further, by a special construction, both A and B can be considered as containing but one image. It is shown at this URL (www.serve.com/herrman/thebooks.htm) that for any A and any B such a logic-system always exists.
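
A minimal sketch of one such logic-system, written as a Tarski-style consequence operator, follows. It is a generic construction, not the one given at the URL above, and the example "characteristics" in it are hypothetical.

# A minimal sketch (not the construction at the URL above): for any finite
# sets of characteristics A and B taken from a language L, define an operator
# C by C(X) = X union B whenever A is a subset of X, and C(X) = X otherwise.
# C is extensive (X <= C(X)), monotone and idempotent, and B <= C(A), so some
# logic-system always "deduces" B from A.

from typing import Callable, FrozenSet


def make_consequence_operator(
    A: FrozenSet[str], B: FrozenSet[str]
) -> Callable[[FrozenSet[str]], FrozenSet[str]]:
    def C(X: FrozenSet[str]) -> FrozenSet[str]:
        return X | B if A <= X else X
    return C


if __name__ == "__main__":
    # Hypothetical characteristics, for illustration only.
    A = frozenset({"photon 3 reflected", "photon 4 scattered"})
    B = frozenset({"photon 7 reflected"})
    C = make_consequence_operator(A, B)

    assert A <= C(A) and B <= C(A)          # extensive, and B follows from A
    assert C(C(A)) == C(A)                  # idempotent
    assert C(A) <= C(A | {"extra datum"})   # monotone on this example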

Suppose that it's possible to conduct the following experiment with photons and a piece of flat glass. You place a photon detector that will "click" each time a photon is "reflected" from the glass at an angle of 45 degrees to the flat glass surface. You also count the photons, one at a time, as they leave a photon generator. You know the photon speed and can tell whether a specific generator-emitted photon has caused the detector to "click," indicating whether the generated photon is "reflected" or scattered within the glass. When there is no "click" for a generated photon, you write down a 0. But, when there is a "click" for an emitted photon, you write down a 1. On each of three days you conduct this experiment with 20 generated photons. This yields the following three lists of zeros and ones.

(a) 00100100101110011011

(b) 11000110001110000101

(c) 11110010100100010111

Studying these partial sequences of zeros and ones, it might appear that there is no mathematical expression that will deterministically generate partial sequences that "look" exactly like these. Indeed, one might conclude that they appear to be "randomly" selected. Suppose that these zeros and ones pass every statistical test for independent or individual "random" behavior. If, however, we add the zeros and ones in succession and form, at each step, the ratio of the running total to the number of zeros and ones added so far, we get the following partial sequences of rational numbers.

(a) 0/1, 0/2, 1/3, 1/4, 1/5, 2/6, 2/7, 2/8, 3/9, 3/10, 4/11, 5/12, 6/13, 6/14, 6/15, 7/16, 8/17, 8/18, 9/19, 10/20

(b) 1/1, 2/2, 2/3, 2/4, 2/5, 3/6, 4/7, 4/8, 4/9, 4/10, 5/11, 6/12, 7/13, 7/14, 7/15, 7/16, 7/17, 8/18, 8/19, 9/20

(c) 1/1, 2/2, 3/3, 4/4, 4/5, 4/6, 5/7, 5/8, 6/9, 6/10, 6/11, 7/12, 7/13, 7/14, 7/15, 8/16, 8/17, 9/18, 10/19, 11/20

Notice that under this addition law the last ratio in each case is equal to or nearly equal to 1/2. This does not mean that these ratios will stay "near to" 1/2 if I continue these experiments to, say, 30 generated photons. But statistical analysis seems to indicate that there is a high probability that if I continue these partial sequences "far enough," then the last term in the sequence will cluster more closely about the number 1/2 and stay "near to" this number as I continue adding more and more of the zeros and ones. Suppose that you use certain statistical tests which assert that this is a type of "randomness" for individual events and indicates indeterminate, unguided behavior. One might conclude that such sequences of zeros and ones cannot be deterministically generated; that is, with the language you use, you cannot predict the value of the next event, the 0 or 1, from your knowledge of the previous events and still maintain such "randomly" generated sequences of zeros and ones.
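
A short computational sketch reproduces these running ratios directly from the three recorded bit strings (the script and its names are illustrative only); the last entries printed are 10/20, 9/20 and 11/20.

# A small sketch: compute the running ratios, i.e. the cumulative count of 1s
# divided by the number of symbols read so far, for the three bit strings.

strings = {
    "a": "00100100101110011011",
    "b": "11000110001110000101",
    "c": "11110010100100010111",
}

for label, bits in strings.items():
    total = 0
    ratios = []
    for k, ch in enumerate(bits, start=1):
        total += int(ch)
        ratios.append(f"{total}/{k}")      # unreduced, as in the lists above
    print(label, ", ".join(ratios))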

In reference [7], one of the foremost statisticians, Mark Kac, presents what has been known for more than one hundred years. If you extend your language to include additional notions from basic mathematical analysis, then the claim that such sequences are not guided exactly is false. Take any real number x such that 0 < x < 1. Then consider the completely deterministic sequence {2x, 4x, 8x, 16x, . . . , (2^n)x, . . .}. Now consider the following additional rules. For each of these numbers, consider the fractional part. For example, suppose you took x = (1/2)^(1/2). Then we have the sequence {2x = 1 + .414, 4x = 2 + .828, 8x = 5 + .657, 16x = 11 + .314, . . .}. Now, to get the sequence of zeros and ones, you follow the deterministic rule that states: write down a 0 if the fractional part of the number is < .5 and write down a 1 if the fractional part is > or = .5. Hence, in this case, the sequence would look like {0,1,1,0, . . .}. There is a mathematical statement that says that there exists a vast "quantity" of irrational numbers x that, in this deterministic manner, will generate sequences of zeros and ones that cannot be distinguished from the photon-generated sequences. Indeed, they will pass every statistical test for independent, unguided or "random" individual event behavior. Further, these sequences will have the exact same "convergence" property as the photon-generated sequences. One can state as fact that it is rational to assume that such sequences depicting these photon events are intelligently designed and guided by an intelligent agent via such a deterministic expression.
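
A small sketch of this construction for x = (1/2)^(1/2), using exact integer arithmetic, follows. It relies on the fact that the fractional part of (2^n)x is at least 1/2 exactly when the integer part of (2^(n+1))x is odd, and that for this x, (2^(n+1))x = (2^(2n+1))^(1/2).

# A small sketch of Kac's doubling construction for x = sqrt(1/2):
#   frac(2^n * x) >= 1/2   exactly when   floor(2^(n+1) * x) is odd,
# and 2^(n+1) * sqrt(1/2) = sqrt(2^(2n+1)), so the test reduces to the parity
# of math.isqrt(2**(2*n + 1)).

from math import isqrt


def kac_bits(n_terms: int) -> list[int]:
    """Bits generated deterministically from x = sqrt(1/2) via the 2^n x rule."""
    return [isqrt(2 ** (2 * n + 1)) % 2 for n in range(1, n_terms + 1)]


if __name__ == "__main__":
    bits = kac_bits(1000)
    print(bits[:4])                  # [0, 1, 1, 0], matching {0,1,1,0,...}
    print(sum(bits[:20]), "/ 20")    # running ratio after 20 terms
    print(sum(bits) / len(bits))     # empirically close to 1/2, although the
                                     # normality of sqrt(1/2) is conjectural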

Because of this and other considerations Kac writes:

From the purely operational viewpoint, however, the concept of randomness is so elusive as to cease to be viable [7, p. 406].

Although my concluding statement above is fact, some physical scientists object to the use of this fact since, in my example, the displayed form for x may not be the correct irrational number that generates a photon-generated sequence. However, there is an absolute counter to this rejection. In the subject of Quantum Electrodynamics (QED), the basic interactions are produced by "virtual" photons. By their very definition, these assumed "physical" objects cannot be physically displayed in objective reality. They are used to mediate sequences of physical interactions within the microphysical world. Although QED predicts how gross matter will behave relative to these interactions, the interactions themselves cannot be physically displayed within objective reality. You can even claim that these "hidden" QED processes are purely imaginary in character and simply model what, in reality, are humanly incomprehensible processes. Hence, for the photon generator in the example, it is just as rationally correct to accept that there is such a rational deterministic process that generates the photon reflection behavior, although we might not be able to display some of its features in objective reality. These results are physical science results. One simply needs to acknowledge that, in a classical sense, a deterministic "physical" law of nature is at work here and that it guides the photon behavior.

I point out that there are hundreds of mathematical expressions that deterministically yield what many physical scientists maintain is physical behavior that is neither controlled nor rationally predicted. What is the true reason that many in the scientific community continue to expound their notions of "random" rather than to acknowledge that it's just as possible that such behavior is designed by an intelligently constructed process? One of the foremost builders of mathematical models states:

As to the inherent randomness of Nature, this appears to be as much a question of subjective psychology as it is a matter of physics and mathematics . . . [2, p. 405]

Indeed, the term "randomness," with but a vague definition, is used by many science-communities in order to present a psychological foundation for their philosophy. Their only recourse, when presented with the above deterministic account for what they claim is unguided lawless individual behavior, is to reject or ignore the additional mathematical language. They reject a basic process allowed by the scientific method. What happens is a complete ad hoc rejection of anything that might be deterministic in character (e.g. classical dynamics). Consider the following totally psychological and philosophic statement, which is typical of how this fact is handled by many science-communities.

. . . there is no place for true randomness in deterministic classical dynamics (although of course a complex classical system can exhibit behavior that is in practice indistinguishable from random) [9, p. 4].

Shortly, I'll show that it is absolute fact that the term "random" is used for philosophic reasons and can be eliminated from any scientific description of natural-system behavior. Note that, relative to an intelligent agent language, one can contend that all of these mathematically presented deterministic statements are intelligently designed, at least on the human level of intelligence, and that they guide exactly this behavior. But, even in the most general case where no such deterministic statement is found, it is shown in [3] and [5] that the following statement is still fact. It is rational to assume that all probabilistic natural-system behavior is intelligently designed and an intelligent agent decides upon the occurrence or non-occurrence of each physical event. Thus, an intelligent agent is associated with all claimed "random" behavior, and such behavior is neither lawless nor without guidance, when a science-community's theory is extended to include these new features.

Although not using a specific term for the notion, one of the first individuals to argue that some "randomness" appears to be language dependent is D. Bohm [1]. He uses the idea of different levels of chance behavior. At one level, the claimed randomness that is encapsulated by Born's probability distribution is considered as the final unexplained property of matter. But, on the other hand, Bohm has a form of "random" behavior at a lower level, so to speak, and what was previously thought to be lawless behavior at a previous level is now guided behavior [1, pp. 111-115]. [Please note that in [5] all such probability distributions are shown to be intelligently designed and an intelligent agent guides each occurrence of an event.]

Does an actual description of physical behavior require that the notion of "randomness" be mentioned in order to convey an accurate mental image of the behavior? On page 48 of [1], Bohm gives the usual explanation for what is known as Brownian motion. I will copy this explanation but remove any reference to "random" or "unregulated" behavior. The portions removed are indicated by the symbols [].

. . . we first note that, although each smoke particle is small, it still contains of the order of 10^8 atoms or more. Thus, when it is struck by a molecule of gas in which it is suspended, it will receive an impulse which causes it to change its velocity slightly. Now the gas molecules are moving quite rapidly (with velocity of the order of 10^4 cm/sec.), but because the smoke particle is much heavier than an atom, the result of its being struck by an individual atom will be a comparatively small change of velocity. Since it is being struck continually [] by the gas molecules, we expect to obtain a corresponding slow [] fluctuation in the speed of the small particle. The larger the particle, the less will be the fluctuation. Thus, some fluctuation in velocity will persist even for particles of macroscopic size (such as a chair), but its magnitude will be completely negligible. To obtain an appreciable effect we need to go to sub-microscopic bodies.

When the mean speed of the fluctuation for particles of a given size was calculated, it was found to agree with that observed, within experimental error. [] Later more direct evidence was found; for with modern techniques and apparatus it became possible to measure the velocities of individual atoms, and thus to show that they are really moving [] with the distribution of velocities predicted by the theory.

The operational notions of finding a statistical mean and of showing that the velocities satisfy a physical probability distribution are retained. Since one uses statistical tests to determine that a distribution is an appropriate way to predict such behavior via a probabilistic language, even at this additional depth of analysis the philosophic notion of unregulated or random behavior need not be included in the description. Indeed, a science-community that is really interested in presenting truth, rather than forcing its philosophy upon an individual, would replace any mention of unregulated behavior with a new term. I propose that the new scientifically verified technical term be mindom. [Pronounced mine'dum.]

Mindom. A noun or adjective that means natural-system behavior that is intelligently designed, and produced, sustained or guided by an intelligent agent via the theory of general intelligent design and that is either classified as (1) modeled by a probability model, or (2) composed of individual or group events that are considered as unregulated or unpredictable via other forms of scientific analysis, or (3) (prior to the twenty-first century) considered as random (i.e. classically random).

Hence, from this moment on one should write "mindom walk," "mindom quantum fluctuations," "mindom mutations" and the like.

Other terms that are abused by science-communities are "order" and "disorder." These are most often used in connection with various applications of the Second Law of Thermodynamics and with the configurations exhibited when entities are in "thermal" (i.e., energy) equilibrium. But these terms are entirely misleading since they don't actually refer to what is either a mathematical order or even an ordered array as humanly observed. The use of these terms is but another attempt to force upon us the philosophic notion of "randomness."

In a major textbook, in the section entitled "Entropy and Disorder," and relative to energy states, Tipler fills a box with gas at assumed thermal equilibrium, moves the box along a frictionless table, and then stops the box against a wall. The notion is of "ordered" energy that can do work, in the sense that the entire system, the box with the gas, is in an energy state that can do work until the box is stopped by the wall. But, once the box has stopped, the gas's internal energy is in such a state that it cannot do this same type of work.

This is the gas's internal thermal energy, which is related to temperature; it is random, nonordered energy. . . . The gas now has the same total energy, but now all of it is associated with the random motion of its molecules about its center of mass, which is now at rest. Thus the gas has become less ordered, or more disordered, and it has lost the ability to do work [10, p. 577].

In the usual case, the gas molecules satisfy a well-defined deterministic energy distribution function. Does the notion of non-ordered really have any meaning? Tipler tries to illustrate that this might refer to statistical probability for an isolated system of a few molecules. He separates the box into two sections. The "random" notion comes into play when he requires each gas particle to have an equally likely chance of moving into the "left" or "right" section. Under these conditions, Tipler claims that the probability for any one particular molecule of the 10 gas molecules to move into the left section is 1/2. The probability for all ten particles to move simultaneously into the left section is (1/2)^10. Hence, he claims that the molecules being spread between the sections is "disorder," while all ten being in one section is "order" [10, p. 583]. This is not really the case. Usually, what we need is a macroscopically large quantity of gas confined to a box. Then evidence indicates that, as time progresses for this isolated system, the point density of the gas tends towards a mathematically "simple" probability distribution function. The converse also appears to be the case. [The system also tends towards a similar energy probability distribution.] Relative to the statistical behavior of such gas molecules, Maxwell states after his equation 57:

We may therefore interpret the expression (57) as asserting that the density of a particular kind of gas at a given point is inversely proportional to an exponential function whose index is half the potential energy of a single molecule of the gas at that point, divided by the average kinetic energy corresponding to a variable of the system [8].

I do not agree that the resulting probability density function yields less "ordered" [disordered] behavior than would be the case if such a function did not exist. Obviously, as the isolated system tends towards thermal equilibrium, we have individual mindom behavior in the strong sense, since such a distribution function begins to reveal itself. The same can be said for the speed and energy distributions.
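
Tipler's two-section illustration can be checked with a short calculation; the sketch below is a generic one, not taken from [10]. With n molecules, each equally likely to be in either section, the number found in the left section is binomially distributed: the all-left configuration has probability (1/2)^n, near-even splits dominate, and as n grows the counts concentrate sharply around n/2, that is, around a simple, well-defined distribution.

# A small sketch: the number of molecules in the left section, when each of n
# molecules is independently and equally likely to be in either section, is
# binomially distributed.

from math import comb


def left_section_distribution(n: int) -> list[float]:
    """P(k molecules in the left section) for k = 0..n."""
    return [comb(n, k) / 2 ** n for k in range(n + 1)]


if __name__ == "__main__":
    p = left_section_distribution(10)
    print(p[10])   # 1/1024 ~ 0.000977: all ten molecules in the left section
    print(p[5])    # ~0.246: the even 5-5 split, the most probable single count
    # For large n the distribution concentrates sharply around n/2.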

I accept that, in this case, the notion of order or disorder is a philosophic notion that is but an attempt to force the philosophic notion of "random" upon the scientific world. Hence, a more appropriate term is necessary for what actually occurs when natural-systems behave in this way, such as when one applies the Second Law of Thermodynamics. I replace the term "ordered behavior" with the term complex behavior. I replace the term "disordered behavior" with a term that actually reflects how the density, speed and energy are distributed: I use the term simple behavior. In fact, I have used this approach previously as an actual measure of information. Of course, all such behavior is intelligently designed and produced, sustained or guided by an intelligent agent.

As I have pointed out many times, a great deal cannot be known about these processes due to the existence of ultranatural theories, ultranatural laws and ultranatural events. We can only have a vague comprehension as to what is actually happening. For example, if one assumes that all changes in physical parameters within our universe are discrete changes, it has been shown that in the Nonstandard Physical World such changes can be produced by "continuous changes" [6].

Always remember that the following can be rationally assumed. All natural-system behavior is, first, intelligently designed. Then such behavior is produced, sustained or guided by intelligent agents. I note that, since the designing intelligence has designed all of the intelligent agents, they are but manifestations of the designing intelligence.

Finally, I mention that the above results are products of the interpreted General Grand Unification (GGU) model. The interpreted GGU-model is so scientific in character that whenever an individual conducts a scientific experiment that verifies a statement, the experiment also verifies the interpreted GGU-model.

References

[1] Bohm, D. (1957) Causality and Chance in Modern Physics, Harper & Brothers, NY.

[2] Casti, J. (1989) Alternate Realities, John Wiley & Sons, NY

[3] Herrmann, R. A. (2002) Science Declares Our Universe IS Intelligently Designed, Xulon Press.

[4] Herrmann, R. A. (2001) Hyperfinite and standard unifications for physical theories, Internat. J. of Math. Math. Sci., 28(2):93-102. See also www.arXiv.org/abs/physics/0101009 or www.arXiv.org/abs/physics/0205073

[5] Herrmann, R. A. (2001) Ultralogics and probability models, Internat. J. of Math. Math. Sci., 27(5):321-325. Also see www.arXiv.org/abs/quant-ph/0112037

[6] Herrmann, R. A. (1989) Fractals and Ultrasmooth Microeffects, J. Math. Physics, 30(4), April: 805-808. Also see www.arXiv.org/abs/math.GM/9903082 pages 70-71.

[7] Kac, M. (1983) Marginalia. What is random?, American Scientist 71(4):405-406

[8] Maxwell, J. C. (1876) On Boltzmann's Theorem on the average distribution of energy in a system of material particles, Cambridge Phil. Soc. Trans. Vol. XII.

[9] Preskill, J. (1997) Course information for Physics 229. http://theory.caltech.edu/people/preskill/ph229/

[10] Tipler, P. A. (1990) Physics for Scientists and Engineers, Vol. 1, Worth Publishers, NY.


TOPICS: News/Current Events
KEYWORDS: creation; crevo; crevolist; evolution; mathematics
Maybe things aren't as random as the evolutionists would have us believe. How about intelligent design?
1 posted on 01/31/2003 11:43:00 AM PST by CalConservative

To: CalConservative
Great post, Cal.
2 posted on 01/31/2003 11:49:45 AM PST by RAT Patrol

To: CalConservative
Hey..your freeper page is a great resource.
3 posted on 01/31/2003 11:50:55 AM PST by RAT Patrol

To: CalConservative
Depends on your reference and scale
4 posted on 01/31/2003 11:50:57 AM PST by stuartcr

To: RAT Patrol
Hey..your freeper page is a great resource.

he, he - you can probably tell where my interests lie.

5 posted on 01/31/2003 11:59:08 AM PST by CalConservative

To: CalConservative
mindom..... intelligently designed by Prof. Herrmann.
6 posted on 01/31/2003 12:06:59 PM PST by stanz

To: PatrickHenry; Junior
ping
7 posted on 01/31/2003 12:08:25 PM PST by stanz

To: CalConservative
Thank you so much for the post! Very timely! We have been all over these issues at Intelligent Design and Creationism Just Aren't the Same
8 posted on 01/31/2003 12:11:19 PM PST by Alamo-Girl

Comment #9 Removed by Moderator

To: CalConservative
bump...
10 posted on 01/31/2003 12:13:33 PM PST by danneskjold

To: balrog666; Condorman; *crevo_list; donh; general_re; Godel; Gumlegs; Ichneumon; jennyp; ...
Ping.
11 posted on 01/31/2003 12:16:14 PM PST by Junior (Put tag line here =>)

To: CalConservative
Blah, blah, blah.
12 posted on 01/31/2003 1:08:44 PM PST by HassanBenSobar

To: Junior
It is rational to assume that all probabilistic natural-system behavior is intelligently designed and an intelligent agent decides upon the occurrence or non-occurrence of each physical event. Thus, an intelligent agent is associated with all claimed "random" behavior and such behavior is neither lawless nor without guidance, when a science-community's theory is extended to include these new features.
It's so easy for some people to get carried away. Why doesn't he limit himself to the evidence, by saying: "It is rational to assume that all probabilistic natural-system behavior is [struck out: intelligently designed] subject to observable patterns not yet understood [struck out: and an intelligent agent decides upon] which determine the occurrence or non-occurrence of each physical event."
13 posted on 01/31/2003 1:29:19 PM PST by PatrickHenry

To: PatrickHenry
Either that, or he's talking about Maxwell's demon.
14 posted on 01/31/2003 2:16:52 PM PST by HassanBenSobar

To: CalConservative
I'm not sure where the good doctor was intending to go with this theory, but it seems to go off the deep end into "everything is designed."

If everything is designed, then everything is following a script, and free will is a designed illusion (and even the recognition of the illusion is designed, and the conversation about the recognition of the illusion is designed, ad infinitum.)

So, what good is this idea? Oops, never mind, I was designed to ask that... ;)

15 posted on 01/31/2003 2:22:14 PM PST by forsnax5 (It's designers, all the way down)

To: CalConservative
Well, this explains what happens to my tools when the garage is closed at night.
16 posted on 01/31/2003 2:40:54 PM PST by Old Professer

To: CalConservative
Studying these partial sequences of zeros and ones, it might appear that there is no mathematical expression that will deterministically generate partial sequences that "look" exactly like these.

I stopped reading after this nonsense. Given any arbitrary but finite sequence of numbers, there is an infinite group of algorithms that can be constructed to produce it.

17 posted on 01/31/2003 2:56:03 PM PST by balrog666 (If you tell the truth you don't have to remember anything - Mark Twain)

To: CalConservative
I think it's widely acknowledged that "random" means "we can't predict it." I'm not sure how this fits into an ID/Evolution debate but I'll have to read the article later.
18 posted on 01/31/2003 3:18:25 PM PST by MattAMiller

To: CalConservative
Maybe things aren't as random as the evolutionists would have us believe.

Then again, maybe they are.

Do you have any evidence for your supposition?

I'm afraid Herrmann's contribution doesn't count, it's just armchair philosophizing, based on no evidence at all, just his own personal prejudice that "it can't be random, it can't, not even things that don't matter squat like Brownian motion, they can't be just random!"

That's an interesting viewpoint, but it hardly counts as evidence one way or the other, even when it is stretched out into a several-hundred-line long ramble.

Herrmann's mental state appears even more questionable when you read his book, which is available online. He says it's not published because people are "afraid" to publish it, but a more likely explanation seems to be that it's pure twaddle. For example:

What in the "world" is an ultraword? If you've read any of my older writings in this subject, then an ultraword is a more consistent term for what I've called previously a "superword." This fact, of course, doesn't explain what the term means. Indeed, I'm sure I can't explain its complete meaning since many of its properties are intuitive in character and, except in a negative comparative sense, have no corresponding Natural language properties.
Ooookay...

His book is a long-winded (and windy) advocate of his personal "MA-model", which boiled down to its essentials is a minor variation on the old philosophy class teaser, "if the world was created last Thursday, and so were you along with 'memories' and artifacts (driver's license, etc.) of a past which never existed and so on, how could you tell?"

Well you couldn't, of course, but even aside from the question of why God (or aliens, or whomever) would play such a trick on us, and aside from the philosophical question of whether it matters if you can't tell, the fact remains that the most straightforward (and therefore probably correct) explanation for our current situation and memories is that we really *have* been here for a while, we *were* born and grew up, a zillion shark teeth in the ocean floor means that there really were a lot of sharks in the past who dropped them, and so on.

And likewise for Herrmann's song and dance about randomness. Sure, some "Intelligence" might be steering each and every particle, but not in a way that can be distinguished from actual randomness, but then why bother? If the results are indistinguishable from randomness anyway, then a) why presume a driver in the first place, and b) why would they waste their time?

Note that if the "driver of randomness" were actually nudging things to produce preferred non-random results, this would be noticeable by statistical analysis. Since this is the case, why play around with dice and why not just take an active hand since you're going to get caught loading the dice anyway?

Herrmann is obviously exercising his philosophical preferences, not providing any actual support for what he'd like to believe. He's just indulging in the old philosophical refuge nicknamed "the God of the Gaps". He thinks he's being original, but he's following well-trod ground.

Finally, it's not like "randomness" isn't understood in most cases. Herrmann tries to paint it as something unknown and mysterious (and thus maybe God is hiding in it, making it work), but more often than not "randomness" is just the name applied to the large-scale statistical effects of *deterministic* processes. There's nothing mysterious about it -- if you cared to observe the processes up close, you could watch the small-scale effects at work, in their predictable manners.

How about intelligent design?

How about it?

19 posted on 01/31/2003 7:04:43 PM PST by Ichneumon

To: Doctor Stochastic
This-thread-is-right-up-your-alley-ping!
20 posted on 01/31/2003 7:19:02 PM PST by longshadow



