
Numerical Models, Integrated Circuits and Global Warming Theory
American Thinker ^ | February 28, 2007 | Jerome J. Schmitt

Posted on 02/28/2007 8:25:29 AM PST by Tolik

Jerome Schmitt is president of NanoEngineering Corporation, and has worked in the process equipment and instrument engineering industries for nearly 25 years.

Global warming theory is a prediction based on complex mathematical models developed to explain the dynamics of the atmosphere. These models must account for a myriad of factors, and the resultant equations are so complex they cannot be solved explicitly or "analytically" but rather their solutions must be approximated "numerically" with computers.  The mathematics of global warming should not be compared with the explicit calculus used, for example, by Edmund Halley to calculate the orbit of his eponymous comet and predict its return 76 years later.

Although based on scientific "first principles", complex numerical models inevitably require simplifications, judgment calls, and correction factors.  These subjective measures may be entirely acceptable so long as the model matches the available data -- acceptable because the model is not intended to be internally consistent with all the laws of physics and chemistry, but rather to serve as an expedient means to anticipate behavior of the system in the future. However, problems can arise when R&D funding mechanisms inevitably "reward" exaggerated and alarming claims for the accuracy and implications of these models. 

Many other scientific fields besides climatology use similar models, based on the same or related laws of nature, to explain and predict what will happen in other complex systems.  Most famously, the US Department of Energy's nuclear labs use supercomputer simulations to help design atomic weapons. Most of this work is secret, but we know, of course, that the models are "checked" occasionally with underground test explosions. The experimental method remains an essential tool even here.

A much better analogue to climate science is found in the semiconductor industry. Integrated circuits and many other building blocks of modern electronics are manufactured by creating artificial atmospheres or "climates" within which chemical vapor deposition (CVD) forms nanometer-scale thin solid films on silicon wafer surfaces. In CVD, metal vapor precursors entrained in carrier gases are used to deposit metal films on surfaces in a condensation process not unlike the formation of dew or frost on a lawn.  In such CVD processes, premature formation of metal particles is unwanted and must be controlled and prevented; such particle formation is akin to the precipitation of raindrops in the atmosphere.

The semiconductor process industry uses numerical models to predict the behavior of gases and vapors in order to deposit these substances on substrates, and thereby manufacture integrated circuits. I am not a climatologist or meteorologist, but I have studied fluid mechanics and gas dynamics and have a general understanding of the computer models used in process engineering.  Such models are used to analyze industrial processes with which I am familiar.  Indeed, the mathematics underlying such models is quite general.  And industry's experience with numerical process models sheds light on their strengths and limitations.

Andrew Grove, PhD, is a giant in the history of semiconductors. A founder of Intel, Grove famously presided as CEO over its enormous growth during the 1980s and 1990s. Few realize that his academic training was in chemical engineering, not electrical engineering.  Chemical engineering is at the heart of what Intel and other semiconductor manufacturers accomplish.

Process Models: Vapor deposition

Let's consider how these process engineering mathematical models are actually used in industry.  Intel and its competitors (as well as their key suppliers) employ many chemical engineers who are familiar with such process models, some of whom specialize solely in mathematical modeling.   Often a new technical challenge will emerge in which a process must be changed (such as for scale-up to accommodate larger silicon wafers) or adjusted to accommodate a new material composition. 

Almost all semiconductor manufacturing processes occur in closed vessels.  This permits the engineers to control the input chemicals (gases) and the pressure, temperature, etc. with a high degree of precision and reliability.  Closed systems are also much easier to model than systems open to the atmosphere (that should tell us something already).  Computer models are used to inform the engineering team as they design the shape, temperature ramp, flow rates, and so on (i.e. the thermodynamics) of the new reactor.

Nonetheless, despite the fact that 1) the chemical reactions are highly studied, 2) there exists extensive experience with similar reactors, much of it recorded in the open literature, 3) the input gases and materials are of high and known purity, and 4) the process is controlled with incredible precision, the predictions of the models are often wrong, requiring that the reactor be adjusted empirically to produce the desired product with quality and reliability.

The fact that these artificial "climates" are closed systems far simpler than the global climate, enjoy the advantage of the experimental method, and are subject to precise controls, and yet their models are frequently wrong, should lend some humility to those who make grand predictions about the future of the earth's atmosphere.

So serious are the problems, sometimes, that it is not unheard of for an experimental reactor to be scrapped entirely in favor of starting from scratch in designing the process and equipment. Often a design adjustment predicted to improve performance actually does the opposite.  This does not mean that process models are useless, for they undergird the engineer's understanding of what is happening in the process and help him or her make adjustments to fix the problem.  But it means that they cannot be relied upon by themselves to predict results. These new adjustments and related information are then used to improve the models for future use in a step by step process tested time and again against experimental reality. 

In actuality, the semiconductor industry is well familiar with the limits of process modeling and would never make a decision to purchase equipment or adjust their manufacturing processes based on predictions derived from models alone.  They would rightly expect extensive experimental data to support such a decision in order to assure the ability to reliably and economically manufacture high quality materials and devices. 

Climate Models

As with all fluid mechanics models, the flow field of a climate model (i.e. the entire atmosphere) is divided into a three-dimensional grid of small volume elements designated by latitude, longitude and altitude. Each volume element of the grid is then characterized by parameters such as pressure, temperature, wind velocity, etc., and by equations that relate these factors.  Air and energy that leave one volume element enter the adjacent one.  Summed across all volume elements, the model keeps track of the flows of air and energy through the entire atmosphere.  Many factors must be accounted for (see the list below).  Boundary conditions must be set: in this case, the boundary of the atmosphere is the land or ocean surface on the bottom and some boundary in space on the top; these boundaries impose rules (e.g. air cannot flow into the surface of the earth).  Then initial conditions must be set: the grid's equations are "populated" with the known values of the parameters characterizing the atmosphere, such as the pressure, temperature, and humidity profiles measured today.

Finally, the computer calculation can commence: a unit of time (a second, a minute, a day) is assumed to pass, and the computer calculates the next "state" of the model from the initial conditions, the boundary conditions and the other equations of the model.  This process is repeated again and again, with each new state serving as the initial condition for the next step, until, say, 100 years have passed.
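To make the procedure concrete, here is a minimal sketch (in Python) of the grid-and-time-step scheme just described, applied to a toy one-dimensional heat-diffusion column. Every number and name in it is illustrative only; it is not taken from any actual climate model.

import numpy as np

n_cells = 50                     # number of volume elements in the toy column
dx = 1.0                         # cell size (arbitrary units)
dt = 0.1                         # time step (arbitrary units)
alpha = 0.5                      # diffusivity (illustrative value)

# Initial condition: a temperature profile "measured today"
temperature = np.linspace(290.0, 230.0, n_cells)   # warm at the surface, cold aloft

def step(t):
    """Advance the state one time step with an explicit finite-difference update."""
    new_t = t.copy()
    # Interior cells: energy leaving one volume element enters its neighbor
    new_t[1:-1] += alpha * dt / dx**2 * (t[2:] - 2.0 * t[1:-1] + t[:-2])
    # Boundary conditions: fixed surface temperature below, insulated top
    new_t[0] = 290.0
    new_t[-1] = new_t[-2]
    return new_t

# Repeat again and again, each new state becoming the next step's initial condition
for _ in range(10_000):
    temperature = step(temperature)

print(temperature[:5])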

Errors can accumulate rapidly.  Let's list some of the factors that must be included (by no means an exhaustive list):

Solar flux
Gravity, Pressure
Temperature
Density
Humidity
Earth's rotation
Surface temperature
Currents in the Ocean (e.g., Gulf Stream)
Greenhouse gases
CO2 dissolved in the oceans
Polar ice caps
Infrared radiation
Cosmic rays (ionizing radiation)
Earth's magnetic field
Evaporation
Precipitation
Cloud formation
Reflection from clouds
Reflection from snow
Volcanoes
Soot formation
Trace compounds
And many, many others
Even if mathematics could be developed to accurately model each of these factors, the combined model would be enormously complex, requiring simplifications.  Simplifications, in turn, amount to judgment calls by the modeler.  Can we ignore the effects of trace compounds?  Well, we were told that trace amounts of chlorofluorocarbons had profound effects on the ozone layer, necessitating the banning of their use in refrigerators and as aerosol spray propellants.  Can we ignore cosmic rays?  Well, they create ions (electrically charged molecules) which affect the ozone layer and also catalyze the formation of raindrops and soot particles.

As with all models, it is perilous, in the absence of complete experimental data, to ignore factors that might have a significant effect.
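As a back-of-the-envelope illustration of how quickly errors can accumulate, consider a tiny bias repeated at every time step of a century-long run (the numbers below are invented for the example and do not come from any climate model):

import random

days = 100 * 365                 # a century of daily time steps
bias_per_step = 0.001            # assumed systematic warm bias per step, in kelvin

# A systematic bias grows linearly with the number of steps...
print(f"accumulated systematic bias: {bias_per_step * days:.1f} K")    # about 36.5 K

# ...whereas purely random, unbiased per-step errors drift far more slowly,
# roughly with the square root of the number of steps.
random.seed(0)
drift = sum(random.gauss(0.0, bias_per_step) for _ in range(days))
print(f"random-walk drift: {drift:+.2f} K")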

Perhaps most critically, the role of precipitation in climate seems to be understated in the numerical global climate models. Roy W. Spencer, principal research scientist at the Global Hydrology and Climate Center of the National Space Science and Technology Center in Huntsville, AL, writes that the role of precipitation is not fully accounted for in global warming models. In my view, that's like an economist admitting his theory of the money supply doesn't fully account for the role of the Federal Reserve.

Unless we know how the greenhouse-limiting properties of precipitation systems change with warming, we don't know how much of our current warmth is due to mankind, and we can't estimate how much future warming there will be, either. To solve the global-warming puzzle, we first need to learn much more about the precipitation-system puzzle.
What little evidence we now have suggests that precipitation systems act as a natural thermostat to reduce warming.
Approximating the experimental method

While mankind cannot experiment on the global climate, these models can be used retroactively to see how well they "model" the past.  The UN's 2001 Climate Change report distorted the historical record by eliminating the Medieval Warm Period in the famous "Hockey Stick Curve" which, by many accounts, unreasonably accentuated temperature rise in the 20th century.  Such distortion of the historical data undercuts the credibility of the models themselves, since this is the only "experimental data" available for testing the fidelity of the models to the actual climate.

Why on earth would climate scientists "massage the data" to produce doomsday predictions? The answer requires looking at the rewards available to these researchers.

Catastrophe and careers

Vannevar Bush's seminal 1945 report, Science, The Endless Frontier, unleashed the Federal government's unprecedented post-war investment in R&D in the hard sciences and engineering. Science was seen as the way to avoid (or at least win) another catastrophic war.

The golden era of federal funding resulted in unprecedented employment opportunities for hard-science Ph.D.s.  Fresh graduates could easily find tenure-track employment at universities expanding their hard-science programs. The enormous dividends from this investment make up our modern technological world.  However, the munificence of the federal funding caused a certain, shall we say, insouciance about resources: "Why use lead when gold will do?" became an informal motto at Lawrence Livermore National Lab.

Inevitably, the growth in congressional funding tapered off and in the late 1980s the competition for R&D sponsorship began to tighten.  Fresh Ph.D.s often had to look to the private sector for employment (heaven forefend!).  Grant writers were required to start highlighting the potential "practical payoffs" of their proposed work.  Since there was little need for better atomic weapons in the post-cold war era, High Energy Physics lost its central status in the funding universe.  Many mathematical physicists became refugees to allied fields (some of them even became "quants" on Wall Street). But others found employment elsewhere, including in climate science.

In this competitive environment, one can imagine climate modelers justifying their work by citing the possibility of global change, the further study of which requires, of course, "more research".  One can further imagine that in the inchoate communication between university researcher, funding agency, congressional staffer and congressmen that "possibility" eventually became "probability" and then "probability" morphed into "certainty" of global warming, especially if there was potential for political advantage. 

This has resulted in an inadvertent funding-feedback mechanism that now resonates in largely unjustified alarm and also seeks to quash scientific dissidents who indirectly threaten to throttle the funding spigots.

The practical experience of numerical modeling in allied fields such as semiconductor process modeling should cause us to question the claimed accuracy for Global Climate Models.  The UN's distortion of historical climate data should further undermine our faith in climate models because such models can only be "tested" against accurate historical data. 

In my view, we should adopt the private sector's practice of placing extremely limited reliance on numerical models for major investment decisions in the absence of confirming test data, that is, climate data which can be easily collected just by waiting. 



TOPICS: Editorial; News/Current Events
KEYWORDS: globalwarming
To: sportutegrl

I think you have a good point there.


21 posted on 02/28/2007 10:08:59 AM PST by expatpat

To: Lonesome in Massachussets
Historical data is perfectly good for validating a model

But not if the model is inferred in whole or in part from that historical data. That can easily happen even without the modeler's knowledge. The only sure way to avoid that problem is to test a prediction about data as yet unknown.

22 posted on 02/28/2007 10:14:42 AM PST by edsheppa

To: Tolik
Might I direct your attention to

Pilkey, O.H. and Pilkey-Jarvis, L., 2007, Useless Arithmetic, Columbia University Press, ISBN 0-231-13212-3.

For your reading enjoyment. BTW the subtitle of this book is "Why Environmental Scientists Can't Predict The Future"
23 posted on 02/28/2007 10:21:34 AM PST by lmailbvmbipfwedu

To: sportutegrl
Can't claim the credit: I read a fascinating take on the modeling problem. Let's say that you have 100 parameters (I'd love to know whether it is a stretch to say that a system as complicated as the global climate can be described by 100 variables). And let's assume that we know each of those variables with 99% certainty (unbelievable certainty, but let's assume it for the sake of this argument). If the uncertainties are independent, the resulting certainty is 0.99^100 ≈ 0.366, i.e. about 37%.

37% is excellent odds if you play the lottery. Is it a good enough justification to spend trillions of dollars?

BTW, if anybody can say why it's wrong to argue this way, I'd like to be educated.
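A quick check of that arithmetic in Python, assuming the 100 parameter uncertainties are independent:

p_single = 0.99                  # assumed certainty of each parameter
n_params = 100                   # assumed number of parameters
print(p_single ** n_params)      # 0.3660..., i.e. roughly a 37% chance that all 100 are right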
24 posted on 02/28/2007 10:22:22 AM PST by Tolik

To: edsheppa

Just wait 10 years and see if any model works. It's only prudent if the price tag is in the trillions of dollars, isn't it?


25 posted on 02/28/2007 10:25:31 AM PST by Tolik

To: edsheppa

Not true. Suppose that you want to calculate, say, the coefficient of thermal expansion of a metal as a function of temperature. You could perform an experiment, vary the temperature systematically, and come up with an answer. OTOH, if someone had reliable observations, you could simply fit your model to those observations. If you knew what you were doing, and knew the limits on the accuracy of the observations, then, given a temperature, you could predict the expansion and state reliable upper and lower bounds.

If someone gave you a temperature outside the range of the observations, however, you would be on shaky ground. The metal might melt at a very high temperature, for instance.

Astronomers have *nothing* but historical data, yet they can make very accurate predictions of planetary ephemerides, with reliable limits on the expected errors. Sixty years ago astronomers were able to predict (to within a few blocks) which streets in Manhattan would see a total eclipse and on which it would be partial.
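Sketched in Python with synthetic data invented purely for illustration, the thermal-expansion example looks like this: fit the coefficient from observations, interpolate inside the observed range with confidence, and note how extrapolation leaves that range behind.

import numpy as np

rng = np.random.default_rng(1)
temps = np.linspace(0.0, 100.0, 20)                  # observed temperatures, deg C
true_alpha = 1.2e-5                                  # assumed expansion coefficient
lengths = 1.0 + true_alpha * temps + rng.normal(0.0, 2e-5, temps.size)

slope, intercept = np.polyfit(temps, lengths, 1)     # least-squares fit to the data
print(f"fitted coefficient: {slope:.2e} per deg C")

# Interpolation inside the observed 0-100 C range is on solid ground:
print("predicted length at 50 C:", intercept + slope * 50.0)

# Extrapolation far outside that range is shaky -- the metal might melt or the
# response might turn nonlinear, and the fit itself cannot warn you:
print("naive prediction at 2000 C:", intercept + slope * 2000.0)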


26 posted on 02/28/2007 10:36:30 AM PST by Lonesome in Massachussets (When I search out the massed wheeling circles of the stars, my feet no longer touch the earth)

To: Tolik
Another keeper!

I particularly loved these two paragraphs, which should be self-evident to anyone conversant with the hard sciences:

Almost all semiconductor manufacturing processes occur in closed vessels. This permits the engineers to control the input chemicals (gases) and the pressure, temperature, etc. with a high degree of precision and reliability.

Closed systems are also much easier to model than systems open to the atmosphere (that should tell us something already). Computer models are used to inform the engineering team as they design the shape, temperature ramp, flow rates, and so on (i.e. the thermodynamics) of the new reactor.

Nonetheless, despite the fact that 1) the chemical reactions are highly studied, 2) there exists extensive experience with similar reactors, much of it recorded in the open literature, 3) the input gases and materials are of high and known purity, and 4) the process is controlled with incredible precision, the predictions of the models are often wrong, requiring that the reactor be adjusted empirically to produce the desired product with quality and reliability.

There was an article recently by a statistician who pointed out that the use of statistical analysis by amateurs can quickly lead to nonsensical results.
For example, if three assumptions are made, each 99% correct, the combined result can be right only about 97% of the time -- already less than 99%.
With a dozen such factors the figure drops to roughly 89%, and with five or six dozen it falls to about 50%.

With climate having dozens, perhaps hundreds, of relevant factors, most of which are guessed at or ignored altogether, how reliable can these computer "model predictions" be?

27 posted on 02/28/2007 11:14:43 AM PST by Publius6961 (MSM: Israelis are killed by rockets; Lebanese are killed by Israelis.)

To: Lonesome in Massachussets

You missed the point. I didn't say that models inferred from historical data are necessarily unreliable. I said it is unsurprising, and no indication of reliability, that a model fits the data from which it was inferred. One should infer reliability only from validating predictions that are independent of the data used to build the model.


28 posted on 02/28/2007 12:04:04 PM PST by edsheppa

To: expatpat

As a scientist who has often used modeling in research, I concur that models have to be regarded with great suspicion -- unless there is LOTS of broad empirical data to back them up. It reminds me of the old saying about magnetohydrodynamics (MHD) calculations:

"It takes a genius to get computational results from these equations -- and a fool to believe them."
---<>---<>---<>---<>---<>---

And MHD of any earthbound system is child's play compared with the vastly greater complexity of global climate and weather. Even the Sun is more uniform, and probably more calculable, than the Earth, owing to the Earth's continents, diverse terrain and diverse composition.

By the way... is MHD of any use in describing auroras?


29 posted on 02/28/2007 12:21:45 PM PST by AFPhys ((.Praying for President Bush, our troops, their families, and all my American neighbors..))

To: AFPhys

I don't know. I'm basically a solid-state physicist (mostly theoretical) and digital signal-processing guy.


30 posted on 02/28/2007 1:01:29 PM PST by expatpat

To: Tolik

thanks, bfl


31 posted on 02/28/2007 1:41:04 PM PST by neverdem (May you be in heaven a half hour before the devil knows that you're dead.)

To: Tolik
Great Post.

The author's analysis is a devastating critique of the methodological fallacies that lie behind climate modeling as it is currently practiced.

There is also another major flaw in current practice that the author did not mention, one that should warrant summary rejection of the use of these models in public policy.

The builders of the models on which the IPCC and other reports have been based have refused to publish the input parameters and algorithms used to generate their predictive results. Peer review is the backbone of scientific integrity, and the lack of peer review outside the small coterie of model builders is as big a red flag as can be imagined for any type of predictive model.

When the builder of the infamous "hockey stick" model was asked to publish his model assumptions, he played the "political persecution" card. That oddly defensive reaction is indicative of how badly these models have been cooked for political purposes.
32 posted on 02/28/2007 1:52:21 PM PST by ggekko60506

To: Forecaster; Nailbiter; BartMan1; stanley windrush

... ping


33 posted on 02/28/2007 2:15:21 PM PST by IncPen (When Al Gore Finished the Internet, he invented Global Warming)

To: Tolik

Mark


34 posted on 02/28/2007 2:17:15 PM PST by Former Proud Canadian (How do I change my screen name after Harper's election?)

To: edsheppa

Not true. The internal consistency of the historical data just needs to be available. Astronomers know pretty well how accurate their predictions are, because they know how good their observations are. Calibrations are just a special kind of historical data, selected to allow low variance estimates.

I agree, models are less reliable outside the range of inputs over which they are validated. Anyone who understands modeling and statistical technique must be highly skeptical of the entire Global Warming ideology.


35 posted on 02/28/2007 2:52:04 PM PST by Lonesome in Massachussets (When I search out the massed wheeling circles of the stars, my feet no longer touch the earth)

To: Tolik

Bump!


36 posted on 02/28/2007 3:58:42 PM PST by listenhillary (You can lead a man to reason, but you can't make him think)

To: expatpat
Perhaps the scientist uses a ROM and the engineer uses a WAG.

The Difference Between an Engineer and a Scientist

An engineering student asked his professor the difference between an engineer and a scientist. The professor explained it this way.

Ask a scientist and an engineer the following question. Imagine you are on a bus and seated exactly nine feet away beckons a beautiful naked lady.

How many bus stops will it take you to reach the beautiful naked lady if the distance between you is decreased by exactly one-half at each bus stop?

The scientist quickly answered that an infinite number of stops would not permit you to reach the beautiful naked lady because the distance is only decreased by half at each stop.

The engineer answered that after about ten stops you will be close enough for all practical purposes.

37 posted on 02/28/2007 4:43:25 PM PST by MosesKnows

To: Tolik
To solve the global-warming puzzle, we first need to learn much more about the precipitation-system puzzle.

Great article! Note they are only talking about natural clouds and snow. With technology, cost-effective man-made clouds and snow are possible as well. That changes everything. We dam mighty rivers and build hundred-mile lakes so we aren't at the mercy of Mother Nature's fresh water sources. We could do something similar with clouds and snow to control the climate.

Scientists don't seem to want to understand precipitation, at least publicly. The reason is that once we have a good cloud and snow model we can test inexpensive man-made methods to actively manage the climate to be whatever we want.

The end of the world is averted again by the use of technology. Next!

38 posted on 02/28/2007 4:44:22 PM PST by Reeses

To: Publius6961
which should be self-evident to anyone conversant with the hard sciences:

The Al "the boob" Gore adherents have not yet learned that 2 does not equal 3; not even for large values of 2.

39 posted on 02/28/2007 5:04:54 PM PST by MosesKnows

To: Lonesome in Massachussets
OK, then show me. Here's 45 years of historical data. C and T are observed (I've added a little random measurement error). Make a model that explains T. Then we'll see how well it does against the underlying system that produced the data over the next 50 or 100 years.
Year C T Year C T Year C T
1956 100.00 20.86 1971 115.98 22.25 1986 134.84 24.06
1957 100.75 20.85 1972 117.11 22.41 1987 135.91 24.31
1958 101.99 20.81 1973 118.45 22.66 1988 137.93 24.40
1959 103.09 21.19 1974 120.03 22.54 1989 138.83 24.76
1960 104.17 21.16 1975 121.05 22.87 1990 140.36 24.70
1961 104.52 21.05 1976 122.19 22.67 1991 141.57 25.06
1962 106.31 21.35 1977 123.58 23.11 1992 142.91 24.98
1963 107.04 21.46 1978 124.35 23.05 1993 144.24 25.12
1964 108.02 21.69 1979 125.09 23.32 1994 145.48 25.38
1965 109.64 21.66 1980 126.82 23.60 1995 147.49 25.29
1966 110.57 21.70 1981 128.73 23.53 1996 148.62 25.55
1967 111.61 21.90 1982 129.15 23.80 1997 150.86 25.79
1968 112.54 21.92 1983 131.07 23.85 1998 151.85 25.95
1969 113.70 22.08 1984 132.01 23.93 1999 153.78 26.22
1970 114.87 22.36 1985 133.36 23.92 2000 155.17 26.03
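One way to take up this challenge, sketched in Python: guess a simple functional form, fit it only to the early years, and judge it by how well it predicts the held-out later years. The logarithmic form below is just one illustrative guess, and only a few of the table's rows are shown; the rest would be filled in from the table above.

import numpy as np

data = [  # (year, C, T) rows copied from the table above -- fill in the remainder
    (1956, 100.00, 20.86), (1957, 100.75, 20.85), (1958, 101.99, 20.81),
    # ...
    (1998, 151.85, 25.95), (1999, 153.78, 26.22), (2000, 155.17, 26.03),
]
years = np.array([row[0] for row in data], dtype=float)
c = np.array([row[1] for row in data], dtype=float)
t = np.array([row[2] for row in data], dtype=float)

train = years <= 1985                               # fit only on the early record
b, a = np.polyfit(np.log(c[train]), t[train], 1)    # hypothesize T = a + b*ln(C)

pred = a + b * np.log(c)
rmse_held_out = np.sqrt(np.mean((pred[~train] - t[~train]) ** 2))
print(f"T ~ {a:.2f} + {b:.2f}*ln(C); held-out RMSE = {rmse_held_out:.3f}")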

40 posted on 02/28/2007 5:29:00 PM PST by edsheppa

