I particularly loved these two paragraphs, which should be self-evident to anyone even conversant with the hard sciences:
Almost all semiconductor manufacturing processes occur in closed vessels. This permits the engineers to control the input chemicals (gases), the pressure, the temperature, etc., with a high degree of precision and reliability.
Closed systems are also much easier to model than systems open to the atmosphere (that should tell us something already). Computer models are used to inform the engineering team as they design the shape, temperature ramp, flow rates, etc. (i.e., the thermodynamics) of the new reactor.
Nonetheless, even though 1) the chemical reactions are highly studied, 2) there exists extensive experience with similar reactors, much of it recorded in the open literature, 3) the input gases and materials are of high and known purity, and 4) the process is controlled with incredible precision, the predictions of the models are often wrong, requiring that the reactor be adjusted empirically to produce the desired product with acceptable quality and reliability.
There was a recent article by a statistician who pointed out that the use of statistical analysis by amateurs can quickly lead to nonsensical results.
For example, if a result depends on three independent assumptions that are each 99% correct, the combined result is only about 0.99³ ≈ 97% reliable.
With a dozen such factors, reliability drops to roughly 0.99¹² ≈ 89%, and with seventy it falls to about 50%.
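The compounding arithmetic is simple enough to check directly. A minimal sketch, assuming the factors are independent so their probabilities of being correct multiply:

```python
def compound_reliability(p: float, n: int) -> float:
    """Reliability of a result that depends on n independent
    factors, each correct with probability p."""
    return p ** n

# How quickly 99%-correct assumptions erode overall reliability:
for n in (3, 12, 70):
    print(f"{n:3d} factors at 99% each -> {compound_reliability(0.99, n):.2f}")
```

Running this shows 3 factors leave about 97% reliability, 12 leave about 89%, and around 70 independent factors are enough to drag the result down to a coin flip.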
With climate having dozens, perhaps hundreds, of relevant factors, most of which are guessed at or ignored altogether, how reliable can these computer "model predictions" be?
The Al "the boob" Gore adherents have not yet learned that 2 does not equal 3; not even for large values of 2.