But not if the model is inferred in whole or in part from that historical data. That can easily happen even without the modeler's knowledge. The only sure way to avoid that problem is to test a prediction about data as yet unknown.
Just wait 10 years and see if any model works. It's only prudent if the price is in the trillions of dollars, isn't it?
Not true. Suppose that you want to calculate, say, the coefficient of expansion of a metal as a function of temperature. You could perform an experiment, vary the temperature systematically, and come up with an answer. OTOH, if someone had reliable observations, you could simply fit your model to those observations. If you knew what you were doing, and knew the limits on the accuracy of the observations, then given a temperature you could predict the expansion and state reliable upper and lower bounds.
If someone gave you a temperature outside the range of the observations, however, you would be on shaky ground. The metal might melt at a very high temperature, for instance.
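The fit-to-observations approach above can be sketched in a few lines. This is a minimal illustration, not a real materials calculation: the temperature/expansion numbers are made up, and the linear model and 2-sigma bounds are assumptions for the sake of the example.

```python
import numpy as np

# Hypothetical observations: temperature (deg C) vs. measured expansion
# coefficient (1/K) for some metal -- illustrative numbers only.
temps = np.array([20.0, 60.0, 100.0, 140.0, 180.0, 220.0])
alphas = np.array([11.8e-6, 12.1e-6, 12.5e-6, 12.8e-6, 13.2e-6, 13.5e-6])

# Fit a straight line alpha(T) = a*T + b, keeping the covariance of the
# fitted parameters so we can state rough error bounds on predictions.
(a, b), cov = np.polyfit(temps, alphas, 1, cov=True)

def predict(T):
    """Predicted coefficient with a ~2-sigma bound, trustworthy only for
    temperatures inside the observed range (20-220 deg C here)."""
    alpha = a * T + b
    # Propagate parameter uncertainty through the linear model:
    # var = T^2 * var(a) + 2*T*cov(a,b) + var(b)
    var = T**2 * cov[0, 0] + 2 * T * cov[0, 1] + cov[1, 1]
    return alpha, 2.0 * np.sqrt(var)

alpha, bound = predict(120.0)  # inside the observed range: interpolation
print(f"alpha(120 C) = {alpha:.3e} +/- {bound:.1e}")
```

Asking `predict(900.0)` would still return a number, which is exactly the trap: the fit knows nothing about melting, so the stated bounds are only meaningful inside the range the observations cover.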
Astronomers have *nothing* but historical data, yet they can make very accurate predictions about planetary ephemerides, with reliable limits on the expected errors. Sixty years ago astronomers were able to predict (to within a few blocks) which streets in Manhattan would see a total eclipse and on which it would be partial.