Moore’s Law Lives: the Future Is Still Alive
Posted on 05/09/2011 6:46:27 AM PDT by Kaslin
Why your life and the lives of your children and grandchildren are going to be a whole lot better than they might have been.
In a week of big stories, the biggest didn’t take place in Pakistan or Washington, D.C., but in Santa Clara, California. Unlike Osama bin Laden, we managed to dodge a bullet. If we hadn’t, it wouldn’t have ended modern civilization, but it might have sent it off on a much different, and much less happy, path.
You probably didn’t read this story. So, put simply: Intel Corp. announced Wednesday that Moore’s Law isn’t going to end anytime soon. Because of that, your life and the lives of your children and grandchildren are going to be a whole lot better than they might have been.
Today, almost a half-century after it was first elucidated by legendary Fairchild and Intel co-founder Dr. Gordon Moore in an article for a trade magazine, it is increasingly apparent that Moore’s Law is the defining measure of the modern world. Every other predictive tool for understanding life in the developed world since WWII — demographics, productivity tables, literacy rates, econometrics, the cycles of history, Marxist analysis, and on and on — has failed to predict the trajectory of society over the decades … except Moore’s Law.
Alone, this oddly narrow and technical dictum — that the power and density of integrated circuit chips will double every couple of years — has done a better job than any other in determining the pace of daily life, the ups and downs of the economy, the pace of innovation, and the creation of new companies, fads, and lifestyles. It has been said many times that, beneath everything, Moore’s Law is ticking away as the metronome, the heartbeat, of the modern world.
Why this should be so is somewhat complicated. But a simple explanation is that Moore’s Law isn’t strictly a scientific law — like, say, Newton’s Laws of Motion — but rather a brilliant observation of an implied contract between the semiconductor industry and the society it serves. What Gordon Moore observed back in the mid-1960s was that each generation of memory chips (in those days they could store a few thousand bits, compared to a few billion today), which appeared about every 18 months, had twice the storage capacity of the generation before. Plotting the exponential curve of this development on logarithmic paper, Moore was pleased to see a straight line … suggesting that this developmental path might continue into the foreseeable future.
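The arithmetic behind Moore’s straight line is easy to reproduce. The sketch below is my own illustration — the starting capacity and dates are made up, not Moore’s actual 1965 data points — and shows why a quantity that doubles every generation plots as a straight line on a log scale: each doubling adds exactly one unit to the base-2 logarithm of capacity.

```python
import math

# Illustrative only: a chip capacity that doubles every ~18 months.
# On a logarithmic axis, exponential growth becomes a straight line,
# because log2(capacity) increases by exactly 1 per generation.
start_year = 1965
start_bits = 1024  # made-up starting point: "a few thousand bits"

for gen in range(6):
    year = start_year + gen * 1.5        # one generation ~ 18 months
    bits = start_bits * 2 ** gen         # capacity doubles each generation
    log_bits = math.log2(bits)           # grows linearly: 10, 11, 12, ...
    print(f"{year:.1f}: {bits:>6} bits  (log2 = {log_bits:.0f})")
```

Plotted with `bits` on a log axis against `year`, those points fall on a line — which is exactly what Moore saw on his logarithmic paper.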
This discovery has been rightly celebrated for years. But often forgotten is that there was technological determinism behind the Law. Computer chips didn’t make themselves. And so, if the semiconductor industry had decided the next day to slow production or reduce its R&D budgets, Moore’s Law would have died within weeks. Instead, semiconductor companies around the world, big and small, and not least because of their respect for Gordon Moore, set out to uphold the Law — and they have done so ever since, despite seemingly impossible technical and scientific obstacles. Gordon Moore not only discovered Moore’s Law, he made it real. As his successor at Intel, Paul Otellini, once told me, “I’m not going to be the guy whose legacy is that Moore’s Law died on his watch.” And that’s true for every worker in the semiconductor industry. They are our equivalent of medieval workers, devoting their entire careers to building a cathedral whose end they will never see.
And so, instead of fading away like yet one more corporate five-year plan, Moore’s Law has defined our age, and done so more than any of the more celebrated trend-setters, from the Woodstock generation to NASA to the personal computer. Moore’s Law today isn’t just microprocessors and memory, but the Internet, cellular telephony, bioengineering, medicine, education, and play. If, in the years ahead, we reach that Singularity of man and computer that Ray Kurzweil predicts for us, that will be Moore’s Law too. But most of all, the virtuous cycle of constant innovation and advancement, of hot new companies that regularly refresh our economy, and of a world characterized by continuous change — in other words, the world that was created for the first time in history only about sixty years ago, and beyond which we can hardly imagine any other — is the result of Moore’s Law.
When Gordon Moore first enunciated his Law, only a handful of industries — the first minicomputers, a couple of scientific instruments, a desktop calculator or two — actually exhibited its exponential rate of change. Today, every segment of society either embraces Moore’s Law or is racing to get there. That’s because they know that if only they can get aboard that rocket — that is, if they can add a digital component to their business — they too can accelerate away from the competition. That’s why none of the inventions we Baby Boomers expected as kids to enjoy as adults — atomic cars! personal helicopters! ray guns! — have come true; and also why we have even more powerful tools and toys instead. Whatever can be made digital, if not in whole then in part — marketing, communications, entertainment, genetic engineering, robotics, warfare, manufacturing, service, finance, sports — will be, because going digital means jumping onto Moore’s Law. Miss that train and, as a business, an institution, or a cultural phenomenon, you die.
So, what made this week’s announcement — by Intel — so important? It is that almost from the moment the implications of Moore’s Law became understood, there has been a gnawing fear among technologists and those who understand technology that Moore’s Law will someday end — having run up against the limits of, if not human ingenuity, then physics itself. Already compromises have been made — multiple processors on a chip instead of a single one, exotic new materials to stop leaking electrons — but as the channels get narrower and bumpier with molecules and the walls thinner and more permeable to atomic effects, the end seems to draw closer and closer. Five years away? Ten? And then what? What will it be like to live in a world without Moore’s Law … when every human institution now depends upon it?
But the great lesson of Moore’s Law is not just that we can find a way to continuously better our lives — but that human ingenuity knows no bounds, nor can ever really be stopped. You probably haven’t noticed over the last decade the occasional brief scientific article about some lab at a university, or at IBM, Intel, or HP, coming up with a new way to produce a transistor or electronic gate out of just two or three atoms. Those stories are about saving Moore’s Law for yet another generation. But that’s the next chapter. Right here and now, the folks at Intel were almost giddy in announcing that what had been one of those little stories a decade ago — tri-gate transistors — would now be the technology in all new Intel chips.
I’m not going to go into technical detail about how tri-gate transistors work, but suffice it to say that since the late 1950s, when Jean Hoerni, along with the other founders of the semiconductor industry at Fairchild (including Gordon Moore), developed the “planar” process, all integrated circuits have been structurally flat: a series of layers of semiconductors, insulators, and wiring “printed” on an equally flat sheet of silicon. For the first time, Intel’s new tri-gate technology leaves the plane of the chip and enters the third dimension. It does so by raising three “fins” of silicon up from beneath the surface, so that they stick up into the top, transistor-bearing layer. The effect is something like draping a mattress over a fence — and then repeating that over a billion fences, all just inches apart. The result is a much greater density of gates, lower power consumption, faster switching, and fewer quantum side effects. Intel claims that more than 6 million of these 22-nanometer Tri-Gate transistors could fit in the period at the end of this sentence.
The first processors featuring Tri-Gate transistors will likely appear later this year. And you can be sure that competitors, with similar designs, will appear soon after. But that’s their battle.
What counts for the rest of us is that Moores Law survives. The future will arrive as quickly as ever ….
Madge Dunham document is real! …
Uh oh, looks like the truth comes out after all. Thanks for finding
So, are you maintaining that Obama birth documents will double every generation; is that what your post has to do with Moore’s Law?
It appears that is the case.
Contrasted with “Bill’s Law”.
Each successive OS will use 4 times as much RAM, require 8 times as much processing power, and have 32 times as much bloat as the previous generation.
Didn't the original 'law' say 'every 6 months'? Still, it's impressive...
This guy’s writing style — replete with asides — gets really annoying after a while.
The paper that started it all:
Check out “The Singularity is Near” by Ray Kurzweil.
It is overly optimistic, but I think Ray has a good handle on where technology is going.
Sorry, should have provided a link:
Highly recommend the book.
I myself - brilliant writer that I am - sometimes think of far too many synonyms, antonyms, examples, contrasts - even exclamations! - to include correctly in any sort of comma-divided list. It is a sign ... or so I sometimes delude myself ... of a well-stocked mind.
Well stocked indeed, but lacking in systematic application, the root of effective communication.
Oh, well, if communication is your goal, that’s something totally different.
Some of us - we know who we are - are in the wordiness business for the sheer joy of watching the syllables unspool across the screen until they reach their natural termination ... in a period.
Cramming more components onto integrated circuits
Great link. It shows Moore projecting his law from 1965 only to 1970 — and here we are able to confirm his projection has been valid not merely for half a decade, but for going on half a century.

IMHO, Moore’s Law does not actually define the rate at which improved IC circuits are developed, but rather how fast they are demanded. That is, it reflects the fact that increased production of a good has, historically, always produced a concomitant reduction in cost — not just for transistors, but for all goods. The reason the development of IC improvements has been as rapid as Moore predicted (for much longer than he actually initially predicted) is simply that two years from now the market will absorb twice as many transistors as now, at half the price per transistor as today. Doubling the production of a physical thing of constant size (a car, for example) would depress the price by much more than a factor of two. But two years from now I will be ready to listen if you offer to double my RAM for the same money as my original quantity of RAM.
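The economics in that comment can be put in a few lines of arithmetic. This is a sketch with made-up starting figures, not industry data: if the market absorbs twice the volume every two years at half the unit price, total spending stays roughly flat while capability compounds.

```python
# Illustrative sketch of the "demand side" reading of Moore's Law:
# volume doubles per two-year generation while unit price halves,
# so revenue is constant even as shipped capability grows 2x per step.
# Starting price and volume are arbitrary illustration values.
price_per_transistor = 1.0   # arbitrary starting unit price
units_sold = 1_000           # arbitrary starting volume

for generation in range(5):
    year = 2011 + 2 * generation
    revenue = price_per_transistor * units_sold
    print(f"{year}: {units_sold:>8} units at {price_per_transistor:.4f} "
          f"-> total spend {revenue:.0f}")
    price_per_transistor /= 2  # price halves each generation
    units_sold *= 2            # market absorbs twice the volume
```

The constant product is the commenter’s point: the buyer spends the same money every cycle and gets double the transistors — something that doesn’t hold for cars or other constant-size goods.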
Even at this late date there are sure to be billions of people who don't have computers, so when you drive down the price with increased production you still have a huge market to tap. As it is, I think many of those people are at least getting cell phones . . .
And also note that the speed of a processor doesn’t scale with the number of components. The doubling of components doesn’t mean that you get a doubling of speed.
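One conventional way to make that point concrete is Amdahl’s law — my illustration, not something cited in the thread: the speedup from N parallel units is capped by the serial fraction of the work, so doubling component counts falls well short of doubling speed.

```python
# Amdahl's law: speedup = 1 / (s + (1 - s) / N), where s is the
# fraction of the work that must run serially and N is the number
# of parallel units. The 10% serial fraction below is illustrative.
def amdahl_speedup(n_units: int, serial_fraction: float) -> float:
    """Overall speedup from n_units parallel processors."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_units)

for n in (1, 2, 4, 8):
    print(f"{n} units: {amdahl_speedup(n, 0.10):.2f}x")
```

With even 10% serial work, going from 2 to 4 units yields well under a 2x gain, and the speedup approaches a hard ceiling of 1/s = 10x no matter how many components you add.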
I liked the style. It’s better than having to sort out misplaced modifiers all thrown at the end of a sentence (or wandering aimlessly throughout the paragraphs).