
EDITORIAL: Another century of discovery?


12/01/1999


As the odometer spins over from 1999 to 2000, a glance at the speedometer shows the pace of discovery accelerating. At the beginning of this millennium, as the world moved from the year 999 into the 1000s, people had little sense of what lay across the rest of this globe we call earth, let alone the unbelievable vastness that surrounds it. The earth was thought to be flat and at the center of the universe. Over the next thousand years we explored the earth, went to the moon, sent probes to other planets, and began to map the universe around us across a broad spectrum of emissions, from infrared to gamma rays. Note that, except for the exploration of the earth, almost all of this discovery came in the past two centuries. In the last century came electricity, radio waves, bacteria, genetics, and the periodic table of elements; in this one, quantum mechanics, nuclear fission and fusion, DNA, and viruses. Out of all these discoveries flowed incredible inventions that have transformed our lives and our environment. But, again, much of this invention came in this century, and the tools of the Information Age we are now entering arrived almost entirely in its last half.

The pace at which inventions become commercial products has also steadily quickened. The willingness of investors to plunge millions of dollars into dot-com ventures, well before they have demonstrated commercial viability, reflects the sense we have now gained that "if it's possible, it WILL happen, and far sooner than anyone thinks." Only people who are deep into technology and the science behind it realize how hard it still is to push the boundaries. But somehow we seem to have learned to speed up even this very difficult creative process, turning discoveries into useful inventions and products at an ever-faster clip.

Right now our industry faces a path of unprecedented uncertainty as we try to keep climbing the semi-logarithmic curve of Moore's Law. While some have long said that the physics won't allow us to go much further, chipmakers have actually pulled ahead of the uncanny schedule that Moore's Law has held to for decades. Only a few years ago, top physicists did not believe we could make features smaller than the wavelength of light; now 180nm devices are being made with 248nm steppers, and optical enhancement techniques will push resolution even further. Those who worked on the new SIA Roadmap caution that its tables have more "red" areas (problems we don't yet know how to solve) than ever before. Like the Internet investors, however, many believe that we WILL solve those many problems and forge ahead of what even the experts now think is possible.
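The semi-logarithmic curve mentioned above can be sketched numerically. A minimal illustration, assuming the commonly quoted doubling period of about two years (the exact cadence varies across formulations of Moore's Law, and the starting figures here are hypothetical, not from any roadmap):

```python
def transistors(start_count: float, start_year: int, year: int,
                doubling_period_years: float = 2.0) -> float:
    """Projected device count under exponential (Moore's Law) growth:
    the count doubles once per doubling period."""
    return start_count * 2 ** ((year - start_year) / doubling_period_years)

# Hypothetical example: a chip with ~10 million transistors in 1999.
# Fourteen years is seven doubling periods, a 2**7 = 128-fold increase,
# so the projection crosses a billion devices around 2013.
count_1999 = 10e6
projected_2013 = transistors(count_1999, 1999, 2013)  # 1.28e9
```

Plotted on a logarithmic vertical axis against the year, this exponential becomes a straight line, which is why the trajectory is called semi-logarithmic.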

What's coming next? While we continue to shrink features and work hard at improving tool productivity, our view is that another frontier remains to be explored. As we move toward a billion devices on a chip, we need to make much better use of them than simply cranking signals through logic gates. We need to build systems that go beyond our somewhat one-dimensional digital computing model. Neural networks, for example, gain power through multiple layers of processing. Human perception relies on systems that analyze properties globally without digesting every detail, as our present digital systems do. We need workable system architectures that handle multilayered processing rather than merely struggling to parallelize a unidimensional computing paradigm. It is also possible that stacked, 3-D circuit structures will evolve that greatly shorten the paths signals must now travel, perhaps combining traditional electron/hole flow with quantum effects and intermingling patches of memory and processing.

Imagine: a whole new thousand years is about to open up for exploring frontiers like these. At the rate discovery has been accelerating, the results will be awesome!


Robert Haavind
Editor in Chief,
[email protected]