From: Walter Watts (wlwatts@home.com)
Date: Tue Jan 29 2002 - 15:00:07 MST
Moore's Quantum Leap
Why has the microchip's explosive growth rate never happened before?
George Gilder explains the micro microeconomics and why silicon is just
the beginning.
In 1965, when the Internet was the inkling of an "intergalactic computer
network" in the mind of a mildly demented psychologist by the name of
J.C.R. Licklider, Silicon Valley produced more apricots than electronic
devices; Steve Jobs was growing hair and learning subtraction; and no
one had imagined a silicon DRAM or a microprocessor or a computer
smaller than a refrigerator. The prevailing wisdom of theorists at IBM
posited the inevitable triumph of a Few Good Mainframes. In the midst of
this antediluvian world, the young director of R&D for a subsidiary of
Fairchild Camera and Instrument, Gordon E. Moore, contributed an article
to an industry journal, propounding a mind-bending prophecy.
In futurism, the favored rule is "you can say what, or you can say when,
but not both at once." What made Gordon Moore's essay so delphically
dazzling was his prediction of how the marvels of integrated electronics
would be engineered - over time. He included a graph with his journal
article. With the year on the horizontal axis and the log of the number
of components in an integrated circuit on the vertical axis, the graph
mapped just four data points - the number of transistors on ICs in 1962,
1963, 1964, and
1965. These points produced a nearly straight diagonal line at 45
degrees across the graph, indicating that the number of components had
doubled every year, beginning with 2^3, or 8 transistors, continuing
with 2^4, and up to 2^6, or 64 transistors. The Moore coup was to
boldly extend the line through 1975, when 2^16, or 65,000, transistors
would be inscribed on a single chip. This feat was achieved in the
designated year in a lab at IBM.
The annual doubling pace slowed to an ultimate rate of one doubling
every year and a half, but with each generation the devices were
eminently manufacturable
at yields approaching 100 percent. This year, after 27 doublings since
1962, the billion-transistor DRAM chip should once more fulfill the
18-month pace of advance that is now known far and wide as Moore's law.
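The arithmetic is easy to check. Here is a minimal sketch in Python,
using only the figures quoted above (8 transistors in 1962, one
doubling per year through 1975, 27 doublings to the billion-transistor
chip); the function and its defaults are illustrative, not anything
from Moore's paper:

    # 8 transistors in 1962, doubling at a fixed interval thereafter.
    def transistors(year, start_year=1962, start_count=8, months_per_doubling=12):
        doublings = (year - start_year) * 12 // months_per_doubling
        return start_count * 2 ** doublings

    print(transistors(1965))   # 64, i.e. 2^6
    print(transistors(1975))   # 65,536, i.e. 2^16 -- the "65,000 transistors"
    print(8 * 2 ** 27)         # 1,073,741,824 -- a billion after 27 doublings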
Ask a historian what other technologies have approximated the pace of
Moore's law, and he'll tell you none. No other innovation by any metric
has come close to doubling at such quick intervals for such a sustained
period. Why? The answer lies at the intersection of quantum physics and
a phenomenon related to the learning curve called the experience curve.
First documented in the late 1960s under the guidance of Bruce Henderson
of the Boston Consulting Group, the experience curve ordains that the
cost-effectiveness of any manufacturing process increases 20 to 30
percent with every cumulative doubling of volume. Whereas the learning
curve attempts to measure the increase in productivity, the experience
curve quantifies the decrease in cost. BCG and its spinoff Bain &
Company documented experience curves for cars, golf balls, paper bags,
limestone, nylon, and phone calls. In farm products, they limned a curve
for chicken broilers.
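In its standard power-law form, the experience curve says that unit
cost falls by a fixed fraction with each cumulative doubling of
volume. A minimal sketch in Python; the 25 percent decline per
doubling is only an illustrative midpoint of the 20-to-30-percent
range cited above:

    import math

    # A progress ratio of 0.75 means each doubling of cumulative volume
    # leaves 75 percent of the previous unit cost (a 25 percent decline).
    def unit_cost(cumulative_volume, first_unit_cost=100.0, progress_ratio=0.75):
        doublings = math.log2(cumulative_volume)
        return first_unit_cost * progress_ratio ** doublings

    for volume in (1, 2, 4, 8, 1_000_000):
        print(volume, round(unit_cost(volume), 2))
    # 1 100.0 | 2 75.0 | 4 56.25 | 8 42.19 | 1000000 0.32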
As an empirical phenomenon, the experience curve describes efficiency
increasing with experience and scale in the manufacturing of any product
- from pins to cookies, steel ingots to airplanes. At the outset of any
production process, uncertainty is high: No one knows how hard the
machinery can be pushed; managers must supervise closely, keep large
reserves of supplies on hand for emergencies, and maintain high
manufacturing tolerances, or margins for error. Without a substantial
body of production statistics over time, managers can't even tell
whether a defect signals a serious problem recurring in one of ten cases
or a trivial one occurring once in a million.
Considered more deeply, BCG's theorem captures the explosive increase in
efficiency resulting from the mixture of mind and matter, information
and energy. Governing each is entropy. Informational entropy measures
the content of a message through the "news" or surprises it contains -
the number of unexpected bits. While in communications you want
unexpected news
(high entropy), in a manufacturing process you want predictability (low
entropy). Thermodynamic entropy measures wasted heat and movement:
unrecoverable energy. High informational entropy produces high physical
entropy, but in any industrial experience curve, the two forms of
entropy are being reduced: energy waste and informational uncertainty.
The combination of these two negentropic trends accounts for the 20 to
30 percent improvement in productivity.
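The informational half can be made concrete with Shannon's formula,
which measures the average surprise in a stream of outcomes in bits. A
minimal sketch in Python; the defect rates echo the one-in-ten versus
one-in-a-million contrast above and are otherwise hypothetical:

    import math

    # Average surprise per outcome, in bits.
    def shannon_entropy(probabilities):
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    mature_line = [0.999999, 0.000001]   # one defect in a million
    immature_line = [0.9, 0.1]           # one defect in ten

    print(shannon_entropy(mature_line))    # ~0.00002 bits: almost no news per part
    print(shannon_entropy(immature_line))  # ~0.47 bits: every part carries real news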
One striking early demonstration of experience curve magic is found in
the history of television, when the chair of the FCC decreed that all
future TV sets must contain UHF tuners. Gordon Moore's colleague at
Fairchild, salesman Jerry Sanders
(now the chair of AMD), knew that among all companies in the world, only
his possessed a chip that could do the job: the 1211 transistor. At the
time, he was selling the device to the military in small numbers for
$150 apiece; since each one cost $100 to build, this brought a $50 gross
margin. But Sanders salivated at the prospect of lowering the price a
bit and selling large quantities, making Fairchild the world's largest
vendor of components for TVs. Then came the bad news. RCA announced a
newfangled vacuum tube called the Nuvistor that could also do the job
(though not as well) and priced it at $1.05, less than a hundredth of
the price of the 1211 transistor.
With production volumes set to rise from the hundreds for military
applications into the millions for TVs, Fairchild's Bob Noyce and Gordon
Moore foresaw economies of scale that would allow a drastically lower
price: They told Sanders to sell the 1211 to TV makers for $5. Sanders
ended up diving further, meeting the Nuvistor's price of $1.05 and then
going far below it as volume continued to increase. Between 1963 and
1965, Fairchild won 90 percent of the UHF tuner market in the US. The
more chips the company made, the cheaper they got, the larger the market
they commanded, and the more money Fairchild made on the product. By the
early 1970s, Fairchild was selling the
1211 for 15 cents apiece.
But if every production process obeys the experience curve, what made
the 1211's saga so striking? Time. In Henderson's theory, volume is
crucial to efficiency and learning, but there is no measure of how fast
the larger volumes can be produced. Moore's law, on the other hand, is
not only explicit on the subject of time, but it is also unprecedented
in its pace. By contrast, beginning in 1915, it took automobile
production volume not 18 months - but 60 - to double, and another
60 to double again.
What governs production time is the availability of key resources, the
elasticity of demand (how much more of the product is purchased when the
price drops), and the physical possibilities of the materials and
systems applied. With respect to resources, as Moore was also the first
to point out, integrated circuits have a vast advantage over other
products: Silicon, oxygen, and aluminum are the three most common
elements in Earth's crust. Unlike farmers or freeway contractors, who
inevitably face diminishing returns as they use up soil and real estate,
microchip manufacturers chiefly use up chip designs, which are products
of the human mind.
When it comes to demand, the magic of miniaturization allows Moore's law
to respond rapidly to almost any increase in the market. Take the case
of the 1211. In those days, each TV contained essentially only one
transistor, and the number of potential TV sales was limited more or
less to the number of households on the globe. That would mean mere
billions of transistors. With a total volume of billions, discrete
transistors like the 1211 could decrease in cost to the price of their
packages, about a dime apiece, but no further. But with the integrated
circuit, you could put ever-expanding numbers of transistors together on
a single silicon sliver; today one typical television set alone
contains billions of transistors.
More than an abundance of materials or elasticity of demand, however,
what makes Moore's law so powerful are the properties of the microcosm.
The ultimate science of semiconductors is quantum physics, not
thermodynamics. Rather than managing matter from the outside - lifting
it against gravity, moving it against friction, melting or burning it to
change its form - Moore and his team learned how to manipulate matter
from inside its atomic and molecular structure. In the microcosm, as
Richard Feynman proclaimed in a famous speech at Caltech in 1959, "there
is plenty of room at the bottom." As Moore's law moves transistors
closer together, wires between them become shorter. The shorter the
wires, the purer the signal and the lower the resistance, capacitance,
and heat per transistor. As electron movements approach their mean free
path - the distance they can travel without bouncing off the internal
atomic structure of the silicon - they get faster, cheaper, and cooler.
Quantum tunneling electrons, the fastest of all, emit virtually no heat.
Thus, the very act of crossing from the macrocosm to the microcosm meant
the creation of an industrial process that burst free of the bonds of
thermodynamic entropy afflicting all other industries. In the quantum
domain, as individual components became faster and more useful, they
also ran cooler and used less power.
If Moore's law were a mere oddity in the ongoing advance of technology,
it would be an extraordinary one. More remarkable, though, is that this
unprecedented change is not a blip but a beginning. From processors to
storage capacity, every technology touched by integrated electronics has
advanced at a radically new speed. Today, in fact, the 18-month pace of
Moore's law appears slow compared with the three-times-faster rate of
the advance of optics.
Emerging as the spearhead of global industrial progress is the
fiber-optic technology called wavelength division multiplexing. WDM
combines many different "colors" of light, each bearing billions of bits
per second on a single fiber thread the width of a human hair. The best
measure of the technology's advance is lambda-bit kilometers,
multiplying the number of wavelengths (lambdas) by the data capacity of
each and the distance each can travel without slow and costly electronic
regeneration of the signal. In 1995, the state of the art was a system
with 4 lambdas, each carrying 622 Mbits per second some 300 kilometers.
This year, a company named Corvis introduced a 280-lambda system, with
each lambda bearing 10 Gbits per second over a distance of 3,000
kilometers. This is an 11,000-fold advance in six years. With several
hundred fibers now sheathed in a single cable, a fiber installation in
the next two years or so will be able to carry more than a month's worth
of Internet traffic in a single second.
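The 11,000-fold figure follows directly from the definition of
lambda-bit kilometers. A minimal sketch in Python, using only the
numbers quoted above:

    # Wavelength count x per-wavelength capacity x unregenerated reach.
    def lambda_bit_km(lambdas, bits_per_second, kilometers):
        return lambdas * bits_per_second * kilometers

    system_1995 = lambda_bit_km(4, 622e6, 300)      # 4 lambdas, 622 Mbit/s, 300 km
    corvis_system = lambda_bit_km(280, 10e9, 3000)  # 280 lambdas, 10 Gbit/s, 3,000 km

    print(corvis_system / system_1995)   # ~11,254 -- the roughly 11,000-fold advance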
This process moves a step beyond the seminal effect of Moore's law
and the collapse of the price of computation. While the power of
microelectronics spreads intelligence through machines, sector by
sector, the power of communications diffuses intelligence through
networks - not just computer networks but companies, societies, and
the global economy.
And unlike silicon transistors, with their mass and expanse, photons are
essentially without mass, making the dematerialization that began with
semiconductors complete. Photonic carriers can multiply without weight
in the same physical space. Virtually any number of colors can occupy
the same fiber core. The new magic of optics feeds on the ultimate
low-entropy carrier - the perfect sine waves of electromagnetism - and
can plunge down curves of experience without mass or resistance through
worldwide webs of glass and light.
George Gilder (gg@gilder.com) is the author of Telecosm: How Infinite
Bandwidth Will Revolutionize Our World and Microcosm: The Quantum
Revolution in Economics and Technology, and the author and editor of the
Gilder Technology Report (www.gilder.com).
--
Walter Watts
Tulsa Network Solutions, Inc.

"To err is human. To really screw things up requires a bare-naked
command line and a wildcard operator."