June 18, 2008
This article was contributed by Valerie Henson
Moore's Law - we all know it (or at least think we do).
To be annoyingly exact, Moore's Law is a prediction that the number of
components per integrated circuit (for minimum cost per component)
will double every 24 months (revised up from every 12 months in the
original 1965 prediction). In slightly more useful form, Moore's
Law is often used as a shorthand for the continuing exponential growth
of computing technology in many areas - disk capacity, clock speed,
random access memory. Every time we approach the limit of some key
computer manufacturing technology, the same debate rages: Is this the
end of Moore's Law? So far, the answer has always been no.
But Moore's Law is inherently a statement about human ingenuity,
market forces, and physics. Whenever exponential growth falters in
one area - clock speed, or a particular mask technique - engineers
find some new area or new technique to improve at an exponential pace.
No individual technique experiences exponential growth for long;
instead, migration to new techniques occurs fast enough that the
overall growth rate continues to be exponential.
The discovery and improvement of manufacturing techniques is driven on
one end by demand for computation and limited on the other end by
physics. In between is a morass of politics, science, and plain old
engineering. It's hard to understand the myriad forces driving demand
and the many factors affecting innovation, including economies of scale,
cultural attitudes towards new ideas, vast marketing campaigns, and the
strange events that occur during the death throes of megacorporations.
By comparison, understanding the limits of computation is
easy, as long as you have a working knowledge of quantum physics,
information theory, and the properties of black holes.
The "Ultimate Laptop"
In a paper published in Nature in 2000, "Ultimate Physical Limits of
Computation" (free arXiv preprint [PDF] here), Dr. Seth Lloyd
calculates (and explains) the limits of
computing given our current knowledge of physics. Of course, we don't
know everything about physics yet - far from it - but just as in other
areas of engineering, we know enough to make some extremely
interesting predictions about the future of computation. This paper
wraps up existing work on the physical limits of computing and
introduces several novel results, most notably the ultimate speed
limit to computation. Most interesting, to my mind, is the calculation
of a surprisingly specific upper bound on how many years a generalized
Moore's Law can remain in effect (keep reading to find out exactly how
long!).
Dr. Lloyd begins by assuming that we have no idea what future computer
manufacturing technology will look like. Many discussions of the
future of Moore's Law center around physical limits on particular
manufacturing techniques, such as the limit on feature size in optical
masks imposed by the wavelength of light. Instead, he ignores
manufacturing entirely and uses several key physical constants: the
speed of light c, Planck's reduced constant h
(normally written as h-bar, a symbol not available in standard HTML,
so you'll have to just imagine the bar), the gravitational
constant G, and Boltzmann's constant kB. These
constants and our current limited understanding of general relativity
and quantum physics are enough to derive many important limits on
computing. Thus, these results don't depend on particular
manufacturing techniques.
The paper uses the device of the "Ultimate Laptop" to help make the
calculations concrete. The ultimate laptop is one kilogram in mass
and has a volume of one liter (coincidentally almost exactly the same
specs as a 2008 Eee PC), and
operates at the maximum physical limits of computing. Applying the
limits to the ultimate laptop gives you a feel for the kind of
computing power you can get in luggable format - disregarding battery
life, of course.
Energy limits speed
So, what are the limits? The paper begins by deriving the ultimate
limit on the number of computations per second. This depends on the
total energy, E, of the system, which can be calculated using
Einstein's famous equation relating mass and energy, E = mc^2.
(Told you we'd need to know the speed of light.)
Given the total energy of the system, we then need to know how quickly
the system can change from one distinguishable state to another -
i.e., flip bits. This turns out to be limited by the Heisenberg
uncertainty principle. Lloyd has this to say about the Heisenberg
uncertainty principle:
In particular, the correct interpretation of the time-energy
Heisenberg uncertainty principle ΔEΔt ≥ h
is not that it takes time Δt to measure energy to an accuracy
ΔE (a fallacy that was put to rest by Aharonov and Bohm) but
rather that a quantum state with spread in energy ΔE takes
time at least Δt = πh/2ΔE to evolve to an
orthogonal (and hence distinguishable) state. More recently, Margolus
and Levitin extended this result to show that a quantum system with
average energy E takes time at least Δt = πh/2E
to evolve to an orthogonal state.
In other words, the Heisenberg uncertainty principle implies that a
system will take a minimum amount of time to change in some observable
way, and that the time is related to the total energy of the system.
The result is that a system of energy E can
perform 2E/πh logical operations per second (a logical
operation is, for example, performing the AND operation on two bits of
input - think of it, roughly, as a single-bit operation). Since the
ultimate laptop has a mass of 1 kilo, it has energy
E = mc^2 = 8.9874 x 10^16 joules. The ultimate
laptop can perform a maximum of 5.4258 x 10^50 operations
per second.
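For readers who like to check the arithmetic, here is a small Python
sketch of that calculation; the constants are the standard published
values, and the only physics it assumes is the 2E/πh bound quoted
above:

    import math

    c    = 2.9979e8      # speed of light, m/s
    hbar = 1.0546e-34    # reduced Planck constant (h-bar), J*s

    mass = 1.0                       # the ultimate laptop: one kilogram
    energy = mass * c**2             # E = mc^2, in joules
    ops_per_second = 2 * energy / (math.pi * hbar)   # Margolus-Levitin bound

    print("E = %.4e J" % energy)                  # ~8.99e16 J
    print("max ops/s = %.4e" % ops_per_second)    # ~5.4e50 single-bit ops/s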
How close are we to the 5 x 10^50 operations per second
today? Each of these operations is basically a single-bit operation,
so we have to convert current measurements of performance to their
single-bit operations per second (SBOPS) equivalents. The most
commonly available measure of operations per second is FLOPS (floating
point operations per second) as measured by LINPACK (see
the Wikipedia page on
FLOPS). Estimating the exact number of actual physical single-bit
operations involved in a single 32-bit floating point operation would
require proprietary knowledge of the FPU implementation, and the
number of FLOPS as reported by LINPACK varies wildly depending on
compiler optimization level as well. For this article, we'll make a
wild estimate of 1000 single-bit operations per floating point
operation - that is, 1000 SBOPS per FLOPS - and ask anyone with a
better estimate to please post it in a comment.
With our FLOPS to SBOPS conversion factor of 1000, the current LINPACK
record holder, the Roadrunner supercomputer (near my home town,
Albuquerque, New Mexico), reaches speeds of one petaflop, or
1000 x 10^15 = 1 x 10^18
SBOPS. But that's for an entire
supercomputer - the ultimate laptop is only one kilo in mass and one
liter in volume. Current laptop-friendly CPUs are around one
gigaflop, or 10^12 SBOPS, leaving us about 39 orders of
magnitude to go before hitting the theoretical physical limit of
computational speed. Finally, existing quantum computers have already
attained the ultimate limit on computational speed - on a very small
number of bits and in a research setting, but attained it nonetheless.
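That 39-orders-of-magnitude gap is easy to reproduce; here is a
back-of-the-envelope Python version, using the wild 1000-SBOPS-per-FLOPS
guess from above:

    import math

    SBOPS_PER_FLOP = 1000            # the wild guess above, not a measured value

    ultimate   = 5.4258e50                    # physical limit for 1 kg, ops/s
    roadrunner = 1e15 * SBOPS_PER_FLOP        # one petaflop -> 1e18 SBOPS
    laptop_cpu = 1e9  * SBOPS_PER_FLOP        # ~one gigaflop -> 1e12 SBOPS

    print(math.log10(ultimate / laptop_cpu))  # ~38.7 orders of magnitude to go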
Entropy limits memory
What we really want to know about the ultimate laptop is how many
legally purchased DVDs we can store on it. The amount of data a
system can store is a function of the number of distinguishable
physical states it can take on - each distinct configuration of memory
requires a distinct physical state. According to Lloyd, we have
"known for more than a century that the number of accessible states of
a physical system, W, is related to its thermodynamic entropy
by the formula: S = kB ln W" (S is the thermodynamic
entropy of the system). This means we can calculate the number of bits
the ultimate laptop can store if we know what its total entropy is.
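The conversion itself is straightforward; here is a minimal Python
sketch of it (the function name is my own, not anything from the
paper):

    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K

    def bits_from_entropy(S):
        """Bits storable in a system with thermodynamic entropy S (in J/K).

        From S = k_B ln W, the number of distinguishable states is
        W = exp(S / k_B), which encodes log2(W) = S / (k_B ln 2) bits.
        """
        return S / (k_B * math.log(2))

    # Working backwards from the paper's 2.13e31-bit result (quoted below)
    # gives an entropy of roughly 2.13e31 * k_B * ln 2, or about 2e8 J/K,
    # for the kilogram of matter.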
Calculating the exact entropy of a system turns out to be hard. From
the paper:
To calculate exactly the maximum entropy for a kilogram of matter in a
liter volume would require complete knowledge of the dynamics of
elementary particles, quantum gravity, etc. We do not possess such
knowledge. However, the maximum entropy can readily be estimated by a
method reminiscent of that used to calculate thermodynamic quantities
in the early universe. The idea is simple: model the volume occupied
by the computer as a collection of modes of elementary particles with
total average energy E.
The following discussion is pretty heavy going; for example, it
includes a note that baryon number may not be conserved in the case of
black hole computing, something I'll have to take Lloyd's word on. But
the end result is that the ultimate laptop, operating at maximum
entropy, could store at least 2.13 x 10^31 bits. Of course,
maximum entropy means that all of the laptop's matter is converted to
energy - basically, the equivalent of a thermonuclear explosion. As
Lloyd notes, "Clearly, packaging issues alone make it unlikely that
this limit can be obtained." Perhaps a follow-on paper can discuss
the Ultimate Laptop Bag...
How close are modern computers to this limit? A modern laptop in 2008
can store up to 250GB - about 2 x 10^12 bits. We're about
19 orders of magnitude away from maximum storage capacity, or about 64
more doublings in capacity. Disk capacity as measured in bits per
square inch has
doubled about
30 times between 1956 and 2005, and at this historical rate, 64
more doublings will only take about 50 - 100 years. This
isn't the overall limit on Moore's law as applied to computing, but it
suggests the possibility of an end to Moore's law as applied to
storage within some of our lifetimes. I guess we file system
developers should think about second careers...
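The doubling math, as a quick Python sketch (the 2005/1956 figures are
from the disk-density history linked above; everything else is
arithmetic):

    import math

    current_bits  = 2e12       # ~250 GB laptop disk in 2008
    ultimate_bits = 2.13e31    # Lloyd's entropy bound for 1 kg in 1 liter

    doublings = math.log2(ultimate_bits / current_bits)   # ~63 doublings left
    years_per_doubling = (2005 - 1956) / 30               # ~1.6 years, long-run average

    print("%.0f doublings, ~%.0f years at the 1956-2005 average rate"
          % (doublings, doublings * years_per_doubling))
    # Areal density doubled considerably faster in recent years, which is
    # what pulls the estimate down toward the 50-100 year range given above.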
Redundancy and error correction
Existing computers don't approach the physical limits of computing for
many good reasons. As Lloyd wryly observes, "Most of the energy [of
existing computers] is locked up in the mass of the particles of which
the computer is constructed, leaving only an infinitesimal fraction
for performing logic." Storage of a single bit in DRAM uses "billions
and billions of degrees of freedom" - electrons, for example - instead of
just one degree of freedom. Existing computers tend to conduct
computation at temperatures at which matter remains in the form of
atoms instead of plasma.
Another fascinating practical limit on computation is the error rate
of operations, which is bounded by the rate at which the computer can
shed heat to the environment. As it turns out, logical operations
don't inherently require the dissipation of energy, as von Neumann
originally theorized. Reversible operations (such as NOT), which do
not destroy information, do not inherently require the dissipation of
energy; only irreversible operations (such as AND) do. This makes some
sense intuitively: the only way to destroy (erase) a bit is to turn
that information into heat; otherwise, the bit has just been moved
somewhere else and the information it represents is still there.
Reversible computation has been implemented and shown to have
extremely low power dissipation.
Of course, some energy will always be dissipated, whether or not the
computation is reversible. However, the erasure of bits - in
particular, errors - requires a minimum expenditure of energy. The
rate at which the system can "reject errors to the environment" in the
form of heat limits the rate of bit errors in the system; or,
conversely, the rate of bit errors combined with the rate of heat
transfer out of the system limits the rate of bit operations. Lloyd
estimates the rate at which the system can reject error bits to the
environment, relative to the surface area and assuming black-body
radiation, as 7.195 x 10^42 bits per meter^2 per
second.
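The "minimum expenditure of energy" for erasing a bit is usually
called the Landauer limit, k_B T ln 2 per bit; here is a quick sketch
of its size at room temperature (my own illustration, not a number
from the paper):

    import math

    k_B = 1.380649e-23     # Boltzmann's constant, J/K
    T   = 300              # room temperature, kelvin

    landauer = k_B * T * math.log(2)           # minimum energy to erase one bit
    print("%.2e J per erased bit" % landauer)  # ~2.9e-21 J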
Computational limits of "smart dust"
Right around the same time that I read the "Ultimate Limits" paper, I
also read
A
Deepness in the Sky by Vernor Vinge, one of many science fiction
books featuring some form of "smart dust." Smart dust is the concept
of tiny computing elements scattered around the environment which
operate as a sort of low-powered distributed computer. The smart dust
in Vinge's book had enough storage for an entire systems manual, which
initially struck me as a ludicrously large amount of storage for
something the size of a grain of dust. So I sat down and calculated the
limits of storage and computation for a computer one μm^3
in size, under the constraint that its matter remain in the form of
atoms (rather than plasma).
Lloyd calculates that, under these conditions, the ultimate laptop
(one kilogram in one liter) can store about 10^25 bits and
conduct 10^40 single-bit operations per second. The
ultimate laptop is one liter and there are 10^15
μm^3 in a liter. Dividing the total storage and
operations per second by 10^15 gives us 10^10 bits
and 10^25 operations per second - about 1 gigabyte in data
storage and so many FLOPS that the prefixes are meaningless.
Basically, the computing potential of a piece of dust far exceeds the
biggest supercomputer on the planet - sci-fi authors, go wild! Of
course, none of these calculations take into account power delivery or
I/O bandwidth, which may well turn out to be far more important limits
on computation.
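The scaling argument is just a division; a minimal Python sketch,
assuming, as in the division above, that both storage capacity and
operation rate scale linearly with the volume of matter:

    laptop_bits   = 1e25     # Lloyd's estimate with matter kept as atoms, not plasma
    laptop_ops    = 1e40     # single-bit ops/s under the same constraint
    um3_per_liter = 1e15     # cubic micrometers in one liter

    dust_bits = laptop_bits / um3_per_liter   # ~1e10 bits, about a gigabyte
    dust_ops  = laptop_ops  / um3_per_liter   # ~1e25 single-bit ops/s
    print(dust_bits, dust_ops)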
Implications of the ultimate laptop
Calculating the limits of the ultimate laptop has been a lot of fun,
but what does it mean for computer science today? We know enough now
to derive a theoretical upper bound for how long a generalized Moore's
Law can remain in effect. Current laptops store 10^12 bits
and conduct 10^12 single-bit operations per second. The
ultimate laptop can store 10^31 bits and conduct
10^51 single-bit operations per second, a gap of a factor of
10^19 and 10^39 respectively. Lloyd estimates the
rate of Moore's Law as a factor of 10^8 improvement in areal
bit density over the past 50 years. Assuming that both storage
density and computational speed will improve by a factor of
10^8 per 50 years, the limit will be reached in about 125
years for storage and about 250 years for operations per second. One
imagines the final 125 years being spent frantically developing better
compression algorithms - or advanced theoretical physics research.
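Those 125- and 250-year figures follow directly from the gaps and the
assumed factor of 10^8 per 50 years; here is a Python sketch of the
extrapolation:

    import math

    RATE_PER_50_YEARS = 1e8      # Lloyd's estimate of Moore's Law's historical pace

    storage_gap = 1e31 / 1e12    # ultimate laptop bits vs. 2008 laptop bits -> 1e19
    speed_gap   = 1e51 / 1e12    # ultimate ops/s vs. 2008 laptop ops/s      -> 1e39

    def years_to_limit(gap):
        return 50 * math.log10(gap) / math.log10(RATE_PER_50_YEARS)

    print(years_to_limit(storage_gap))   # ~119 years, i.e. roughly 125
    print(years_to_limit(speed_gap))     # ~244 years, i.e. roughly 250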
Once Moore's Law comes to a halt, the only way to increase computing
power will be to increase the mass and volume of the computer, which
will also encounter fundamental limits. An unpublished paper entitled
Universal Limits on
Computation estimates that the entire computing capacity of the
universe would be exhausted after only 600 years under Moore's Law.
250 years is a fascinating in-between length of time. It's too far
away to be relevant to anyone alive today, but it's close enough that
we can't entirely ignore it. Typical planning horizons for long-term
human endeavors (like managing ecosystems) tend to max out around 300
years, so perhaps it's not unthinkable to begin planning for the end
of Moore's Law. Me, I'm going to start work on the LZVH compression
algorithm, tomorrow.
One thing is clear: we live in the Golden Age of computing. Let's
make the most of it.
Valerie Henson is a Linux consultant
specializing in file systems and owns a one kilo, one liter laptop.