Foresight Update 4 (page 2)
A publication of the Foresight Institute
Manufacturing with Nanotechnology
Viewpoint by Jerry Fass
Nanotechnology-based manufacturing techniques should yield great increases
in productivity and wealth. Improvements in two techniques in particular
will greatly decrease resource requirements: the incorporation of voids, and
wearproofing.
Whenever possible, objects can incorporate carefully shaped voids to save
cost and mass. Generally, voids are more useful for large systems or those
under low loads. They can range in size from arbitrarily large down to a
fraction of a nanometer wide; the upper limit is set by device size, the
lower by the scale of atoms. For structures under light compressive loads,
voids formed in fractal patterns can yield maximum efficiency.
Today's bulk manufacturing can produce large, irregular voids at reasonable
cost, as in foam rubber and insulation. Nanomachines should be able to produce
uniform voids down to one atom across, thereby cutting the mass, cost, energy,
and time needed for production. The biggest gains will be for objects with
structural loads in pure compression or mixed compression and tension; fortunately
this includes the majority of objects we use, such as furniture, doors,
most walls, and appliances. The void fraction of these could be very high,
perhaps 99% or more. Highly loaded objects (e.g., engine parts) will benefit
less, and highly loaded tension systems (e.g., cables and pressure vessels)
will benefit little.
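To get a feel for such numbers, here is a toy sketch in Python, assuming one
simple fractal void scheme: a Menger-sponge-like recursion in which each level
hollows out 7 of each cube's 27 sub-cubes.

def solid_fraction(levels: int) -> float:
    # Fraction of the original mass left after this many levels of recursion;
    # each level keeps 20 of 27 sub-cubes.
    return (20 / 27) ** levels

levels = 1
while 1 - solid_fraction(levels) < 0.99:
    levels += 1
print(levels, "levels of recursion reach a 99% void fraction")  # prints 16

Under this assumed geometry, about sixteen levels of recursion suffice for a
99% void fraction, so the figure above is geometrically quite modest.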
Incorporating voids, combined with scavenging heavy pre-nanotechnology parts,
will allow us to recycle old systems into multiple new ones without new
material resources, reducing the need for mining and refining.
[Image: Van der Waals cylinder-and-sleeve. © K. Eric Drexler]
Wear limits the lives of mechanical and structural systems, which often
attain a reasonable lifetime only by having worn-out parts replaced. (An
annoying example is the modern automobile.) Wear is cumulative and can seem
exponential, as worn parts increase wear on other parts. The aim of wearproofing
is to head off the wear process, with the increasingly ambitious goals of
longer-lasting parts, zero-wear parts, and finally self-repair.
Wear on tools can be reduced even for bulk processes by forming parts using
non-contact methods such as explosives, lasers, electron beams, plasma torches,
water jets, and electromagnetic forming instead of drill bits, grinding
wheels, and the like. There may be uses where such macro-tools forever outperform
nano-tools: perhaps in well drilling, tunneling, and excavating.
Using nanotechnology, we can expect improvements in:
- Tribology--the science of why and how objects wear. Nanomachines should
greatly aid collection of data needed to further advance this field.
- Hardness--surfaces of harder materials wear more slowly. Surfaces
of ceramic or diamond, or perhaps the new form of carbon, "C8",
reported by Soviet researchers, will last much longer. Nanomachines could
apply such coatings, and powerful computers may allow us to design new ones.
- Friction control--lubricants and bearings. All three means of lubrication--solids
(Teflon, graphite), liquids (oil), and gases (air)--are improving rapidly
thanks to improved data and computer analysis. Contact bearings will benefit
from ever-tighter tolerances and more rugged materials. A revolutionary
non-contact bearing, the electromagnetic bearing, repulsively or attractively
levitates moving parts in a magnetic or electric field, with zero friction;
wear can often be practically eliminated by having such a bearing gently
"land" a part after it stops moving. The new high-temperature
superconductors will make such bearings smaller, stronger, and more precise;
since they are often computer-controlled, nanocomputers will be helpful too
(a control-loop sketch follows this list). And of course, Drexler's suggested
atomically-precise sigma bond and van der Waals bearings will not wear in
any conventional sense.
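For the flavor of that computer control, here is a toy Python sketch of a
proportional-derivative loop, under drastic simplifying assumptions (one
dimension, a directly commandable coil force, no sensor noise or delay). It
holds a levitated part at a set gap, then ramps the gap down so the part
lands gently once service stops:

m, g, dt = 1.0, 9.8, 0.001            # mass (kg), gravity (m/s^2), timestep (s)
kp, kd = 400.0, 40.0                  # hand-tuned PD gains for this toy model
x, v, x_ref = 0.010, 0.0, 0.010       # gap (m), velocity (m/s), reference gap

for step in range(20000):
    if step > 10000:                  # after 10 s, begin a gentle landing
        x_ref = max(0.0, x_ref - 1e-6)
    force = m * g + kp * (x_ref - x) - kd * v   # gravity feedforward plus PD
    v += (force / m - g) * dt
    x = max(0.0, x + v * dt)          # the part rests once it touches down

print(f"final gap {x * 1000:.3f} mm, final speed {abs(v) * 1000:.3f} mm/s")

The descent rate is set by the reference ramp, about a millimeter per second
here, which is the sense in which such a bearing can "land" a part gently.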
Synergies between the above techniques can be expected: for example, making
an object with voids but covering its surface with diamond. And besides
saving energy in manufacturing, we can expect to do so in transport as well:
objects will last longer and so need to be delivered less often, they will
weigh less when they do need transport, and with nanoproduction systems--quiet,
small, flexible, and clean--manufacturing on-site becomes a possibility.
But eventually, we can expect self-repair to solve the wear problem.
Jerry Fass is a part-time science writer based in Wisconsin. He also
coordinates FI's journal monitoring project.
How Many Bytes in Human Memory?
by Ralph Merkle
[Note: A related article on the computational
limits of the human brain is available in Update 6.]
Today it is commonplace to compare the human brain to a computer, and the
human mind to a program running on that computer. Once seen as just a poetic
metaphor, this viewpoint is now supported by most philosophers of human
consciousness and most researchers in artificial intelligence. If we take
this view literally, then just as we can ask how many megabytes of RAM a
PC has, we should be able to ask how many megabytes (or gigabytes, or terabytes,
or whatever) of memory the human brain has.
Several approximations to this number have already appeared in the literature
based on 'hardware' considerations (though in the case of the human brain
perhaps the term 'wetware' is more appropriate). One estimate of 10^20
bits is actually an early estimate (by Von
Neumann in The Computer and the Brain) of all the neural
impulses conducted in the brain during a lifetime. This number is almost
certainly larger than the true answer. Another method is to estimate the
total number of synapses, and then presume that each synapse can hold a
few bits. Estimates of the number of synapses have been made in the range
from 10^13 to 10^15, with corresponding estimates of memory capacity.
A fundamental problem with these approaches is that they rely on rather
poor estimates of the raw hardware in the system. The brain is highly redundant
and not well understood: the mere fact that a great mass of synapses exists
does not imply that they are in fact all contributing to memory capacity.
This makes the work of Thomas
K. Landauer very interesting, for he has entirely avoided this hardware
guessing game by measuring the actual functional capacity of human memory
directly (See "How Much Do People Remember? Some Estimates of the Quantity
of Learned Information in Long-term Memory", in Cognitive Science
10, 477-493, 1986).
Landauer works at Bell Communications Research--closely affiliated with
Bell Labs where the modern study of information
theory was begun by C.
E. Shannon to analyze the information carrying capacity of telephone
lines (a subject of great interest to a telephone company). Landauer naturally
used these tools by viewing human memory as a novel 'telephone line' that
carries information from the past to the future. The capacity of this 'telephone
line' can be determined by measuring the information that goes in and the
information that comes out, and then applying the great power of modern
information theory.
Landauer reviewed and quantitatively analyzed experiments by himself and
others in which people were asked to read text, look at pictures, and hear
words, short passages of music, sentences, and nonsense syllables. After
delays ranging from minutes to days the subjects were tested to determine
how much they had retained. The tests were quite sensitive--they did not
merely ask 'What do you remember?' but often used true/false or multiple
choice questions, in which even a vague memory of the material would allow
selection of the correct choice. Often, the differential abilities of a
group that had been exposed to the material and another group that had not
been exposed to the material were used. The difference in the scores between
the two groups was used to estimate the amount actually remembered (to control
for the number of correct answers an intelligent human could guess without
ever having seen the material). Because experiments by many different experimenters
were summarized and analyzed, the results of the analysis are fairly robust;
they are insensitive to fine details or specific conditions of one or another
experiment. Finally, the amount remembered was divided by the time allotted
to memorization to determine the number of bits remembered per second.
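One simplified way to turn such scores into bits, sketched below in Python, is
to credit to memory only the exposed group's accuracy in excess of guessing.
(This is an illustration of the general approach; Landauer's actual
corrections are more refined, and the numbers here are made up.)

from math import log2

def bits_per_second(p_exposed: float, p_control: float,
                    n_choices: int, n_items: int, study_seconds: float) -> float:
    # Fraction of items actually retained, after subtracting what the
    # unexposed control group gets right by guessing alone.
    retained = (p_exposed - p_control) / (1 - p_control)
    # Each retained answer on an n-choice question is worth log2(n) bits.
    return retained * n_items * log2(n_choices) / study_seconds

# Hypothetical numbers: 100 four-choice questions after 5 minutes of study;
# the exposed group scores 60%, the control group 25% (chance).
print(f"{bits_per_second(0.60, 0.25, 4, 100, 300):.2f} bits per second")

With these invented numbers the rate comes out near 0.3 bits per second;
Landauer's pooled data put the real figure near two.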
The remarkable result of this work was that human beings remembered very
nearly two bits per second under all the experimental conditions.
Visual, verbal, musical, or whatever--two bits per second. Continued over
a lifetime, this rate of memorization would produce somewhat over 10^9
bits, or a few hundred megabytes.
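The arithmetic is easy to check, assuming, say, seventy years of
sixteen-hour waking days:

bits_per_second = 2
waking_seconds = 70 * 365 * 16 * 3600        # about 1.5 * 10^9 seconds
lifetime_bits = bits_per_second * waking_seconds
print(f"{lifetime_bits:.1e} bits, or {lifetime_bits / 8 / 1e6:.0f} megabytes")
# prints: 2.9e+09 bits, or 368 megabytes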
While this estimate is probably only accurate to within an order of magnitude,
Landauer says "We need answers at this level of accuracy to think about
such questions as: What sort of storage and retrieval capacities will computers
need to mimic human performance? What sort of physical unit should we expect
to constitute the elements of information storage in the brain: molecular
parts, synaptic junctions, whole cells, or cell-circuits? What kinds of
coding and storage methods are reasonable to postulate for the neural support
of human capabilities? In modeling or mimicking human intelligence, what
size of memory and what efficiencies of use should we imagine we are copying?
How much would a robot need to know to match a person?"
What is interesting about Landauer's estimate is its small size. Perhaps
more interesting is the trend--from Von Neumann's early and very high estimate,
to the high estimates based on rough synapse counts, to a better supported
and more modest estimate based on information theoretic considerations.
While Landauer doesn't measure everything (he did not measure, for example,
the bit rate in learning to ride a bicycle, nor does his estimate even consider
the size of 'working memory') his estimate of memory capacity suggests that
the capabilities of the human brain are more approachable than we had thought.
While this might come as a blow to our egos, it suggests that we could build
a device with the skills and abilities of a human being with little more
hardware than we now have--if only we knew the correct way to organize that
hardware. This article is also available on Dr. Merkle's Web site.
Dr. Merkle's interests range from neurophysiology to computer security.
He recently spoke on nanotechnology and biostasis at the Life Against Death
Conference in San Francisco.
The Road to Nanomachine Design
by Thomas Donaldson
One of K. Eric Drexler's contributions to nanotechnology has been his success
in estimating the behavior of nanomachines with simple mechanical
calculations. Ultimately, however, these exploratory engineering calculations
remain approximations only. Serious nanomachine design will require much
more: almost certainly, very powerful computers able to carry out
dynamic calculations on large molecules. Such calculations demand enormous
computer power. Specialized chemical workstations with prices in the range
of $200,000 already exist.
To speed nanotechnology along, what we really want is lower-priced computers,
ideally costing no more than a Mac II. There is a wide-open road to just
such a computer. The technology for chemical design workstations costing about
$40,000 exists right now, for the trouble of assembling a system from standard
boards (unfortunately not yet done). The same parts will cost far less in
a few years (so Popular NanoMechanics may start publication before long).
The technology depends on the Transputer, a chip specially designed for
parallel processing. Computer System
Architects sells IBM PC boards with 16 Transputer chips and 16 megabytes
of memory for $28,000.
Chemical Design Ltd,
a British company, already sells a chemical design workstation, the MITIE
1000, which can contain as many as 36 independent Transputers. The smallest
MITIE 1000 sells for $170,000. The MITIE calculates as much as 72 times
faster than a VAX 8600, analyzing the conformation and dynamics of large
molecules at supercomputer speeds.
The MITIE contains a microVAX as a host machine; the remaining modules run
on the VAX. Chemical Design has about 250 customers around the world, including
Glaxo, Rhone-Poulenc, Fisons, Dupont, American Cyanamid, Merck, and Hoffmann-LaRoche.
The program ChemX contains specific modules for building and displaying
the molecule (ChemCore), modelling molecules (fitting, analyzing the conformation:
ChemModel), designing proteins (ChemProtein), and carrying out calculations
to find minimum energy states (ChemQM). There are also library modules to
maintain a large database (ChemLib: the recommended size of hard disk for
a single-user system is 70 Megabytes).
Any molecular machine we design must be chemically stable in the environment
for which we design it. We must therefore make sure not just that the molecule
would be stable if isolated from all other chemicals but also that the system
will withstand likely chemical attack. Molecules will try to attain minimum
energy states, and their excited states are also of interest. To resolve
all of these issues will require very fast chemical design software. Ultimately
software for nanomachine design will do much more, but even existing chemical
design software running on an affordable workstation puts us far ahead.
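For a feel of what finding a minimum energy state involves, here is a toy
Python sketch of the general technique: gradient descent on a single
Lennard-Jones pair. A real package must treat thousands of interacting atoms
at once, which is why such calculations need so much computer power.

def lj_energy(r: float, epsilon: float = 1.0, sigma: float = 1.0) -> float:
    # Lennard-Jones pair potential
    return 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

def lj_force(r: float, epsilon: float = 1.0, sigma: float = 1.0) -> float:
    # force = -dE/dr; positive values push the pair apart
    return 4 * epsilon * (12 * sigma**12 / r**13 - 6 * sigma**6 / r**7)

r, step_size = 1.5, 0.01
for _ in range(2000):
    r += step_size * lj_force(r)     # slide downhill in energy
print(f"minimum near r = {r:.4f} (analytic answer: 2**(1/6) = 1.1225)")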
What about the software? Unfortunately, porting software to a parallel computer
usually requires a total rewrite of any modules in which you expect to use
parallelism. Porting should be a cooperative effort between someone versed
in parallel computing and someone versed in chemical design software. When
someone will get a chemical design show on the road by porting software to
a Mac II-transputer system isn't clear to me. My own expertise lies in
parallel computing. Anyone interested can reach me through the Foresight
Institute.
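To show the kind of restructuring such a port involves, here is a generic
Python sketch (a present-day illustration only; transputers were actually
programmed in Occam): a serial energy scan reorganized so that independent
evaluations run on separate processors.

from multiprocessing import Pool

def conformer_energy(angle: float) -> float:
    # stand-in for an expensive molecular energy calculation
    return (angle - 1.0) ** 2

angles = [i * 0.1 for i in range(100)]

if __name__ == "__main__":
    serial = [conformer_energy(a) for a in angles]   # one processor, one loop
    with Pool() as pool:                             # the rewritten, parallel form
        parallel = pool.map(conformer_energy, angles)
    assert parallel == serial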
Dr. Thomas Donaldson currently writes software for a transputer machine
for the FEM market. He pioneered the idea of artificial enzyme systems as
an approach to cell repair.
Note: One example of recent use of parallel computers
for complex molecular dynamics calculations can be found on the web at:
Foresight thanks Dave Kilbridge for converting Update 4 to html for this
Web page.
From Foresight Update 4, originally published 15
Foresight materials on the Web are ©1986-1997 Foresight Institute.
All rights reserved.