As a result of the U.S. budget wrangling last fall, government funding for
research rose some twelve percent overall. Both Congress and the President
seem to believe, vaguely, that research is good. Congress sees technology
as the key to improving America's industrial competitiveness.
It has boosted funding for a wide range of "critical" or "pre-competitive"
technologies. Congress is also contemplating a change to the research and
experimentation tax credit which could encourage more research by private
companies. Meanwhile, President Bush has concentrated his efforts on a few
specific initiatives, such as high-performance computing. [Nature,
348:97, 8Nov90; Science, p747, 9Nov90].
The bad news may be the way in which that money is being spent. According
to Nature [347:697, 25Oct90], NASA will receive
some $13.9 billion this year, half the Federal research budget. The National
Science Foundation, by contrast, will get only about $2.4 billion, roughly
the cost of NASA's newest Space Shuttle. Even ardent fans of manned spaceflight
may question these priorities, considering the potential of new technologies
for extending human capabilities in space and elsewhere.
There's worse news in the method by which research money is now being allocated:
In the past, the science establishment worked out a unified program each
year and collectively lobbied Congress for funding. Last year, however,
a group of dissident biologists split from the pack and hired their own
lobbyist [Nature, 348:270, 22Nov90]. This is
a dangerous precedent. If other groups follow the biologists' lead, Congress
may begin distributing research funds on the basis of political pull, rather
than scientific merit (as perceived by the science establishment). At best,
this would mean worthy projects would not be funded. At worst, scarce research
money would be directed to projects which could not possibly succeed.
The current budget contains at least one such project, pork-barrelled by
Senator Ted Stevens (R-Alaska), who arranged a $34 million grant
for the University of Alaska to "harness the electrojet" as a
source of electrical power. The electrojet is an electric current high in
the ionosphere, related to the aurora borealis; tapping it for power is
about as practical as feeding lightning into the power grid. The recipients
of the grant, knowing it to be scientifically unsound, managed to develop
an elaborate rationalization which allowed them to accept the money anyway
[Nature, 348:101, 8Nov90]. Of course, $34
million would go a long way toward developing nanotechnology.
The Commission of the European Communities has decided to join Japan's Human
Frontier Science Program (HFSP). HFSP is the Japanese government's leading
international research program, although international financial participation
has been slow to materialize. This new agreement means that smaller European
countries, outside the "G7" group of nations, will be able to
take part in the program. Several U.S. researchers have received HFSP grants,
but the U.S. government still views HFSP with a certain amount of suspicion.
Among other goals, HFSP is investigating aspects of biochemistry and molecular
assembly, a possible path to molecular machinery [Nature].
Two Englishmen have created a computer-based system for extracting better
decisions from a group of experts. The computer asks the group a series
of questions related to the decision, and each individual enters his opinions
on a numeric keypad. The computer weights and tabulates the responses and
displays the combined result. So far, this is a standard technique from
decision analysis. But the computer also displays histograms of the input
data. This allows a moderator to isolate areas of disagreement and investigate
them further. One individual may have an insight which others lack; the
moderator can spot such discrepancies on the histogram, and ask the stray
to explain his reasoning. From such debate a new consensus can emerge. In
theory, this happens at every committee meeting; in practice, the software
makes it happen far more reliably. The system,
called Teamworker, appears to be a genuine advance in complex decision-making
[Science, 250:367, 19Oct90].
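The arithmetic behind this is simple enough to sketch. The Python fragment
below is an illustration only, not Teamworker's actual code; the equal
weighting and the stray-detection threshold are assumptions made for the
example:

    # Sketch of the weight-and-tabulate step described above. The equal
    # weights and the stray threshold are illustrative assumptions.
    from collections import Counter

    def tabulate(responses, weights=None):
        """Combine each expert's keypad rating into a weighted mean."""
        if weights is None:
            weights = [1.0] * len(responses)
        return sum(w * r for w, r in zip(weights, responses)) / sum(weights)

    def histogram(responses):
        """Show how many experts chose each rating."""
        counts = Counter(responses)
        for rating in sorted(counts):
            print(f"{rating}: {'*' * counts[rating]}")

    def strays(responses, threshold=3.0):
        """Flag respondents far from the group mean -- the people the
        moderator should ask to explain their reasoning."""
        mean = tabulate(responses)
        return [i for i, r in enumerate(responses) if abs(r - mean) >= threshold]

    panel = [7, 8, 7, 2, 8]              # one expert disagrees sharply
    print("combined:", tabulate(panel))  # 6.4
    histogram(panel)
    print("ask respondent(s):", strays(panel))  # [3]

The histogram, not the combined number, is the interesting output: it makes
the lone dissenter visible at a glance.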
Japan, meanwhile, is attempting to automate the ways in which scientists
exchange information. The new National Academic Center for Science Information
Systems (NACSIS) includes technical and bibliographic databases and electronic
mail services. The system designers are paying particular attention to the
users' needs for communication. Some such tools are available in the U.S.,
but only as a haphazard collection of parts designed for other uses [Nature].
The Japan Technology Transfer Association (JTTAS) is setting up a research
project into new computing technologies, including neural and biological
computing. This project, called the International Institute of Novel Computing
(IINC), is distinct from the nascent "sixth-generation computer"
project proposed by Japan's well-known MITI (Ministry of International Trade
and Industry). JTTAS gets its funding largely from private sources and says
the two computer projects are complementary [Nature, 347:217].
MITI recently announced that it would spend some $171 million over the next
ten years to study "microtechnology." This term refers to miniature
machines created by bulk technology, not to molecular manufacturing, but
in Japan these techniques are seen as complementary. Germany is planning
to devote some $255 million over four years to similar research. The National
Science Foundation in the U.S. is supporting such research at a level of
$2 million a year. [Seattle Times, 7Sep90]
A recent paper by three Japanese researchers described a reversible three-state
photoelectrochemical reaction which might be used to make extremely dense
computer memories [Nature, 347:658, 18Oct90].
The researchers' affiliation is intriguing: Department of Synthetic Chemistry,
Faculty of Engineering, University of Tokyo. In the U.S., synthetic chemists
insist on being described as pure scientists, despite their role in designing
and building molecular objects not found in nature. Molecular engineering
will progress faster when those who do it feel as comfortable with the label
"engineer" as do the synthetic chemists at the University of Tokyo.
Stewart Cobb is an aerospace engineer and was an early member of the
MIT Nanotechnology Study Group.
Research in nanotechnology continues
to grow: the latest indicator is the recent interest in computational nanotechnology
here at the Xerox Palo Alto Research
Center. In December we bought a Silicon Graphics 4D/35 workstation (6
megaflops) and the Polygraf molecular modeling software from BioDesign.
This lets us model chemically stable structures with as many as 20,000 atoms,
including proposed bearings, mechanical molecular logic elements, molecular
structural elements, etc. In the future we expect to get software that will
model transition states and reactive structures. Such quantum-mechanical
techniques are far more computationally intensive, restricting analysis
to ten or twenty atoms, but providing greater accuracy.
What does all this mean?
There is an accelerating trend towards modeling new designs and new concepts
on the computer before building them. GM has found that "computational
car crashes" on a CRAY are cheaper and more flexible than real ones, and
yield more information. Pharmaceutical companies are investing
heavily in molecular modeling to investigate new drugs for similar reasons.
Xerox, at several different sites within the company and for diverse reasons,
is also pursuing this trend by modeling a range of chemical systems.
Seen against this backdrop, work in computational
nanotechnology (at PARC or anywhere else) is simply a continuation of
the trend: before you build a car, a copier, or an assembler, you should
first model it on a computer. This lets you review more designs more quickly
and more cheaply before actually building (expensive) physical systems;
it reduces the lag time from product conception to product delivery; and
it improves the quality of the final product.
While it's not entirely clear how long it will be until we achieve a flexible
molecular manufacturing capability, it *is* clear that we will get there
more quickly and with fewer false starts if we model the components of such
a system on a computer before actually building them.
Oversimplifying somewhat, there are two classes of molecular modeling software:
molecular mechanics systems and quantum mechanical systems. Molecular mechanics
usually treats the nuclei of atoms as classical Newtonian point masses moving
in a potential energy function (or conservative force field) defined by
the electron cloud around them. There is no attempt to determine where the
electrons actually are, or even to worry about the electrons at all. Rather,
the positions of the nuclei directly define the forces acting between them.
As an example, consider two hydrogen atoms bonded together to form a molecule.
Pushed closer than a characteristic equilibrium distance, the two nuclei
repel each other; pulled farther apart, they attract. While this repulsion
and attraction are actually the result of a complex quantum mechanical
interaction, they can be summarized simply by noting the attractive or
repulsive force acting between the two nuclei as a function of their distance.
A complex quantum mechanical interaction is thus accurately captured by a
simple graph. We don't know the actual electron distribution that produced
the forces acting on the two nuclei, and we don't care.
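That graph has a standard empirical form. The sketch below uses a Morse
potential with approximate textbook parameters for H2; it is a minimal
illustration of the idea, not Polygraf's force field:

    import math

    # Morse potential: a standard empirical form for the bond-energy curve
    # described above. The constants are approximate textbook values for H2.
    D_e = 4.75    # well depth, eV
    r_e = 0.741   # equilibrium bond length, angstroms
    a   = 1.94    # width of the well, 1/angstrom

    def energy(r):
        """Potential energy of the bond at separation r, in eV."""
        x = math.exp(-a * (r - r_e))
        return D_e * (1.0 - x) ** 2

    def force(r):
        """Force between the nuclei, -dV/dr, in eV/angstrom.
        Positive means repulsive; negative means attractive."""
        x = math.exp(-a * (r - r_e))
        return -2.0 * D_e * a * (1.0 - x) * x

    for r in (0.5, 0.741, 1.0, 2.0):
        print(f"r = {r:5.3f} A  V = {energy(r):6.3f} eV  F = {force(r):7.3f} eV/A")

At r = 0.741 angstroms the force is zero; closer in it turns positive
(repulsive), farther out negative (attractive), exactly the behavior
described above.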
This indifference to the electrons is known more formally as the
Born-Oppenheimer approximation: the nuclei swim in a sea of electrons, but
if all we are concerned with is the positions of the nuclei, then we don't
actually need to know where the electrons are; all that matters is the force
field acting between the nuclei. The electrons disappear from the computation,
and from our thinking, replaced by the force field.
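In equation form (a standard textbook statement, not drawn from the articles
cited here), the electronic problem is solved with the nuclei clamped at
positions R, and the resulting energy surface supplies the forces on the
nuclei:

    \hat{H}_{\mathrm{el}}(\mathbf{r};\mathbf{R}) \, \psi(\mathbf{r};\mathbf{R})
        = E_{\mathrm{el}}(\mathbf{R}) \, \psi(\mathbf{r};\mathbf{R}),
    \qquad
    \mathbf{F}_i = -\nabla_{\mathbf{R}_i} E_{\mathrm{el}}(\mathbf{R})

The nuclei never see the wavefunction directly; only the energy surface
E_el(R), which molecular mechanics replaces with an empirical fit.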
The Polygraf software from BioDesign uses the Born-Oppenheimer approximation
to greatly simplify the problem of modeling the interactions between nuclei.
By using structural data, heats of formation, and vibrational frequencies
determined experimentally for many different compounds, it is possible to
deduce a fairly accurate representation of the force field that must be
acting between the nuclei. A carbon-carbon bond prefers to be a certain
length, while two hydrogens bonded to a single carbon have a certain preferred
angle between them. These and other similar interactions form the building
blocks of the force field. Once this field is known, any structure can be
modeled (with greater or lesser accuracy), whether or not it has already
been synthesized.
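In code, these building blocks are just sums of simple penalty terms. The
following minimal sketch shows a harmonic bond-stretch term and a harmonic
angle-bend term; the functional forms are standard, but the constants are
illustrative stand-ins, not values from Polygraf or any published force field:

    import math

    # Two force-field building blocks. The constants below are illustrative
    # stand-ins, not values from Polygraf or any published force field.
    K_CC  = 310.0                   # C-C stretch stiffness, kcal/mol/A^2
    R0_CC = 1.526                   # preferred C-C bond length, angstroms
    K_HCH = 35.0                    # H-C-H bend stiffness, kcal/mol/rad^2
    TH0   = math.radians(109.5)     # preferred H-C-H angle (tetrahedral)

    def stretch_energy(r):
        """Penalty for a C-C bond stretched or compressed away from R0_CC."""
        return K_CC * (r - R0_CC) ** 2

    def bend_energy(theta):
        """Penalty for an H-C-H angle bent away from TH0 (radians)."""
        return K_HCH * (theta - TH0) ** 2

    # A molecular mechanics program sums thousands of such terms (plus
    # torsion and non-bonded terms, omitted here) and minimizes the total.
    print(stretch_energy(1.60))                 # a slightly stretched bond
    print(bend_energy(math.radians(105.0)))     # a slightly pinched angle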
Empirically derived force fields have been available for many years. The
better ones provide quite good results within the broad range of compounds
they were designed to handle. By using this method, the geometry and interactions
of chemically stable structures (rods within a matrix, a molecular bearing
on a molecular shaft) can be modeled quite accurately.
This method has the great strength that a direct solution of Schrödinger's
wave equation is not required. The empirically derived force field is used
in its stead. This substitution is what allows modeling of structures with
tens of thousands of atoms and more.
Of course, because the force field is based on data derived from chemically
stable structures, it does not provide information about unstable structures
or transition states. For this, it is usual to compute an approximate solution
to Schrödinger's equation (including the electronic structure). This
requires more computational effort, but allows analysis of chemically unstable
species (e.g., free radicals) and transition states where bonds are in the
process of being made or broken.
Taken together, these two methods from computational chemistry can model
the mechanical interactions of large structures with tens of thousands of
atoms, and the chemical interactions of one or two dozen atoms when bonds
are being made and broken. These are precisely the interactions that must
be understood if we are to build complex structures with atomic precision.
As we apply the methods of computational chemistry, a more detailed picture
of molecular manufacturing will emerge: a picture that will shorten the
path from today's limited abilities to the more general abilities of the
future.
Dr. Merkle's interests range from
neurophysiology to computer security; he is a researcher at the Xerox Palo
Alto Research Center.
Position Available: Research Associate in Molecular Nanotechnology
A position will open in April 1991 for a Research Associate to conduct theoretical
research in molecular nanotechnology. This position reports to K. Eric Drexler
and is funded through a grant from a newly-formed research institute.
The Research Associate will:
gather information from the literature,
design and analyze molecular components and devices, performing computational
experiments using molecular modeling software,
co-author papers and articles on molecular devices, molecular manufacturing,
and their applications,
present these results at technical and other meetings.
Candidates must have a good grounding in physics, some substantial familiarity
with chemistry, and an interest in applying these to molecular engineering.
Writing skills and experience using computers are also required. Due to
the multidisciplinary nature of this work, an ability to learn quickly through
independent study is essential.
The successful candidate will have some characteristics in common with those
of the "ideal" candidate, as follows:
Has bachelor's-degree-level (or better) knowledge of both physics and
chemistry, with a strong interest in engineering.
Has published some research articles.
Reads extensively in the science and technology literature.
Is enthusiastic about making a contribution to this field.
For reasons of cultural compatibility, we prefer candidates who routinely
invest more than the traditional forty hours/week in work and study. (Such
a person probably spends little or no time watching television.)
Rewards and Potential Career Path
The Research Associate position should be viewed as somewhat similar to
that of a graduate student; compensation comes in two forms:
unique training in a newly emerging field, and
a living stipend (including any benefits) of slightly over $20,000 per year.
We anticipate that someone who does well in this position will eventually
become an independent researcher, establishing his or her own reputation
as one of the first professionals to move into this emerging field.
Interested applicants should forward one or more of the following: resume,
c.v., copies of published or unpublished research work.
Mail to: K. Eric Drexler, Foresight Institute, P.O. Box 61058, Palo Alto,
CA 94306 USA
The Palo Alto chapter of the Computer Professionals for Social Responsibility
has recently formed a special interest group to explore new technical developments,
social consequences, and potential benefits and dangers of nanotechnology.
Founded by Apple computer scientist Ted
Kaehler, a long-time participant in both CPSR and the Foresight Institute,
the group meets every two weeks to discuss all aspects of the anticipated
technology: implementation methods, applications, and eventual effects on
our lives. The group will soon visit a local vendor of scanning tunneling
and atomic force microscopes. For more information contact Ted at 408-974-6241
or email@example.com. Meeting notices are sent to members of
the Palo Alto chapter of CPSR (you need not be a computer professional to
join), or can be obtained electronically from Ted. CPSR can be reached at
P.O. Box 717, Palo Alto, CA 94301.
In February Eric Drexler gave a plenary
lecture on nanotechnology, titled "Toward 10^15 MIPS"
at the IEEE's Compcon computer conference held in San Francisco, and later
spoke on "Freedom of the Press for the Press of the Future." A
proceedings volume (including the latter paper but not the plenary lecture)
is available from IEEE Computer Society Press, 10662 Los Vaqueros Circle,
PO Box 3014, Los Alamitos, CA 90720-1264; request order number 2134.
Earlier in February he presented the concept to the New Roles in Society
group at the American Association of Retired Persons, which has stimulated
an invitation to speak at an April meeting of this group's steering committee.
In January the MIT Nanotechnology Study Group held an event at which a videotape
on nanotechnology was shown -- recorded at the Microelectronics and Computer
Technology Corporation (MCC) -- followed by a telephone-linked question
and discussion session.
In February, Ralph Merkle of the Computational
Nanotechnology Project at Xerox Palo Alto Research Center spoke on nanotechnology
at the Beckman Institute at the California Institute of Technology, and
at the University of Nevada at Las Vegas. The latter talk was sponsored by
the American Chemical Society chapter and the local office of the
Environmental Protection Agency.
Also in February, Dr. Merkle spoke on the same topic at the Xerox Research
Center of Canada. In earlier months, he gave a well-received talk on silicon
nanotechnology at the Frontiers of Supercomputing II meeting at Los Alamos,
followed by a special evening session held on the topic.