Scanning Tunneling Microscope Techniques
Scanning tunneling microscopes have been used to modify individual molecules.
Their main advantage is that they allow modifications at precisely chosen
locations. Their main disadvantage is that they operate on only one molecule
at a time. The first paper mentioned below describes a technique which may
extend the range of structures that can be built with an STM, while the
second demonstrates the stability of a type of switch that can now be built
with an STM.
An advance in controlling the environment of a scanning tip, which may be
applicable to fabrication, came from J. D. Noll, P. G. Van Patten, M. A.
Nicholson, K. Booksh, and M. L. Myrick, writing in [Rev. Sci. Instrum.
66: 4150-4156 Aug95]. They built a fluid cell for a scanning tunneling
microscope that allowed them to exchange fluids during imaging. The noise
due to fluid flow is sufficiently low that they were able to image graphite
with atomic resolution during flow. From a fabrication viewpoint, their
cell should allow a sequence of liquid phase reactions and scanning probe
controlled workpiece modifications to be applied to a single site. One should,
for instance, be able to adsorb one reagent on a surface, then locally modify
it with the scanning tip, then adsorb a different reagent, then modify the
new layer of the second reagent at the same site, and so on. The stability
of the fluid cell that this group has demonstrated should allow such a sequence
to be followed without needing to relocate the work site after changing
reagents, as would occur with a more disruptive method for changing reagents.
In the cell described in this paper, a reagent injected into the fluid stream
reached the cell approximately 90 seconds later, and peaked in concentration
about 50 seconds after that, setting a time scale of several minutes for
replacing reagents in this design.
An STM-based approach to nanometer-scale circuits is described by D. P.
E. Smith in [Science 269: 371-373 21Jul95]. This paper
describes experiments in switching quantum conduction channels on and off
in an STM experiment. The experiments were performed at 4.6 - 8.6 K, with
contact between a gold ball and a nickel tip (thought to become coated with
gold, producing gold-gold contacts, under experimental conditions). The
key finding is that the contact can form a very stable high conductance
state, with a conductance of 0.977±0.015 (2e²/h). This conductance
"could be stably measured over one position for a period greater than
24 hours and is in good agreement with detailed simulations that find a
stable single-atom contact with a quantized conductance of 0.93±0.05
(2e²/h)." The junction could be repeatedly cycled between
the conducting state and a state with about 15 times as much resistance.
The author observed switching speeds as fast as 10µsec (limited by
sensing electronics), and noted that "Molecular dynamics simulations
have shown that the fastest possible switching time for single atoms in
the point-contact configuration is on the order of 1 psec." The author
also observed switching of a second conductance channel, but much less stably
than the switching of the first channel. Because a long conductance channel
would make the switching of the second channel more distinct, "...the
single-channel constriction probably corresponds to a single atom."
Presumably the reproducible cycling between states of a single channel shows
that one of the bonds to an atom at the channel constriction can be repeatedly
broken and reformed without damaging nearby bonds enough to change the resistance.
In addition "The QPC [quantum point contact] switch displays power
gain and can be used to make an oscillator (by feeding the output signal
back into the control input) or to drive a series of other QPC switches."
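For readers who want the numbers behind the quoted conductances, here is a minimal sketch (Python, not from Smith's paper) that evaluates the conductance quantum 2e²/h and the resistances of the two switch states; the 0.977 factor and the roughly 15-fold resistance ratio are taken from the text above, while the constants and arithmetic are standard.

# Back-of-the-envelope numbers for the quantum point contact (QPC) switch.
e = 1.602176634e-19   # elementary charge, C
h = 6.62607015e-34    # Planck constant, J*s

G0 = 2 * e**2 / h            # conductance quantum, about 77.5 microsiemens
R_on = 1 / (0.977 * G0)      # "on" state: 0.977 * (2e^2/h), about 13.2 kilohm
R_off = 15 * R_on            # "off" state: roughly 15 times the on-state resistance

print("2e^2/h = %.1f uS (1/G0 = %.1f kOhm)" % (G0 * 1e6, 1 / G0 / 1e3))
print("on-state resistance  ~ %.1f kOhm" % (R_on / 1e3))
print("off-state resistance ~ %.0f kOhm" % (R_off / 1e3))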
Single Electron Circuits
Logic devices are necessary components of intelligent manufacturing systems.
The paper described below shows one approach to using small numbers of electrons
to operate electronic circuitry, including logic devices.
W. Rösner, F. Hofmann, T. Vogelsang, and L. Risch write of a technique
for simulating certain classes of single electron circuits in [Microelec.
Eng. 27: 55-58 1995]. They model their circuits as a set of
electrical nodes, tied together by capacitors and tunnel junctions, with
a charge on each node due to a small number of electrons.
Their modeling process calculates the electrostatic voltages on each of
the nodes. For each tunneling junction, it then calculates the change in
energy that would be produced by a single tunneling event through the junction.
Based on this energy change and the resistance of the junction, they then
calculate the rate at which electrons tunnel through it. The actual transition
is then selected with a probability proportional to its rate. Because this
analysis is applied to circuits with very small capacitances, the unfavorable
changes in energy due to the movement of individual electrons are accounted
for, so "Coulomb blockade" effects are properly modeled. Using
their program, the authors have modeled the voltage transfer curve of a
logic inverter, and have also simulated a ring oscillator built with 3 stages
of inverters. This simulated oscillator operated at a frequency of 20 GHz
with a power consumption of 3 nW. While this dissipation is larger than that
possible with fully atomically precise fabrication, it looks like an attractive intermediate target.
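To make the simulation loop described above concrete, the following is a minimal, hypothetical Python sketch of one such Monte Carlo step; it is not the authors' program. It assumes the standard "orthodox theory" rate expression for a tunnel junction, and it uses a deliberately oversimplified single-junction energy model (an electron gains eV by crossing a junction of capacitance C but must pay the charging energy e²/2C), just to show how Coulomb blockade emerges from the bookkeeping.

import math, random

E_CHARGE = 1.602176634e-19      # elementary charge, C
K_B      = 1.380649e-23         # Boltzmann constant, J/K

def orthodox_rate(dF, R, T):
    """Orthodox-theory tunneling rate for one junction.
    dF : free energy *gained* by the tunneling event (J); dF < 0 is unfavorable.
    R  : junction resistance (ohms), assumed large compared with h/e^2.
    T  : temperature (K)."""
    kT = K_B * T
    if abs(dF) < 1e-9 * kT:                  # avoid 0/0 at dF == 0
        return kT / (E_CHARGE**2 * R)
    if dF / kT < -700.0:                     # deeply unfavorable: effectively zero
        return 0.0
    return dF / (E_CHARGE**2 * R * (1.0 - math.exp(-dF / kT)))

def kmc_step(events, T):
    """One kinetic Monte Carlo step: pick a transition with probability
    proportional to its rate, and draw an exponential waiting time.
    events : list of (label, dF, R) candidate tunneling transitions."""
    rates = [orthodox_rate(dF, R, T) for _, dF, R in events]
    total = sum(rates)
    dt = -math.log(1.0 - random.random()) / total
    threshold, running = random.random() * total, 0.0
    for (label, _, _), rate in zip(events, rates):
        running += rate
        if threshold <= running:
            return label, dt
    return events[-1][0], dt

# Toy illustration (hypothetical numbers): a 1 aF, 100 kOhm junction at 4.2 K.
# Below about e/2C (roughly 80 mV here) the forward rate collapses -- the
# Coulomb blockade that the authors' simulator captures for whole circuits.
C, R, T = 1e-18, 100e3, 4.2
charging = E_CHARGE**2 / (2 * C)             # single-electron charging energy
for V in (0.02, 0.2):                        # bias voltage, volts
    events = [("forward",  E_CHARGE * V - charging, R),
              ("backward", -E_CHARGE * V - charging, R)]
    label, dt = kmc_step(events, T)
    print("V = %3.0f mV: forward rate = %.2e /s, first event '%s' after %.1e s"
          % (V * 1e3, orthodox_rate(events[0][1], R, T), label, dt))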
The authors describe some limitations to this modeling technique. It neglects
the energy level structure within the conducting regions that form the nodes
of the circuits. They note that this is acceptable as long as the regions
are at least a few nm in size, which keeps the energy level separations smaller than
the voltages used in the circuits (67 mV was used in the inverter simulation).
In terms of fabrication, this makes the circuits designed with this paradigm
relatively insensitive to the atomic scale details of their conducting regions.
The calculation of the tunneling rates is also approximate, depending only
on the characteristics of a single junction at a time. This is accurate
as long as the junction resistance is large compared to the quantized resistance
h/e² = 25.8 kΩ. In terms of fabrication, this makes the circuit
independent of quantum phase relationships across several circuit elements.
These circuits do need atomically precise control of their tunneling
junctions, since the detailed bond geometry in the junction sets the tunneling
resistance. The circuits might be good targets for a hybrid technique where
the tunnel junctions are built with atomic precision by classical chemistry,
and are then attached to capacitors and interconnections built by STM lithography.
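For reference, the operating regime assumed by this class of models can be written compactly (this summary uses standard single-electron-circuit notation and is not taken from the paper):

\begin{align*}
  E_C = \frac{e^2}{2C} &\gg k_B T
    && \text{(charging energy dominates thermal noise, so Coulomb blockade is resolvable)}\\
  \Delta E_{\mathrm{levels}} &\ll eV
    && \text{(energy-level structure within the nodes is negligible)}\\
  R_T &\gg \frac{h}{e^2} \approx 25.8\ \mathrm{k\Omega}
    && \text{(each junction tunnels incoherently, one electron at a time)}
\end{align*}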
The self-assembled route to nanotechnology requires extending synthetic
structures to larger sizes in order to form working mechanisms. This, in
turn, requires highly specific synthetic steps to maintain tolerable yields
of large, complex structures. These requirements are generally better satisfied
by enzymatic (or catalytic antibody) reactions than by classical chemistry.
Applications in molecular manufacturing have additional requirements. They
also require stiff structures, partially to reduce the effects of thermal
noise on mechanisms and (in the more extreme case of polymers with freely
rotating bonds) to keep the task of predicting folding tractable. In addition,
it is important that there be enough flexibility in the synthesis techniques
to permit the design of a useful range of structures. The papers described
below extend biosynthetic capabilities in these directions.
A new DNA technology that extends the range of metabolic products available
is described by M. Rouhi in [C&EN p9 2Oct95]. ChromaXome
cofounder K. A. Thompson explains "...if you take DNA and paste it
to DNA from another source or from other places in the same bacteria, then
you get a chemical that's built from a combination of pathways..."
From the point of view of nanotechnologists, these might provide building
blocks which are not otherwise available. P. B. Fernandes (a vice president
at Bristol-Myers Squibb) says "The diversity from natural products
is much more than you can buy from synthetic chemistry."
A specific example of which pathways might be usefully combined is explored
by R. McDaniel, S. Ebert-Khosla, D. A. Hopwood, and C. Khosla writing in
[Nature 375: 549-554 15Jun95]. They describe the biosynthesis
of polyketides, polymers of -(C=O)-CH₂-, and their derivatives. The polyketide
backbone, without modification, would have many bonds about which it could
twist, and would not be a particularly good candidate for building stiff
structures. Fortunately, polyketide derivatives include compounds where
the backbone has been cyclized into sets of fused aromatic rings. One of
the new products that the article focuses on (designated SEK26) is a substituted
anthraquinone (a three ring structure). The authors give a set of design
rules for which gene clusters can be combined to give a functional synthetic
pathway. At present, 8 backbones are available, with the prospect that "enzymes
that catalyze downstream cyclizations and late-step modifications, such
as group transfer reactions and oxidoreductions commonly seen in naturally
occurring polyketides, can be studied along the lines presented here and
elsewhere." Hopefully, some of the products of these enzymatic pathways
may include stiff, fused ring structures, valuable for constructing atomically
precise mechanisms, that are not available from classical organic chemistry.
Alternatively, the active sites of the enzymes which synthesize stiff, fused
ring structures may serve as a model for extending an early, protein-based
nanotechnology towards the diamondoid structures that exhibit better mechanical
properties and wider design options.
Turning to a broader range of biochemically catalyzed reactions, P. G. Schultz
and R. A. Lerner, writing in [Science 269: 1835-1842
29Sep95] describe the state of the art in catalytic antibodies. Catalytic
antibodies are raised by provoking an immune response to a compound that
resembles the transition state of a reaction that one wants to accelerate.
In catalysis, the antibody binds to the transition state of the reaction,
stabilizing it, lowering the activation energy of the reaction, and accelerating
the reaction. One of the early reactions to be catalyzed was ester hydrolysis.
The transition state for hydroxyl attack on an ester has a tetrahedral carbon
at the carbonyl position. Analogous phosphonates have sufficiently similar
tetrahedral geometries and charge distributions that antibodies to them
can catalyze hydrolysis of esters.
Schultz and Lerner describe a variety of reactions which are now possible
with catalytic antibodies. One of their examples is the catalysis of a Diels-Alder
addition ("...in its simplest form...the reaction of butadiene and
ethylene to yield cyclohexene..."). This is particularly notable because
no natural enzymes are known which catalyze this reaction. More generally,
Schultz and Lerner describe catalysis of a class of reactions known as pericyclic
reactions, only one of which is known to have an enzyme that catalyzes it.
"These reactions have not only received a great deal of theoretical
and mechanistic attention from chemists, they have also found many applications
in organic synthesis." In addition to the Diels-Alder reaction, they
describe catalysis of the Cope rearrangement, an intramolecular rearrangement
which moves two double bonds and a single bond in a six-atom group.
Another class of reactions that Schultz and Lerner cover is the catalysis
of reactions involving reagents ("cofactors") that are not present
under physiological conditions. The examples that they give are catalysis
of the oxidation of an organic sulfide by periodate and the reduction of
a number of ketones by cyanoborohydride to the corresponding alcohols with
"96% enantiomeric excess." In the corresponding uncatalyzed reaction,
there would be, of course, no selectivity between enantiomers.
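For readers unfamiliar with the term, "enantiomeric excess" can be unpacked with its standard definition (not quoted from the review):

\[
  \mathrm{ee} \;=\; \frac{[\mathrm{major}] - [\mathrm{minor}]}{[\mathrm{major}] + [\mathrm{minor}]} \times 100\%,
  \qquad
  \mathrm{ee} = 96\% \;\Longleftrightarrow\; \text{a } 98{:}2 \text{ ratio of enantiomers.}
\]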
Some reactions that can now be catalyzed have sufficiently unstable transition
states that they do not occur at any appreciable rate in the absence of
catalysis (although undesired side reactions of the same reagents may occur
without catalysis). An example of a cyclization is given where the product
formed in the presence of the catalytic antibody is essentially absent in
the normal reaction products. Another example is given where a cyclization
product is formed in 98% yield where the uncatalyzed reaction yields an
impure mixture of many products. The authors write "These studies will
undoubtedly lead to efforts aimed at larger multiring cyclization reactions."
The facility that these antibodies provide to nanotechnologists is the ability
to produce more selective reactions than classical organic chemistry allows.
Since the antibodies can bind all over the surfaces of their substrates,
they can geometrically orient the reactants, confining them to just one
of a number of potentially competing reaction pathways. This can potentially
allow us to build structures which are unreachable (in reasonable yields)
through classical techniques.
Another approach to controlling stiff, polycyclic structures from a technology
base which initially controls only flexible compounds is to control the
formation of the polycyclic structures in crystal lattices. A. Berman, D.
J. Ahn, A. Lio, M. Salmeron, A. Reichert, and D. Charych, writing in [Science
269: 515-518 28Jul95], describe controlling the crystallization of
calcite with an organic monolayer. Their experiments grew calcite crystals
on an acidic PDA (polydiacetylene) film. Polymerization of the diacetylene
monomer yields long polymer chains within the PDA
monolayer. In these experiments the calcite crystal's a axis was aligned
"parallel to the polymer backbone direction" (within the plane
of the film). By comparison, in previous examples of crystal nucleation
on thin films "The crystal axes in the plane of nucleation do not appear
to be aligned with a structural parameter of the nucleation surface."
The authors state that "the total control exerted over their [the crystals']
orientation is to our knowledge unprecedented in crystal growth at the in
vitro organic-inorganic interface." In order to use these crystals
for atomically precise mechanisms, it would now be desirable to extend this
work to control the absolute position, as well as the orientation, of these
crystals, perhaps by performing the analogous experiments on proteins that
mimic a finite, well-defined area of the PDA membrane.
An important capability in approaching nanotechnology is to be able to evaluate
the results of attempts to build target structures. A three dimensional
reconstruction of the structure of the ribosome at 25Å resolution
is described by J. Frank, J. Zhu, P. Penczek, Y. Li, S. Srivastava, A. Verschoor,
M. Radermacher, R. Grassucci, R. K. Lata, and R. K. Agrawal in [Nature
376: 441-444 3Aug95]. The analysis technique that they used combined
4,300 electron micrographs of separate ribosomes. The computational power
brought to bear on the problem was remarkable. For instance, "The orientation
of each projection was determined individually by matching it to computed
projections of the previous 29Å model..." This analytical technique
may prove useful for determining if some self-assembly step succeeded or
failed. It retains one of the strongest points of x-ray diffraction, the
ability to use information from many molecules, without requiring the ability
to precisely align all of the molecules in a crystal, as x-ray diffraction does.
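As an illustration of the kind of computation involved in that orientation search, here is a schematic Python sketch (not the group's reconstruction software): each experimental particle image is compared against a bank of reference projections computed from the current model, and the best-scoring reference supplies that particle's assumed orientation. Normalized cross-correlation is used here as the similarity score, a common choice for this kind of matching.

import numpy as np

def best_orientation(image, reference_projections):
    """Assign an orientation to one particle image by projection matching.
    image                 : 2-D array, the experimental image of one particle.
    reference_projections : dict mapping orientation labels to 2-D arrays
                            projected from the current 3-D model.
    Returns the label of the reference that correlates best with the image."""
    def normalize(a):
        a = a - a.mean()
        return a / (np.linalg.norm(a) + 1e-12)

    img = normalize(image)
    scores = {label: float(np.sum(img * normalize(proj)))
              for label, proj in reference_projections.items()}
    return max(scores, key=scores.get)

# Tiny synthetic check: a noisy copy of reference "B" should be assigned to "B".
rng = np.random.default_rng(0)
refs = {label: rng.normal(size=(32, 32)) for label in ("A", "B", "C")}
noisy_particle = refs["B"] + 0.5 * rng.normal(size=(32, 32))
print(best_orientation(noisy_particle, refs))      # expected output: B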
The specific results of this group allow them to identify "a channel
in the small ribosomal subunit and a bifurcating tunnel in the large subunit
which may constitute pathways for the incoming message and the nascent polypeptide
chain, respectively." As this model is extended towards molecular precision,
it may allow us to improve the yield of our in vitro peptide synthesis techniques,
or possibly to extend the capabilities of ribosomes in intact organisms.
Either of these developments would broaden the range of proteins available
to us for constructing novel structures.
Jeffrey Soreff is a researcher at IBM with an interest in nanotechnology.
It was a pleasure addressing those of you who attended the Foresight
Conference on Molecular Nanotechnology in mid-November. Your questions,
as usual, were thought-provoking and perceptive. I enjoyed the dialog.
Unfortunately, there was more to discuss than time available for my presentation.
This column will fill in the gaps.
Primarily, you should be aware of a few pending bills that may have additional
impact on emerging technology, such as nanotechnology. The House Bill HR
1733, sponsored by Rep. Carlos J. Moorhead (R-CA) and Patricia Schroeder
(D-CO), proposes to amend the patent term from its present 20 years from
the U.S. priority filing date, to 20 years from the filing date of the issued
patent. This is a subtle, yet significant difference; 20 years from priority means
that the 20-year term may be measured from an earlier patent application
from which the issued patent is derived, whereas 20 years from the filing
date of the issued patent considers only the date on which the application for the issued
patent was filed, independent of when the parent case was filed. To overcome
issues of "submarine patents", this bill includes a mandatory
publication of patent applications at 18 months from the priority filing date.
For example, an applicant files a "parent" application on June
30, 1995, and subsequently files on July 30, 1996, a "continuation"
or a "divisional" application based on the parent case. Under
the existing law (as amended post-GATT) the issued patent would have a term
of 20 years from June 30, 1995, but under the Moorhead/Schroeder Bill, the
patent would have a term of 20 years from July 30, 1996, with a publication
occurring December 30, 1996.
Another House Bill HR 359, sponsored by Rep. Dana Rohrabacher (R-CA) proposes
to change the patent term to the longer of 17 years from issuance or 20
years from the U.S. priority filing date. However, because it does not address
concerns over "submarine patents", it is not expected to succeed.
Another controversial pending House Bill is HR 2235, also sponsored by Moorhead.
That Bill proposes to grant certain rights to prior users of patented technology.
Such a provision exists under most foreign patent systems, and is notably
absent from the U.S. system. The Bill proposes to grant a royalty-free right
to a prior user to continue using the patented invention if the prior user
has been using it in good faith and continues to use the technology only for
the use existing at the time the patent issues. Obviously, this Bill raises
both significant concerns and significant potential benefits for small companies, individual
inventors, and emerging technologies.
This last topic is worth discussing, and I welcome your comments. I will
address this issue, incorporating your comments, in the next column.
Elizabeth Enayati is a patent law specialist with the Palo Alto law firm
Weil, Gotshal & Manges.
Her e-mail address is email@example.com,
or phone (415) 926-6248, or fax (415) 854-3713. She can also be reached
by mail c/o Foresight Institute, P.O. Box 61058, Palo Alto, CA 94306.
"There is something very wrong with American science." This provocative
claim was offered by George A. Keyworth II, White House Science Advisor
under the Reagan Administration, while addressing the House Science Committee
last June. "Preserving the status quo has become the overarching goal,
replacing the pursuit of excellence," he continued. In addition, U.S.
science suffers from "a deeply ingrained lack of [public] accountability"
and "ingratitude for two generations of unparalleled federal largesse.
"A major overhaul is needed." And the restructuring implicit in
the creation of a Department of Science (DOS) "is the only way I know
to restore coherent policies, research dedicated to excellence, and the
public trust," Kenworth told the committee.
Robert S. Walker (R-Pa.), chair of the committee, anticipates submitting a
bill to the U.S. Congress to establish a DOS. He estimates that the coordination
and streamlining of policy made possible by merging many research responsibilities
could, over seven years, save Uncle Sam more than $2 billion and eliminate
more than 5,000 federal jobs.
While the proposal won overwhelming endorsement from most committee members
present - and from all invited witnesses - the June 28th hearing also highlighted
some major issues that must be resolved before the long-term implications
of such a restructuring can be evaluated. Turf battles are expected.
One issue sure to kindle protectionist passions among people currently engaged
in federal research and development (R&D) is how much applied research and
development - the "D" in R&D - a DOS should undertake. Today, nearly 80%
of the federal R&D budget goes to applied research, including technology development.
Some people anticipate a DOS would probably spell an end to the White House
Office of Science and Technology Policy (OSTP), but one of the chief advantages of a
DOS would be to merge science policy making and program implementation into
a single structure. [Science News, 148: 59-60]
U.K. "Technology Foresight"
In the U.K., an ambitious program to harness science, engineering and technology
for increasing the competitiveness of that nation's businesses is now ready
to disseminate and implement a number of recommendations. These recommendations
are based on a massive consultation exercise spanning 15 sectors of British
industry. One of the more concrete proposals is for the creation of a national
institute for applied catalysis.
The program is known as "Technology Foresight," and was one of
several announced in a 1993 document titled "Realizing Our Potential:
A Strategy for Science, Engineering, and Technology." It was the first
major policy review in the area for 20 years. "The program is a new
departure," says John Brophy, research general manager for BP Chemicals,
Sunbury on Thames, England. "It identifies national priorities for
the scientific infrastructure to channel research in the direction of industry."
Brophy is vice chairman of the Technology Foresight Panel on Chemicals.
He explains "Our panel recommends that we retain a large element of
that, but also ensure that the science base, working with industry, is prepared
for and aware of market opportunities when they arise."
Top priority areas include genetic and biomolecular engineering, sensors
and sensory information processing, and environmentally sustainable technology.
Catalysis, chemical and biological synthesis, and materials are regarded
as intermediate priority areas that require further action and development
of "stronger exploitation links." The strategic themes include
the need for a cleaner, more sustainable world, and advances in materials
science, engineering, and technology, with a particular emphasis on multidisciplinary
settings. [C&EN July 3, 1995, pps. 16-17]
"A Visualization Revolution" is how recent advances in computing,
visualization algorithms, and display technology are heralded by R&D
Magazine, in the October 1995 issue. "Virtual environments"
can now be created which enable researchers to make prototypes and conduct
tests and experiments. A virtual prototyping system, with stereoscopic computer-driven
images projected on two walls and the floor of the CAVE (Cave Automatic
Virtual Environment), has been unveiled at the National Center for Supercomputing
Applications at the University of Illinois, Urbana-Champaign.
Researchers at Caterpillar Inc. have developed a virtual reality copy of
Caterpillar's Peoria test track. In the CAVE, one can sit in an instrumented
mockup of an earth-moving machine cab. Listening to lifelike stereo sounds,
the driver maneuvers the computer-driven machine and its earthmoving accessories.
With these simulations, Caterpillar engineers can eliminate the lengthy
process of fabricating the steel castings needed for design studies; changing
the design in the CAVE's software takes only a few hours.
Recent advances in desktop computing power and visualization algorithms
mean that supercomputers are no longer always necessary. An engineer with
Pacific Gas & Electric used a desktop system and graphical analysis
program to study the effects of earthquakes on a building. Using accelerograms
from the magnitude 7.1 Loma Prieta earthquake, the engineer modeled its effects
on a 34-story steel building in San Francisco; the model produced visualizations
of the movement of the 34th floor during the quake. These data allow structural
engineers to realistically evaluate the seismic safety of the building.
The analyses included baseline corrections, integration to obtain velocity
and displacement data, cross-section functions, and fast Fourier transforms
to measure the response to earthquake movements.
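The processing chain listed above can be illustrated with a short, generic Python sketch (not PG&E's actual software): remove the baseline from a recorded accelerogram, integrate twice to obtain velocity and displacement, and take a fast Fourier transform to examine the frequency content. The decaying 2 Hz test signal is an invented stand-in for a real record.

import numpy as np

def process_accelerogram(accel, dt):
    """Minimal accelerogram processing: baseline correction, integration, FFT.
    accel : 1-D array of ground acceleration samples (e.g. m/s^2).
    dt    : sample interval in seconds.
    Returns (velocity, displacement, frequencies, amplitude_spectrum)."""
    # Baseline correction: remove the mean (a simple stand-in for the
    # polynomial baseline fits used in practice).
    a = accel - accel.mean()

    # Integrate acceleration -> velocity -> displacement (trapezoid rule).
    vel = np.concatenate(([0.0], np.cumsum((a[1:] + a[:-1]) / 2.0) * dt))
    disp = np.concatenate(([0.0], np.cumsum((vel[1:] + vel[:-1]) / 2.0) * dt))

    # Fast Fourier transform of the corrected record.
    spectrum = np.abs(np.fft.rfft(a)) * dt
    freqs = np.fft.rfftfreq(len(a), dt)
    return vel, disp, freqs, spectrum

# Synthetic example: a decaying 2 Hz oscillation sampled at 100 Hz for 20 s.
t = np.arange(0, 20, 0.01)
accel = np.exp(-0.2 * t) * np.sin(2 * np.pi * 2.0 * t)
vel, disp, freqs, spec = process_accelerogram(accel, 0.01)
print("peak spectral frequency: %.2f Hz" % freqs[np.argmax(spec)])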
In line with my unbridled enthusiasm for the Foresight Web Project, here
are two important URLs for Update readers.
Color microscopy images now appear on Web pages to add product details
and provide users with application forums. Many of these images would interest
nanophiles. Point your Web browser to Digital Instruments, http://www.di.com, or Topometrix, http://www.topometrix.com.
Dr. Jamie Dinkelacker leads Apple Computer's development of multimedia
authoring tools for science, math, and medical education as Senior Engineer
Scientist, Technical Manager of the East/West Authoring Tools Group within
Apple's Advanced Technology Group.
Modeling Report: Oak Ridge National Laboratory
Dr. Noid Describes Simulation Software to Minnesota Study Group
by Steven C. Vetter
Newly devised computational algorithms make possible more complex and useful
simulations of nanoscale structures, Dr. Donald Noid told the second meeting
of the Minnesota Molecular Nanotechnology (MNT) Study Group. Dr. Noid is
a physical chemist from the Department of Energy's Oak Ridge National Laboratory
and a pioneer of chaos theory.
Recent molecular modeling work by the Oak Ridge National Laboratory trio
of Donald W. Noid, Bobby G. Sumpter and Robert E. Tuzun was also described
in a recent presentation by Drs. Sumpter and Noid at Technology 2005,
held at Chicago's McCormick Place convention center. Sumpter and Noid presented
the trio's paper, "Advancing Manufacturing Through Computational Chemistry,"
which discusses the path toward convergence of nanotechnology and computational chemistry.
In the Minnesota presentation, Dr. Noid described his molecular simulation
software and showed a video of the software simulating several new molecular-scale
mechanisms. His algorithm reduced the computation required from a page full
of equations per atom simulated to just a few lines, he said. This algorithm
makes computational simulation of a many-atom structure possible. Without
it, many of the simulations he has done would be impractical with currently
available computing power.
The new mechanisms simulated and shown on video included:
A molecular steam engine, consisting of two tubes: a large diameter
one, and a small diameter one inside the first, with a flared "knot"
about midway along the length. The knot serves as the piston. The space
between the inner and outer tube is thus divided into two regions. These
regions are filled with water molecules. By heating one end of the larger
tube, the water molecules expand against the knot of the inner tube, driving
the inner tube laterally.
A tube as a conveyor or pipe for smaller-diameter buckyballs being
carried along by a flow of helium atoms. One of the things he was testing with
this simulation was the propensity of the helium atoms to leak past the
buckyballs. He talked about some counter-intuitive results he got with these
experiments, dealing with the leakage as a function of the size and stiffness
of the tube. Stiffer tubes leak more, he said.
Bearing simulations, showing that friction, while different in each
mechanism's details, is very much present at the molecular level. A bearing
consisting of two concentric graphite tubes was given an initial rotation.
It stopped rotating within a few nanoseconds, due to friction.
Another simulation involved bearings being driven by a laser to simulate
a motor. Basically, a small-diameter graphite tube was placed within a larger
graphite tube. The inner tube had a pair of atoms attached at the end that
provided a dipole moment that the laser could couple with. Both one-laser
and two-laser motors were simulated, with a variety of radii and other parameter
values. It was clear from graphing the angular momentum as a function of
time that many of these designs were unreliable. The video of the motor
simulation showed graphite tubes distorting significantly in response to
the force of the driving laser, changing the characteristic dipole length
of the receiver. It will be interesting to see how well this laser-driven
motor works with a diamondoid bearing, instead of the graphite tubes. Possible
solutions to the problem, which remain to be explored, include a three-laser
motor, and simply stiffening the tubes, he said.
Simulation of a diamondoid bearing design known to be faulty in terms
of the stress on the bonds. As expected, the bearing changed shape radically
in response to the energy stored in the overly-strained bonds.
Noid's videos provide a sense of watching actual atoms. We can "see"
atoms, make things out of them, and see if they work. He said his software
makes simulations more realistic by taking into account conservation of
energy. All other known molecular dynamics packages, he said, have systematic
errors under which energy monotonically increases with time.
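The value of an integration scheme that respects conservation of energy can be seen even in a toy system. The sketch below (generic Python, unrelated to Dr. Noid's code) integrates a single harmonic oscillator two ways: with a naive forward-Euler step, whose total energy grows steadily with time, and with the velocity Verlet step common in molecular dynamics, whose energy error stays bounded.

import math

def energy(x, v, k=1.0, m=1.0):
    return 0.5 * m * v * v + 0.5 * k * x * x

def euler_drift(steps=10000, dt=0.01, k=1.0, m=1.0):
    """Forward Euler: the oscillator's energy grows every step, the kind of
    systematic drift that makes long dynamics runs untrustworthy."""
    x, v = 1.0, 0.0
    for _ in range(steps):
        a = -k * x / m
        x, v = x + v * dt, v + a * dt
    return energy(x, v, k, m)

def verlet_drift(steps=10000, dt=0.01, k=1.0, m=1.0):
    """Velocity Verlet: symplectic, so the energy error stays bounded."""
    x, v = 1.0, 0.0
    a = -k * x / m
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt
        a_new = -k * x / m
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
    return energy(x, v, k, m)

print("initial energy                  : %.6f" % energy(1.0, 0.0))
print("after 10^4 Euler steps          : %.6f (drifts upward)" % euler_drift())
print("after 10^4 velocity Verlet steps: %.6f (stays near the true value)" % verlet_drift())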
Dr. Noid believes that simulations of a mechanism must run for a fair amount
of time, several nanoseconds at least. He has seen a structure act in a
very stable fashion for a long period, then exhibit anomalous behavior.
With his software, you can "fine tune" the molecular structure
with successive iterations to reduce or eliminate unwanted behavior, in
the process developing "design rules" which, if followed, will
result in sound mechanisms. With this software, it is as if we could suddenly
make any physically possible molecular structure, and let it run and watch
it work or not. This is enormously powerful.
He cited two problems doing large simulations (thousands of atoms for millions
of simulation cycles). First is the substantial computing power required.
His algorithms provide much more accurate results (especially for long simulation
times), and do it much more efficiently, but the sheer size of the computations
required is still significant. Some of his simulations required as much
as a week on up to four IBM RISC 6000 processors (mini-supercomputers costing
around $60,000 each). Fortunately, his code is extremely vectorizable and parallelizable,
meaning that it can be run on many processors very efficiently. This
allows efficient simulation on even the most massively parallel machines,
such as the Paragon.
Another problem with large simulations is the large amount of data generated.
This problem is not unique to molecular simulations. People have been trying
to deal with simulation results data since the first simulators were used,
several decades ago. He is currently working on ways to "mine"
that data for useful information, using neural nets and other approaches.
Steven C. Vetter (firstname.lastname@example.org)
is a co-founder and President of Molecular Manufacturing Enterprises, Inc.,
9653 Wellington Lane, Saint Paul, MN 55125. MMEI is working with Dr. Noid
to make his software more widely available.