All manufactured products are made from atoms, and their properties depend on how those atoms are put together. Rearrange the atoms in coal and you get diamond; rearrange the atoms in sand, add a few trace elements, and you get electronic chips. In time, it will become possible to connect the fundamental building blocks of nature far more readily. The word "nanotechnology" describes any technology whose characteristic dimensions are less than about 1,000 nanometers. The future will bring new manufacturing processes that allow companies to inexpensively build systems and products that are molecular in both size and precision. Any business, old or new, that is interested in, considering, or already applying this advanced and still relatively unknown technology will have to research its benefits and drawbacks extensively and then, if it chooses to proceed, develop a complete and thorough implementation strategy to improve its chances of future success.
NT uses either top-down processes (lithography) to cut material away from, or add material to, a surface, or bottom-up processes in which NT materials self-assemble into larger structures. Merkle of Xerox Corporation explains that this resembles a process trees perform continually: "Trees grow by taking energy from sunlight and nutrients from the soil to build themselves ... They only use what they need, arranging the atoms in complex internal patterns." Further, "trees also self-replicate: They produce seeds that build other trees. Precisely because it's a miracle of biology, lumber costs only a few dollars per pound." In other words, once there is self-replication, a company has a manufacturing process that is intrinsically low cost (In Fouke 47).
Living systems also use self-assembly, adds Merkle (In Fouke ibid), who calls this "selective stickiness": if two molecular parts have complementary shapes and charge patterns, they will tend to stick together to form a larger part. This will help in building "nanotools," which in turn will construct other things. Molecular-scale positional devices will hold molecules precisely in place, and robotic arms at one ten-millionth of the usual scale, sweeping back and forth over a surface, will add and remove atoms to build structures. Constructing a car would work comparably: first, use nanotools to build an assembler -- a minute, computer-controlled robot that can be programmed to build nearly anything; then program the assemblers to replicate; finally, build the product.
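The "selective stickiness" rule described above -- parts bind only when their shapes and charge patterns are complementary -- can be sketched as a toy model. All names, data structures, and the matching rule here are illustrative inventions, not real chemistry:

```python
def sticks(part_a, part_b):
    """Toy 'selective stickiness': parts bind only when the shape fits
    the cavity and every charge meets an opposite charge."""
    shape_fits = part_a["shape"] == part_b["cavity"]
    charges_attract = all(a == -b for a, b in
                          zip(part_a["charges"], part_b["charges"]))
    return shape_fits and charges_attract

key    = {"shape": "wedge", "cavity": None,    "charges": [+1, -1, +1]}
lock   = {"shape": None,    "cavity": "wedge", "charges": [-1, +1, -1]}
misfit = {"shape": None,    "cavity": "cube",  "charges": [-1, +1, -1]}

print(sticks(key, lock))    # True: complementary parts self-assemble
print(sticks(key, misfit))  # False: wrong shape, no binding
```

The point of the sketch is only that binding is conditional on complementarity, which is what lets self-assembly build specific larger parts rather than random clumps.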
The time for "selective stickiness" is brief: A nanotool can move a molecule into position in a microsecond. With a million operations per second and a billion atoms per assembler, it would take approximately 20 minutes for an assembler to build a copy of itself -- or a billion assemblers in 20 hours. A car can be built in a few hours.
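The arithmetic behind these figures can be checked directly. A minimal sketch, assuming (per the text) a million atom placements per second and a billion atoms per assembler, and idealized doubling in which every finished copy immediately starts replicating:

```python
import math

OPS_PER_SECOND = 1_000_000            # assumed atom placements per second
ATOMS_PER_ASSEMBLER = 1_000_000_000   # assumed atoms in one assembler

# Time for one assembler to place every atom of a copy of itself.
self_copy_seconds = ATOMS_PER_ASSEMBLER / OPS_PER_SECOND
self_copy_minutes = self_copy_seconds / 60   # ~16.7 min, roughly 20 minutes

# Under perfect doubling the population grows 1, 2, 4, 8, ...
doublings = math.ceil(math.log2(1_000_000_000))  # generations to top a billion
hours = doublings * self_copy_minutes / 60

print(f"one copy: ~{self_copy_minutes:.0f} minutes")
print(f"{doublings} doublings -> a billion assemblers in ~{hours:.0f} hours")
```

Under perfect doubling, a billion copies arrive in about 30 generations, i.e. even sooner than the 20 hours quoted above; the essay's figure evidently leaves room for less-than-perfect parallelism.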
The basis of nanotechnology has grown out of several years of research, innovation and enhancements in a number of different fields of science, engineering and manufacturing. Computer circuits are becoming increasingly small and chemicals more complex. Biochemists regularly acquire more knowledge on how to study and control the molecular basis of organisms. Meanwhile, mechanical engineers are continually improving their precision of design and production.
In 1959, Nobel laureate and Caltech physicist Richard Feynman suggested that it should be possible to build machines small enough to manipulate and control things on a small scale. His talk, "There's Plenty of Room at the Bottom," is widely considered to be the foreshadowing of nanotechnology. Among other things, he predicted that information could be stored with amazing density. Despite the fact that the power of computers was just being recognized, he had the foresight to see this as the future:
... I do know that computing machines are very large; they fill rooms. Why can't we make them very small, make them of little wires, little elements -- and by little, I mean little. For instance, the wires should be 10 or 100 atoms in diameter, and the circuits should be a few thousand angstroms across. Everybody who has analyzed the logical theory of computers has come to the conclusion that the possibilities of computers are very interesting -- if they could be made to be more complicated by several orders of magnitude.
He could even imagine how this would be possible:
Up to now, we have been content to dig in the ground to find minerals. We heat them and we do things on a large scale with them, and we hope to get a pure substance with just so much impurity, and so on. But we must always accept some atomic arrangement that nature gives us. We haven't got anything, say, with a 'checkerboard' arrangement, with the impurity atoms exactly arranged 1,000 angstroms apart, or in some other particular pattern. What could we do with layered structures with just the right layers? What would the properties of materials be if we could really arrange the atoms the way we want them?
Another Nobel physicist, Philip W. Anderson, articulated in 1972 the concept of "emergent" properties in complex systems in "More is Different" (Scientific American ix). He noted that the behavior of large and complex aggregations of elementary particles cannot be understood as a simple extrapolation of the properties of a few particles. "Instead, at each level of complexity, entirely new properties appear, and the understanding of the new behaviors requires research which I think is as fundamental in its nature as any other."
In 1981, Eric Drexler of MIT began exploring Feynman's vision in a paper published in the Proceedings of the National Academy of Sciences, where he described the physical principles of molecular manufacturing systems, envisioned nanomachines making products with atomic precision, and introduced what would become molecular manufacturing. He quickly realized that molecular machines could control the chemical manufacture of complex products, including additional manufacturing systems -- a very powerful technology. "Molecular assemblages of atoms can act as solid objects, occupying space and holding a definite shape. Thus, they can act as structural members and moving parts." In the same year, the scanning tunneling microscope, invented by Gerd Binnig and Heinrich Rohrer at IBM's Zurich Research Labs, provided the first direct images of individual atoms.
In 1986, Drexler popularized the term "nanotechnology" in his book Engines of Creation. Norio Taniguchi, in Japan, had used the word in the mid-1970s to describe precision micromachining, and his definition remains current: "Nano-technology mainly consists of the processing of separation, consolidation, and deformation of materials by one atom or one molecule." In 1992, Drexler's Nanosystems outlined in technical detail a way to manufacture extremely high-performance machines out of a molecular carbon lattice, or "diamondoid."
In 1990, IBM painstakingly positioned 35 xenon atoms to spell its three-letter name, creating the world's tiniest company logo. Cornell University scientists later produced a "nanoguitar" too small to be seen by the naked eye. Its strings, only a few atoms across, could be "plucked" by laser beams to play notes 17 octaves above those of a typical guitar -- far beyond the range of human hearing. In 1999, Hewlett-Packard began to assemble circuits one molecule thick; the development was announced in a paper in the journal Science on July 16, 1999.
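Since each octave doubles frequency, "17 octaves above" implies a frequency multiplier of 2^17 = 131,072. A quick sanity check on why that exceeds human hearing (the 82.4 Hz low-E reference note and the 20 kHz hearing limit are my assumptions, not figures from the text):

```python
HEARING_LIMIT_HZ = 20_000   # approximate upper limit of human hearing
LOW_E_HZ = 82.4             # assumed: low E string of a standard guitar

multiplier = 2 ** 17             # each octave doubles the frequency
nano_hz = LOW_E_HZ * multiplier  # ~10.8 MHz

print(f"frequency multiplier: {multiplier:,}x")
print(f"nanoguitar 'low E': ~{nano_hz / 1e6:.1f} MHz "
      f"({nano_hz / HEARING_LIMIT_HZ:,.0f}x the hearing limit)")
```

Scaling any audible guitar note by 2^17 lands in the megahertz range, hundreds of times above the hearing limit, which is why the nanoguitar's "notes" are inaudible in principle, not merely quiet.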
Research into nanotechnology had also long been conducted in biology. Erwin Schrödinger, who received the 1933 Nobel Prize in physics for the discovery of new productive forms of atomic theory, asked in 1944: "What is life?" His concern was not philosophical, but rather how physics and chemistry could account for the events in space and time within the spatial boundary of a living organism (Crandall 18). He looked into the materiality of cellular life, speculating that the theorized gene was able to reproduce itself and use a code to determine the organism's development. Schrödinger thus foresaw the key characteristic of DNA -- its ability to define a set of instructions for the material construction of living forms.
At the same time, John von Neumann published his report on the EDVAC that detailed the basic constructs of the modern computer. Based on the evident similarities between organic systems and logical computation, mathematicians responsible for designing the first electronic computers began studying the information-processing capabilities of both living creatures and engineered automata (Crandall 20-22).
Scientists envisioned that nanotechnology would benefit everything from microprocessing and lab work to agriculture and medicine. If these nanoscience areas were advanced by nanotechnology, society would benefit from the commercial impact in manufacturing and from worldwide competitive advantage. This vision for the future was enhanced under President Clinton's administration, with the National Science Foundation creating a network of institutions researching the field. By 2001, over 30 nanotechnology research centers had been started in the U.S.; fewer than ten existed in 1999. In other countries, total funding for nanotechnology grew from $316 million in 1997 to about $835 million in 2000 (Scientific American).
Since then, increasing uses have been hypothesized for the development and application of nanotechnology, with possible applications ranging from quantum computers to long-term life preservation. These include:
Medicine: Nanotechnology could eradicate disease. Molecule-sized "nanobots" could be programmed to…