Thursday, October 15, 2020

Nanotechnology: Definition And History

The National Nanotechnology Initiative (NNI) defines nanotechnology as science, engineering, and technology research and development conducted at the atomic, molecular, or macromolecular level, in the sub-100 nm range, to create structures, devices, and systems with novel functional properties.

The prefix “nano” comes from the Greek word for dwarf, or something very small. One nanometer (nm) is equal to one-billionth of a meter, or about the width of 6 carbon atoms or 10 water molecules.

The development of nanoscience can be traced to the time of the Greeks and Democritus in the 5th century B.C. Thinkers of that era debated whether matter is continuous, and thus infinitely divisible into smaller pieces, or composed of small, indivisible, and indestructible particles, which scientists now call atoms.

Most modern accounts of the history of nanotechnology begin with Nobel laureate Richard Feynman’s historic 1959 lecture at the California Institute of Technology, titled “There’s Plenty of Room at the Bottom,” in which he outlined the idea of building objects from the bottom up. The lecture was delivered at a meeting of the American Physical Society.

Feynman’s idea did not gain much traction until the mid-1980s, when Eric Drexler published Engines of Creation (1986), a popular treatment of the promises and potential of nanotechnology.

Norio Taniguchi of the Tokyo University of Science was the first to use the term ‘nanotechnology,’ in a 1974 paper presented at an international conference on industrial production. He defined ‘nano-technology’ as mainly consisting of the processing, separation, consolidation, and deformation of materials by one atom or one molecule.