
Feynman: There's Plenty of Room at the Bottom (1959) - MaysonL
http://www.zyvex.com/nanotech/feynman.html
======
jrkelly
A great article. I suspect the shortest route to atomically precise
manufacturing will be through engineering biology, since biology already knows
how to do it. There is even already a high-school competition of the kind
Feynman suggested at the end of that article:
[http://en.wikipedia.org/wiki/International_Genetically_Engin...](http://en.wikipedia.org/wiki/International_Genetically_Engineered_Machine)

~~~
maaku
Biology does not do atomically-precise manufacturing. It does do a lot of
sorting and filtering of random interactions, and in a few places (the
ribosome), precision alignment of large complex basic units (amino acids). It
is very, very unlikely that we will develop Drexlerian nanotech via biological
processes.

~~~
shawkinaw
I wouldn't call amino acids "large complex basic units" at all, they may be
bigger than an atom but 10-20 heavy atoms is hardly complex. Plus many amino
acids themselves are synthesized in vivo.

Moreover, this misses the point somewhat, namely that the resulting peptide
chain folds into a protein with essentially atomic precision. Atoms may
not be placed one by one, but an atomically precise structure is created.

~~~
maaku
When the machinery you want to lay out is composed of carbon atoms placed at
sub-nanometer scale, working in units of 10-20 heavy atoms is like building
with boxing gloves on.

------
IvyMike
For those who think that nanotechnology is too mundane, what about machines
made from nucleons?

[https://en.wikipedia.org/wiki/Femtotechnology](https://en.wikipedia.org/wiki/Femtotechnology)

Greg Egan speculates on femtotechnology in some of his sci-fi, mostly as
insanely fast computers. (I mean, jeez, it takes SO MUCH TIME for an electron
to whirl around the nucleus of an atom... it's so much faster when your
computer _is_ the nucleus.)

An excerpt from an Egan short story:
[http://gregegan.customer.netspace.net.au/SCHILD/00/SchildExc...](http://gregegan.customer.netspace.net.au/SCHILD/00/SchildExcerpt.html)

More on femtocomputing:
[http://hplusmagazine.com/2011/11/01/femtocomputing/](http://hplusmagazine.com/2011/11/01/femtocomputing/)

~~~
maaku
As a physicist and mechanical engineer ... I have absolutely no idea how
femtotechnology is supposed to work. The entire premise seems to be: "quarks
are smaller!" So? You know what you get when you put a bunch of sub-atomic
particles together? Atoms.

I would like very much to see an actual femtocomputer design, one which has a
chance of working (as a commentator in the h+ article points out:
"Unfortunately, this relies on assumptions that contradict known physics and
ones that have not yet been proven."), so I could at least understand what it
is supposed to be...

~~~
evanb
QCD's strong coupling makes it hard to imagine how one could create programs
for quarks with bounded errors. The linked article seems to suggest using
gluons/interactions as gates for the colors on the quarks. This is so totally
bonkers it is hard to know where to start, but perhaps an easy place is: how
would you go about isolating a single quark and a single gluon in the horrible
nonperturbative mess that is the QCD vacuum[0]?

If you want to program the nucleons in the nucleus, you can switch your
description to chiral perturbation theory[1], which is weakly coupled, but you
need to be able to shoot individual pions at individual nuclei. That would be
extremely difficult, and might require enough energy to liberate the target
nucleon from the rest of the nucleus anyway, destroying your "computer".

[0]
[https://en.wikipedia.org/wiki/QCD_vacuum](https://en.wikipedia.org/wiki/QCD_vacuum)

[1]
[https://en.wikipedia.org/wiki/Chiral_perturbation_theory](https://en.wikipedia.org/wiki/Chiral_perturbation_theory)

------
marcosdumay
And we are almost there. People currently print circuits with 18 nm features;
that's about 180 atoms long, or a square with 32 thousand atoms on its
surface. We are still limited to planar designs on silicon, but that's just a
couple of breakthroughs away from 3D printing once we get enough precision.
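
Rough arithmetic, as a quick Python sketch (a sanity check only; the ~0.1 nm
per atom is the spacing those figures imply, while silicon's actual
nearest-neighbor distance is closer to 0.235 nm, which would roughly halve the
count):

```python
# Back-of-envelope: atoms across an 18 nm feature.
# Assumes ~0.1 nm per atom, the spacing the figures above imply;
# silicon's nearest-neighbor distance is closer to 0.235 nm.
feature_nm = 18.0
atom_nm = 0.1

atoms_across = feature_nm / atom_nm
print(f"{atoms_across:.0f} atoms long")                  # 180
print(f"{atoms_across ** 2:,.0f} atoms on the surface")  # 32,400

si_spacing_nm = 0.235
print(f"~{feature_nm / si_spacing_nm:.0f} atoms across in silicon")  # ~77
```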

But this is the talk that launched the idea of molecular nanotech. That part
is still far away (or not, who knows?), but the main line is almost here.

~~~
HCIdivision17
Chips are built differently from your standard 3D printing process: 3D is
additive or subtractive, while chips are mask-deposit-strip (filter, additive,
subtractive?). Keep in mind that the reason we can get to those sizes on
microchips is through mind-boggling optical engineering on the masking step. I
remember a prof explaining in the clean room over one of our Perkin-Elmer
lithography stations that it takes years to redesign the systems for new
wavelengths; the application's precision demands that each optical system be
specifically engineered and tailored to its wavelength.

Get rid of the masking stage and you've got the missing link to 3D printing.
But that would require nigh-magical levels of control over a particle beam.
It could totally be solvable by a sufficiently clever team of engineers and
a sufficiently robust and controllable beam source. But add the fact that
basically nothing behaves itself at those scales, especially machines, and
there's still tons of work left.

But you're right, just as Feynman was all those years ago: you can totally
imagine it's plausible, so there's no way we won't try! (Great, now I want to
go back into microtech...)

~~~
marcosdumay
> Get rid of the masking stage and you've got the missing link to 3D printing.
> But that would require nigh magical levels of control over a particle beam.

Quite like in an electron microscope. OK, most electron microscopes are way
less precise than top-of-the-line lithography, but there are some that are
more. 3D printing with controlled ion deposition would be expensive as hell,
but I see no reason why it could not be done.

Anyway, I was talking about iterating the filter-add-subtract pattern. Our
current low precision is the reason we only iterate a few times. With
increased precision, we can do it more. Yep, any advance here requires years
and a lot of investment. But they always come through.

~~~
HCIdivision17
Now that you mention it, the setup I imagined is dang near identical to an
SEM, but with atoms. All we need to do is replace the filament with a mass
spectrometer's filter and bam! Instant micro-3D printer (where instant is ten
to twenty years of R&D :). I'd be willing to devote a decade of my life to
working on that.

As for the second half, I apologize. I didn't realize you were thinking in
terms of the long now. I think you're completely correct: we will spend the
time and effort to make it better and faster. I have no doubt industry will
come through for us in this. We got to supercomputers in our pockets in less
than one lifetime, after all!

------
petergreen
"Of course, a small automobile would only be useful for the mites to drive
around in, and I suppose our Christian interests don't go that far."

I so want to make and have this tiny car and give mites driving permits omg.

------
T-A
... which led to
[http://e-drexler.com/p/06/00/EOC_Cover.html](http://e-drexler.com/p/06/00/EOC_Cover.html)

------
throwaway_yy2Di
To point out what everyone's overlooking: this was written in 1959, which
means the wildest sci-fi nanotechnology he's imagining here... has, basically,
been achieved by magnetic HDDs since around 2011.

" _Why cannot we write the entire 24 volumes of the Encyclopaedia Brittanica
on the head of a pin? "

"Let's see what would be involved. The head of a pin is a sixteenth of an inch
across. If you magnify it by 25,000 diameters, the area of the head of the pin
is then equal to the area of all the pages of the Encyclopaedia Brittanica.
Therefore, all it is necessary to do is to reduce in size all the writing in
the Encyclopaedia by 25,000 times. Is that possible? The resolving power of
the eye is about 1/120 of an inch – that is roughly the diameter of one of the
little dots on the fine half-tone reproductions in the Encyclopaedia. This,
when you demagnify it by 25,000 times, is still 80 angstroms in diameter – 32
atoms across, in an ordinary metal. In other words, one of those dots still
would contain in its area 1,000 atoms. So, each dot can easily be adjusted in
size as required by the photoengraving, and there is no question that there is
enough room on the head of a pin to put all of the Encyclopaedia Brittanica."_

By his assumption, a pinhead is about 0.003 in^2 (2 mm^2). The current
_Encyclopaedia Brittanica_ has about 300 million English characters [0], which
is about 300 MB in a reasonable text encoding (although it should compress [1]
to around 60 MB). So what Feynman is speculating about translates, in digital
language, to a memory density of 300 MB/(0.003 in^2), or 100 GB/in^2, or 800
Gb/in^2. This is roughly an average magnetic HDD from 2011 [2].
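
For anyone who wants to check those numbers, a minimal Python sketch of the
same arithmetic (it uses only the figures quoted above):

```python
import math

# Reproduce the back-of-envelope numbers above.
pin_diameter_in = 1 / 16                         # "a sixteenth of an inch across"
pin_area_in2 = math.pi * (pin_diameter_in / 2) ** 2
print(f"pinhead area: {pin_area_in2:.4f} in^2")  # ~0.0031 in^2 (~2 mm^2)

# Feynman's dot: the eye resolves ~1/120 inch; demagnify by 25,000.
dot_m = (1 / 120) * 0.0254 / 25_000
print(f"dot diameter: {dot_m * 1e10:.0f} angstroms")  # ~85, he rounds to 80

chars = 300e6                          # ~300 million characters [0]
raw_MB = chars / 1e6                   # ~1 byte per character
compressed_MB = chars * 1.5 / 8 / 1e6  # ~1.5 bits/char entropy estimate [1]
print(f"raw: {raw_MB:.0f} MB, compressed: ~{compressed_MB:.0f} MB")

density_GB_in2 = (raw_MB / 1e3) / pin_area_in2
print(f"density: ~{density_GB_in2:.0f} GB/in^2 = ~{density_GB_in2 * 8:.0f} Gb/in^2")
```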

To emphasize the point: Feynman is speculating about a dot _"80 angstroms [8
nanometers] in diameter – 32 atoms across, in an ordinary metal"_. This is
actually about the size of a magnetic grain on an HDD platter -- Wikipedia
gives it as 10 nm [3].

Unfortunately, there now exists a far larger English-language encyclopedia,
which is 9 GB compressed [4] or 75 GB with images [5]. Using Wikipedia as the
new benchmark, it is currently _not_ possible to fit it on the head of a pin:
at ~100 GB/in^2, a pinhead holds only about 0.3 GB.

[0]
[https://en.wikipedia.org/wiki/Wikipedia:Size_comparisons#Com...](https://en.wikipedia.org/wiki/Wikipedia:Size_comparisons#Comparison_of_encyclopedias)

[1]
[https://en.wikipedia.org/wiki/Entropy_%28information_theory%...](https://en.wikipedia.org/wiki/Entropy_%28information_theory%29#Introduction)
("1.5 bits per character")

[2]
[http://storageconference.org/2013/Papers/2013.Paper.01.pdf](http://storageconference.org/2013/Papers/2013.Paper.01.pdf)

[3]
[https://en.wikipedia.org/wiki/Magnetic_storage#Design](https://en.wikipedia.org/wiki/Magnetic_storage#Design)
("Magnetic grains are typically 10 nm in size...")

[4]
[https://en.wikipedia.org/wiki/Wikipedia:Database_download#En...](https://en.wikipedia.org/wiki/Wikipedia:Database_download#English-language_Wikipedia)

[5]
[http://xowa.sourceforge.net/image_dbs.html](http://xowa.sourceforge.net/image_dbs.html)

~~~
Houshalter
I believe it's only 2 GB without images, 9 GB with, but I may be incorrect.

~~~
throwaway_yy2Di
Check again. The text-only snapshot is 44 GB of XML compressed to a 9.5 GB
.bz2 -- that's just the current text of the English-language wikipedia. And
XOWA's snapshot, with images, is 75 GB compressed.

------
popotamonga
wow, thanks for sharing this!

------
roye
an application of DNA as a building block:
[http://www.youtube.com/watch?v=-5KLTonB3Pg](http://www.youtube.com/watch?v=-5KLTonB3Pg)

