
An advanced nanotube computer may keep Moore’s Law alive - yboris
https://www.technologyreview.com/s/614247/the-worlds-most-advanced-nanotube-computer-may-keep-moores-law-alive/
======
hinkley
We are currently going down a path of chip 'packages' where we put a bunch of
discrete cores and management chips into an enclosure that drops into the
processor slot.

Setting aside computing with photons (we seem to be making more progress with
quantum computing), when are we going to see optical communication hardware
shrink to the point where we can get more bandwidth between these chips than
copper interconnects?

Or does copper have a head start in signal processing that IC optics can't
surmount?

~~~
deepnotderp
The essential problem is that photonics sounds great in principle, but once you
account for all the energy in an actual implementation, you end up behind
electronics on bandwidth and above it on power, until you reach a certain
critical distance where light wins.

------
sverige
Speaking of things I wish were real, whatever happened to the DNA-based
computer I was promised 30 years ago? Or the memristors that would allow
instantly booting back to the state my computer was in when I turned it off
last night?

~~~
ben_w
DNA computers? I think the answer is “we outclassed them”. At 10nm, a
transistor is about the size of 10 base pairs, but much faster and much less
error-prone.

~~~
willis936
I think DNA has semiconductors beat on average information density though. A
tube of DNA can contain a datacenter’s worth of information. Read times are
abysmal and error rate is through the roof, but it’s relatively stable on the
order of hundreds of years.
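For a rough sense of that density claim, here's a back-of-envelope sketch. It assumes 2 bits per base pair and an average mass of ~650 g/mol per double-stranded base pair (both common approximations, not figures from the thread):

```python
import math

# Back-of-envelope DNA storage density.
# Assumptions: 2 bits per base pair (4 bases), ~650 g/mol per
# double-stranded base pair. Ignores all packaging/indexing overhead.
AVOGADRO = 6.022e23      # base pairs per mole
BP_MOLAR_MASS = 650.0    # grams per mole of base pairs (approximate)
BITS_PER_BP = 2

bp_per_gram = AVOGADRO / BP_MOLAR_MASS
bytes_per_gram = bp_per_gram * BITS_PER_BP / 8

print(f"{bytes_per_gram:.2e} bytes per gram")  # ~2.3e20 bytes, i.e. hundreds of exabytes
```

Even with orders of magnitude lost to redundancy and practical encoding, that's comfortably "a datacenter's worth" per tube.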

~~~
patagurbon
>stable on the order of hundreds of years

Just isn’t something in super high demand. That and density are probably the
only major advantages, right? And I don’t imagine the equipment to "read" that
data will be very compact in the foreseeable future.

~~~
willis936
If the demand was high enough I bet there would already be profitable
companies pursuing it. I do see it as a potential market though. It’s no fun
having a cold storage archive that needs constant hardware replacements.

I would imagine reading would be done in a similar way as it is today:
amplification and sequencing. The data would have to carry a hefty amount of
error correction. There are no show stoppers that I’m aware of, just not
enough interest.
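The simplest form of that error correction is just redundancy: sequence many copies and take a per-position majority vote. A toy sketch (assuming aligned, equal-length reads, which glosses over insertions/deletions and the real ECC codes a production pipeline would use):

```python
from collections import Counter

def majority_decode(reads):
    """Recover a sequence by per-position majority vote across noisy reads.

    Assumes all reads are already aligned and the same length -- a big
    simplification; real pipelines handle indels and layer on proper
    error-correcting codes on top of raw redundancy.
    """
    return "".join(
        Counter(column).most_common(1)[0][0]   # most frequent base at this position
        for column in zip(*reads)
    )

# Four noisy reads of the same strand; two have a single-base error.
reads = ["ACGTAC", "ACGTTC", "AGGTAC", "ACGTAC"]
print(majority_decode(reads))  # -> ACGTAC
```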

------
yters
Unless matter is infinitely divisible, the end of Moore's law is inevitable.

~~~
Symmetry
Yes, there's probably a limit to how small you can make a transistor. But
there are far more fundamental limits to computation, such as a lower bound on
how energy-efficient an irreversible boolean operation can be:

[https://en.wikipedia.org/wiki/Landauer%27s_principle](https://en.wikipedia.org/wiki/Landauer%27s_principle)
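Landauer's bound is concrete enough to compute: erasing one bit costs at least kT ln 2. At room temperature that works out to:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact since the 2019 SI redefinition)
T = 300.0            # room temperature, K

# Landauer's principle: minimum energy to erase one bit is k_B * T * ln(2).
landauer_joules = K_B * T * math.log(2)

print(f"{landauer_joules:.2e} J per bit erased")  # ~2.9e-21 J
```

That's around 2.9 zeptojoules per bit, many orders of magnitude below what today's logic dissipates per switching event, so there's still a long runway before this limit bites.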

~~~
deepnotderp
Reversible computing, or dumping the information into spin states, can get
around the Landauer limit. The Margolus-Levitin theorem is more brutal.
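The Margolus-Levitin bound is also easy to put numbers on: a system with average energy E above its ground state can pass through at most 2E / (πħ) orthogonal states per second. Per joule, that is:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s

def max_ops_per_second(energy_joules):
    """Margolus-Levitin theorem: a system with energy E (above ground state)
    can perform at most 2E / (pi * hbar) orthogonal state transitions
    per second -- an upper bound on 'operations', however encoded."""
    return 2 * energy_joules / (math.pi * HBAR)

print(f"{max_ops_per_second(1.0):.2e} ops/s per joule")  # ~6e33
```

Unlike Landauer's bound, this one caps even perfectly reversible computation, which is why it's the more brutal limit.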

------
drallison
I wish that people who are discussing various computer architecture issues
would not invoke Moore's Law without really understanding it.

