

Intel predicts 10GHz chips by 2011 (from 2000) - biotech
http://www.geek.com/articles/chips/intel-predicts-10ghz-chips-by-2011-20000726/

======
simonsarris

        2000 Intel Pentium 4 1.5GHz: Passmark score of 171
        Price of $819 at launch.
    
        2008 Intel Atom Z510 1.10GHz: Passmark score of 186
        Price of $45 at launch.
    
        2010 Intel Core i7 970 3.20GHz: Passmark score of 9954
        Price of $579.99 at launch.
    

GHz used to be a performance mark in itself; now things are more nuanced. I'd
say Intel still did right by their estimate.
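(A quick sketch of that nuance, normalizing the Passmark numbers quoted above per GHz and per launch dollar; the scores and prices are the ones from the comment, used purely for illustration:)

```python
# Back-of-envelope comparison of the Passmark scores quoted above,
# normalized per GHz and per launch dollar. Numbers are taken from
# the comment; treat them as illustrative, not fresh benchmarks.
chips = [
    # (name, clock_ghz, passmark, launch_price_usd)
    ("Pentium 4 (2000)", 1.5, 171, 819.00),
    ("Atom Z510 (2008)", 1.1, 186, 45.00),
    ("Core i7 970 (2010)", 3.2, 9954, 579.99),
]

for name, ghz, score, price in chips:
    print(f"{name}: {score / ghz:.0f} points/GHz, "
          f"{score / price:.2f} points/$")
```

The per-GHz column is the interesting one: the i7's score per clock is over an order of magnitude higher than the Pentium 4's, which is why raw GHz stopped being a useful yardstick.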

------
eru
> Moore's Law states that the number of transistors in a common microprocessor
> will double every 18 months. As well, it can be applied to processor speed
> and many other computing/technology metrics.

That's not Moore's law any longer.

------
phamilton
One guy had his head on straight:

 _As Gropo pointed out, there is in fact a point where Moore's law breaks
down. So far since about 1975 it has held, and it will continue to until about
2004 or 2005. At that point, we run into an actual physical-laws-of-nature
barrier. As silicon-based transistors decrease in size, obviously, all parts
of the transistor have to shrink. This includes the gap at the PN-junction.
Once this gap reaches a certain size (currently estimated at approximately the
width of five silicon atoms), quantum effects (strong force, weak force, et
al) begin to overtake the electromagnetic force that allows the transistor to
transist. In other words… it's no longer a transistor, just a really small
piece of doped silicon that doesn't do much.

That is an absolute, no-way-around-it limit. After that we have only two
choices: More transistors (bigger chips) or new technology.

Adding more transistors has the problem of adding heat, which means slowing
the clock. And there will be a finite maximum for number of transistors as
well…. these things have to operate in sync with each other, and at very high
clock rates, propagation delay becomes an issue… that is, the information
created on one side of the chip cannot be transmitted all the way across the
chip within the space of a single clock cycle. Also, the areas of the chip
near the clock generator will receive their clock pulses sooner than those far
away. If the near-the-clock pieces rely on data produced by the far-from-clock
pieces, your chip is in trouble. This is called “clock skew” and is a major
design consideration for any chip built today… it only gets worse as clock
speeds increase.

The point is, within ten years, we won't be using silicon-based computers.
They'll be made obsolete by DNA/protein type bio-computers or maybe molecular
computers. - by MonkeyMan_

Well.. except for that last part....
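(The propagation-delay point in that quote is easy to sanity-check. A rough sketch, assuming on-chip signals move at about half the speed of light, which is a common rule of thumb; real RC-limited wires are usually slower:)

```python
# Rough check of the clock-skew/propagation argument quoted above.
# Assumes on-chip signals propagate at roughly 0.5c; real wire
# delay is usually worse (RC-limited), so this is an upper bound.
C = 3e8                      # speed of light, m/s
signal_speed = 0.5 * C

for f_ghz in (1, 3, 10):
    cycle_s = 1 / (f_ghz * 1e9)              # one clock period
    reach_cm = signal_speed * cycle_s * 100  # distance per cycle
    print(f"{f_ghz} GHz: cycle = {cycle_s * 1e12:.0f} ps, "
          f"signal reach = {reach_cm:.1f} cm per cycle")
```

At 10 GHz a cycle is 100 ps and even this optimistic signal only covers about 1.5 cm, which is comparable to a die's width, so skew really is a first-order design constraint.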

~~~
sp332
Intel beat that limit with their (newly announced) vertical or "3D" transistor
tech, which has taken about 10 years to develop:
<http://www.anandtech.com/show/4313/intel-announces-first-22nm-3d-trigate-transistors-shipping-in-2h-2011>
and a silly video to go with it <http://www.youtube.com/watch?v=YIkMaQJSyP8>

------
biotech
I like the comments after the article - many people suggesting that Intel is
selling themselves short with this prediction.

~~~
hugh3
The first comment does, but the responses are more prescient:

 _Pullleeeeaze! What makes Moores 'Law' the end all and be all for computer
advancements? It's not like the Law of gravity or thermodynamics for goodness
sakes! Moores 'Law' has to break down at some point. There needs to be a whole
new paradigm of microporcessor technonogy before we get to speeds of 128GB. It
will either be optical or wave based_ (the rest of the thread is filled with
snark about his "GB" typo)

then again,

 _By 2011 we will have implementation of Quantum Processing that will make the
xHZ debate look like the colonists debating over sucession from the UK._

or how about

 _true, clunky beige boxes will be out of style. I was thinking more like 3
in. cube._

(would you settle for a MacBook Air?) Or there's also

 _As Gropo pointed out, there is in fact a point where Moore's law breaks
down. So far since about 1975 it has held, and it will continue to until about
2004 or 2005. At that point, we run into an actual physical-laws-of-nature
barrier. As silicon-based transistors decrease in size, obviously, all parts
of the transistor have to shrink. This includes the gap at the PN-junction.
Once this gap reaches a certain size (currently estimated at approximately the
width of five silicon atoms), quantum effects (strong force, weak force, et
al) begin to overtake the electromagnetic force that allows the transistor to
transist. In other words… it's no longer a transistor, just a really small
piece of doped silicon that doesn't do much_

which is true apart from the bit about 2005, because we're still _very_ far
away from having any components which are only 5 Si atoms across.

 _The point is, within ten years, we won't be using silicon-based computers.
They'll be made obsolete by DNA/protein type bio-computers or maybe molecular
computers_

[takes a drink]

 _Do you know there is something called carbon nanotube? The nanotube is only
1 to 10 nanometer(1/1000 of a micron) in diameter and might be the second best
conductor( next to superconductor) ever know to human beings. And yes, it can
operate under room temperature. Also scientists are working on quantum
computers. Just ask a scientist and he/she will tell you that a quantum
computer will at least 10000 times faster than a current computer. Who cares
about Intel's 10 GHz microprocessor?_

[takes another drink]

------
kvikramg
C'mon, that's not fair. Sure, they haven't delivered on the speed front, but
they overcame it with multicore tech, which in the end means the same thing to
the users.

~~~
hugh3
At the cost of significantly increased work for the programmer, yeah.

I must admit I've lost track. What _is_ the reason that clock speeds have
stalled at 3 GHz? There was some fundamental physics problem happening, but I
forget what it was.

~~~
jxcole
I believe I have read, though please correct me if I'm wrong, that there is a
problem with current lithography methods. The way that microchips are built is
sort of like those overhead projectors we used to have in class: you mask out
certain places, and light shines through the rest. The spots where the light
shines get etched away, and you have your transistor layout.

Only now the features are so close together that the light gets tangled up in
diffraction effects. So there is this scattering effect, and we can't make the
transistors any smaller, because the light is out of focus. The blurry edges
seem to be holding us back.

~~~
hugh3
That's one of the reasons why we're having difficulty making 'em smaller
(though semiconductor engineers are very clever and keep coming up with new
methods on that front), but it doesn't have much to do with making 'em faster.
We've been making 'em smaller and smaller all the time, but clock speed hasn't
gone up since ~2004.

~~~
georgemcbay
"but clock speed hasn't gone up since ~2004."

And yet the latest i7s run circles around the CPUs of that time period, even
when running code that is strictly single-threaded.

There have been tons of performance wins made over the last 10 years that
aren't just from adding more cores and Intel's predictions were more or less
right on the money, if you overlook the marketing mistake common at that time
of using GHz as a shorthand for overall chip performance.
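(The single-thread point can be made concrete: performance is roughly IPC times clock, and the IPC gains are where the wins came from. The IPC figures below are hypothetical round numbers chosen only to illustrate the shape of the argument:)

```python
# Single-thread performance is roughly IPC (instructions per cycle)
# times clock frequency. The IPC values here are hypothetical
# illustrative numbers, not measured figures for any real chip.
def perf(ipc, clock_ghz):
    """Billions of instructions per second."""
    return ipc * clock_ghz

p4 = perf(0.8, 3.2)   # deep-pipeline, low-IPC design at 3.2 GHz
i7 = perf(2.5, 3.2)   # wide out-of-order design at the same clock
print(i7 / p4)        # same clock, ~3x faster single-threaded
```

Same clock, several times the throughput: that is the sense in which the latest i7s "run circles around" the 2004-era chips without any help from extra cores.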

------
ChuckMcM
I find it amusing that they predicted 5GHz chips in 2005, but that pretty much
tanked with their inability to make a chip that could both run above 4GHz and
not desolder its socket from the motherboard.

If you will recall, however, the last time Intel became obsessed with clock
speed it gave AMD a chance to deliver us the Opteron and a 64-bit x86
extension, so by all means I'm egging Intel on! :-)

------
iuguy
To be fair, the best thing about 10GHz chips is that there's 10 of them.

------
crander
Read "The Pentium Chronicles" by Bob Colwell. The Pentium 4 is the time frame
when marketing took over Intel and fed this GHz myth, when many knew it was a
doomed long-term path.

------
pwpwp
An estimate over 11 years that's off by a factor of three (or two, for IBM's
z196 processor at 5 GHz) is good.

~~~
rbanffy
It has four cores. I would count it as a 20 GHz unit.

Seriously, even the "mid 2011: 128 GHz" is not that off: how many cores are in
a gaming-heavy GPU?

~~~
bnegreve
No, you don't go twice as fast with two cars.

~~~
rbanffy
It depends on what you want. If you want to move your furniture from house A
to house B, I grant you I can move it in half the time with twice as many
cars.

If the workload can be divided, multi-core is a solution.

------
zaidrahman
Will 10Ghz be possible in 2020?

~~~
rbanffy
I wouldn't rule that out with that Intel "3D transistor" thingie. Power
dissipation goes down, so the clock can go up. And the device also gets
smaller, so more cores can be put on the same chip.

Not only that, but we are only starting to play with memristors. Silicon still
has some mileage.

~~~
goalieca
I would rule it out. The 3D transistor design buys us just a few percent but
really, we're starting to hit the walls of physics with our current
technology.

~~~
rbanffy
IBM already has 5 GHz chips. Do they know some physics Intel doesn't?

~~~
goalieca
Not really. They do have some cool fabrication tech though. They also run a
different ISA and different pipeline which can make a world of difference.

Really though, I wouldn't expect them to hit 10GHz anytime in the foreseeable
future.

------
lwat
We got to 5GHz; that's pretty close in my book.

