

Nvidia's quad-core Kal-El demos next-gen mobile graphics, blow minds (w/ video) - evangineer
http://www.engadget.com/2011/05/29/nvidias-quad-core-kal-el-used-to-demo-next-gen-mobile-graphics/

======
gvb
_That means the simulations we're watching require a full quartet of
processing cores on top of the 12-core GPU NVIDIA has in Kal-El. Mind-boggling
stuff._

Mind-boggling indeed, but totally absent from the text and demo was any
mention of power usage. Four cores running at 75% utilization plus 12 GPU
cores is going to suck down a tablet battery in a hurry.

~~~
nextparadigms
Not (much) more than a Tegra 2 running at 75% utilization. Each Tegra 3 core
should run at 50-60% of the power consumption of a Tegra 2 core, if history
is any guide. When chip makers work on a new ARM chip generation, they try to
increase performance while maintaining the same _overall_ power consumption
for the whole chip. You won't see an ARM chip with a 1W TDP one year and
1.5-2W the next, when the new generation arrives. That's why we see dual-core
phones now with the same or even longer battery life than previous
single-core phones.
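
A back-of-the-envelope sketch of that reasoning, using made-up wattage
figures (the 50-60% per-core number above is the commenter's estimate, not an
NVIDIA spec):

```python
# Illustrative arithmetic only: if each new-generation core draws roughly
# 55% of the old per-core power, doubling the core count keeps total chip
# power roughly flat. All numbers are assumptions, not published specs.
old_cores, old_per_core_w = 2, 0.5      # hypothetical Tegra 2: 2 cores, 0.5 W each
new_cores = 4                           # hypothetical Tegra 3: 4 cores
new_per_core_w = old_per_core_w * 0.55  # ~55% of the old per-core power

old_total = old_cores * old_per_core_w  # 1.0 W
new_total = new_cores * new_per_core_w  # 1.1 W
print(old_total, new_total)
```

So twice the cores at about half the per-core power lands within ~10% of the
old chip's total, which is the commenter's "same overall power" point.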

Even so, they manage to improve performance about 2.5x every 12 months, and
about 4x every 18 months. That's twice as fast as Moore's Law for x86 chips.
That's why ARM chips keep impressing every year with how fast they progress.

~~~
ebiester
Sorry... must... resist... can't!

Moore's law has nothing to do with speed, only transistor count in the same
space. While the speed increases are impressive, they're still picking off
the low-hanging fruit, as people haven't fully figured out how to optimize
for the space yet. Kal-El will reportedly use 40nm transistors, compared to
the 22nm process Intel is able to do.

Which means there's still plenty of room for the amazing in this arena. :)

~~~
nextparadigms
Intel won't have an Atom at 22nm until late 2013. By then Tegra 5 should be
out, with who knows how many Cortex A15 cores, at 28nm. In 2014, ARM chip
makers will move to 20nm.

Intel's advantage is only about one process node generation, and right now
not even that for Atom; it's only for their highest-end chips. The current
Atom is at 45nm, and it will move to 32nm next year, when ARM chips will be
at 28nm. But this doesn't even matter anyway. Even if Intel were two process
nodes ahead, ARM would still be more efficient.

~~~
sitkack
Intel will stop making bricks (metaphorically), skip 28nm, and go directly to
20nm for Atom. It is their only hope of getting into mobile, and even then it
will be a weak value proposition, as the entire mobile market will be on an
ARM ISA.

Even though the whole dev chain is now slightly more flexible about an ISA
change, Intel will still be x86ed out of the market, only this time they'll
be on the other side of the fence.

Intel won't run on phones, but if they play their cards right, they might run
on tablets and netbooks.

It really does suck to be the 900 lb gorilla.

------
thret
In case anyone is wondering, Kal-El is Superman's Kryptonian name.

------
daimyoyo
My concern with this is heat. If you run a mobile device at 75% utilization,
most of them turn into briquettes; they're impossible to hold. I wonder how
NVIDIA plans to address that.

~~~
sandGorgon
I think the problem space is different.

Power-hungry processes and mobility are usually mutually exclusive, and IMHO
they should be. The win in this case is not mobility, but _portability_.

I envision that we will see Crysis on mobiles in a couple of years, but that
will be in conjunction with a dock, which will not only connect the mobile to
an HD display but also keep feeding it power and optionally cool it. When
removed from the dock, the game would ideally switch to a low-poly mode,
where it no longer exercises the GPU+CPU to the max, thereby saving power.

------
Fester
I really hope that sometime soon someone will introduce next-gen quad-core
batteries to provide enough energy to power mobile beasts like this one.

~~~
yhlasx
Exactly. Almost everything has gotten better by at least a factor of 2, while
batteries haven't seen any major improvements.

~~~
dedward
Batteries have seen continuous improvement for ages.

~~~
Fester
Be that as it may, they are still quite far behind the progress portable
devices have made in terms of power consumption.

------
yason
Funnily enough, my Intel Core i7 (two cores, each with hyperthreading, so
four logical cores) couldn't play the 720p Flash video on YouTube smoothly...

Impressive stuff! The thing I paid most attention to was that they had
figured out at least some way to use those ARM cores. Getting four cores
running at 75% (no idea whether they were throttled at the time) means
there's some serious work being done. The usual mobile graphics benchmarks
don't stress the CPU side as much as the GPU side, so Tegra 3 wouldn't gain
as much by running existing demos.

~~~
nightlifelover
It can; you are doing something wrong.

~~~
philjackson
Yeah, using Flash to play the video.

------
UtestMe
I bet Google will come up with Chrome for phones and let Nvidia run fully on
its new OS. I can foresee Android killing more than 2 of Kal-El's 4 cores...

~~~
nextparadigms
I was disappointed when Google made the same mistake it made with Google TV
set-top boxes and put Atom chips inside the first Chromebooks, which led to
expensive units.

I hope we see Tegra 3 Chromebooks as soon as the chip can ship in products
(August). That should make Chromebooks significantly cheaper, with much
better battery life (and thinner).

~~~
nl
What are you talking about? Atom isn't any more expensive than Tegra 2/3,
nor is its physical size any bigger.

------
nightlifelover
Why did Apple break up with Nvidia..? Now they have to face the competition.

~~~
xutopia
What do you mean? They still use Nvidia chips on Macbook and Macbook Air.

~~~
kapitalx
He means on the iPhone and iPad. Nvidia acquired PortalPlayer, whose chips
were used in the iPods, but Apple decided not to use them anymore and
developed its own chip. Meanwhile, PortalPlayer's chips eventually became
Tegra.

Also, due to a lawsuit that was recently settled with Intel, Nvidia was
unable to provide Nehalem chipsets to Apple, meaning Apple ended up shipping
Nvidia's Core 2 Duo chipset. But that won't last forever, and Apple will
eventually need a different partner (ATI). Nvidia announced earlier this year
that they are developing their own ARM-based CPUs for desktops/notebooks;
who knows, maybe that will end up in Apple's notebook line.

~~~
tobylane
Or Sandy Bridge and a compatible graphics card.

~~~
kapitalx
That's the bet Nvidia made when they lost the chipset business. I'd say they
are more interested in their own CPU, especially now that Win8 is compatible
with it.

~~~
jc-denton
Sure, and as you can see they are already quite fast :) It's nice to see a
new competitor in the CPU business.

------
jc-denton
For those wondering whether more cores mean more power consumption:
[http://superuser.com/questions/163567/why-does-the-heat-
prod...](http://superuser.com/questions/163567/why-does-the-heat-production-
increase-as-the-clockrate-of-a-cpu-increases)
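
The short version of what that link gets at: to first order, dynamic CPU
power scales as P ~ C * V^2 * f, and higher clocks usually require higher
voltage, so power (and heat) grows faster than the clock. A toy illustration
with made-up capacitance/voltage/frequency numbers:

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# All values below are illustrative, not real chip specs.
def dynamic_power(c_farads, volts, freq_hz):
    return c_farads * volts ** 2 * freq_hz

base = dynamic_power(1e-9, 1.0, 1.0e9)    # 1.0 GHz at 1.0 V
faster = dynamic_power(1e-9, 1.2, 1.5e9)  # 1.5 GHz with a voltage bump to 1.2 V
print(faster / base)                      # ~2.16x the power for 1.5x the clock
```

Adding cores at a fixed clock avoids the V^2 term, which is why four slower
cores can be cheaper, power-wise, than one much faster one.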

------
nagnatron
Does it run Crysis?

------
navs
That name bugs me. Made worse by the fact that I just re-read Crisis on
Infinite Earths, so my mind is full of DC goodness.

Be sure not to use this under a red sun.

