
Nvidia officially unveils next-generation Tegra 4 SoC - Garbage
http://arstechnica.com/gadgets/2013/01/nvidia-officially-unveils-next-generation-tegra-4-soc/
======
jpxxx
That was effortlessly the worst keynote I have ever sat through, with the
added bonus of a grotesque "gamer babe" interlude that saw a married
49-year-old CEO pretending to hit on a model at a tiki bar. Rancid 1993 puke.

All that said, their Shield handheld console looks pretty spectacular, if
vaporous.

5" ~retinesque~ touchscreen clamshelled onto a full-size game controller with
all the bells and whistles. Outrageously powerful, has HDMI, USB, headphones,
and holds a 5-10 hour charge. Runs pure Android with full Google Play access
and a specially curated library built for the console itself. Can also
wirelessly stream games off of a PC, including from Steam Big Picture. Cloud
storage of game state comes free.

Not announced: price, storage, availability, who's making it, sales channels,
or anything. So whatever. But this -is- the Sony PSP slayer, and it is another
sign that all roads lead to Android+OpenGL ES on ARM as the future default
game platform.

~~~
ulyssesgrant
Haha, I agree that the keynote was at times extremely awkward and hard to sit
through, but I have to say I enjoyed how casual it was. Compared to something
like an Apple presentation, where it seems like there's 0 room for error or
for tangents, this felt more "real".

~~~
jpxxx
It was thoroughly amateur. The host/CEO was leaden, the pacing was all over
the place, anything involving women was grotesque, it was full of meaningless
and half-hearted digressions, nothing shown was placed into an industry
context, and their livestream offered an unskippable chat box that was mostly
"U FAGGOTS" and "BRING THAT GIRL BACK".

Demo example: "Here are the guys who made the game Hawken." Who? What company?
What is Hawken? What does this game mean for the industry? How is it showing
off your product in the best light? What does this mean for developers?

No, none of that gets answered. Two cameras drill in through shitty sightlines
at three anonymous men playing a game in silence while Jen-Hsun Huang asks
things like "where are you guys at right now?" and "so what are you doing?"

------
OmarIsmail
While a somewhat valiant effort from a specs/control perspective, I think this
is going to be a pretty big flop, primarily because it doesn't have much of a
market, which means they won't be able to overcome the chicken-and-egg issue.

There's a reason the main gaming console manufacturers also have large
development studios. Every new gaming device needs a killer app to launch it
past the critical mass where it becomes economically beneficial for outside
(3rd-party) developers to release software for the platform.

nVidia doesn't have game studios (none that they've announced, anyway), so
they're going to be relying on the broader pool of Android game developers.
However, Android developers are going to be targeting the hundreds of millions
of Android phones/tablets, not an unknown number of Shield owners.

So if very few developers are going to commit the resources to make games
that take advantage of the Shield, then how many gamers are going to want to
spend the kind of money this will cost when there are MANY other great
alternatives? (E3 this year will see the announcement of the PS4 and the next
Xbox.)

If the hardcore gamers aren't going to be purchasing this system, then that
leaves the so-called casual gamers. However, by definition this thing is
targeting hardcore gamers so it doesn't have much appeal to casual gamers at
all. Which means there's a very narrow market segment that is seriously
interested in this.

~~~
cmccabe
I think you read a different article than I did. The Ars Technica article
doesn't mention the Shield at all, just the Tegra 4.

Anyway, it sounds like NVidia is putting out a device to show people what the
Tegra 4 can do. They have to build prototype devices anyway; they might as
well sell a few of them while they're at it. They're not expecting to create
their own platform: the Shield is just useful for basically acting as a
controller for your PC or as a box for playing Android games.

There's more to the market than "hardcore gamers" (how hardcore can you be
using a portable device anyway?) There's the Windows 8 tablet and Android
tablet markets, which NVidia would very much like to win. I also would not be
greatly surprised to see an SoC like this in Steam's new box, either.

~~~
OmarIsmail
Sorry, I got my headlines/articles mixed up.

The Tegra4 does indeed look quite stellar!

------
ricardobeat
Real-time HDR is great! Hope everyone gets on that bandwagon.

At some point HDR processing should become part of the hardware pipeline in
every sensor, and we'll simply have a selectable dynamic range.

~~~
frozenport
You made my postmodern sense tingle!

I have noticed more and more cameras implementing automatic color correction
resulting in orange people and color changes while filming the same scene. I
think HDR looks unnatural.

~~~
ricardobeat
HDR can look as natural as you want. You're probably thinking of burnt-out
crazy over-saturated flickr shots like this:

[http://www.flickr.com/photos/yury-prokopenko/3561920871/ligh...](http://www.flickr.com/photos/yury-prokopenko/3561920871/lightbox/)

Or this (beyond terrible):

<http://www.flickr.com/photos/nik-on/4624961812/>

But that's the photographer's fault. HDR is just a way of compensating for
lack of dynamic range in a sensor. Used correctly it should make the image
_more_ real, not less, by bringing it closer to the human eye's full dynamic
range. Examples:

<http://www.flickr.com/photos/margall69/7496881548/lightbox/>

[http://www.flickr.com/photos/frankspecht/4954970921/lightbox...](http://www.flickr.com/photos/frankspecht/4954970921/lightbox/)
(snow and sky would be completely blown out without HDR)

[http://www.flickr.com/photos/michaelgcumming/4525129653/ligh...](http://www.flickr.com/photos/michaelgcumming/4525129653/lightbox/)

~~~
UnoriginalGuy
I'm actually a photographer and I like the over-the-top HDR shots (even the
example linked). It really depends on personal taste, and what it is you're
after.

A lot of people dismiss them because they're "unrealistic" (which they
obviously are), but I ask you: is an oil painting unrealistic? For example, this:

[http://snapzlife.com/wp-content/uploads/2013/01/Oil-Painting...](http://snapzlife.com/wp-content/uploads/2013/01/Oil-Painting5.jpg)

Clearly that is a painting, but if exactly the same picture were produced
using HDR, would everyone dismiss it as terrible?

My point is that you can use HDR to get realistic-looking pictures, but you
can also use HDR for "artistic" reasons, like painting with your camera. A lot
of photographers are dismissive of the latter because they want to pretend
photography is a practical rather than an artistic endeavour.

~~~
andrewcooke
Wouldn't most people also think that was a pretty bad painting?

~~~
jjkmk
Not if you live in Wisconsin.

------
polshaw
_> In a side-by-side Web page loading test with Google's Nexus 10... Tegra
4-based prototype loaded a set of Web pages nearly twice as quickly... the
Tegra tablet appeared to be running the stock Android browser, however, while
the Nexus 10 was running Google Chrome_

Scumbag Nvidia? I doubt that either browser makes significant use of more
than 2 threads, leaving most of the performance difference not accounted for
by the 200MHz clock increase explained by the different browsers.

~~~
mtgx
Because Chrome for Android is in a pretty terrible state right now; running
the test in Chrome might have made the chip look even worse than the iPad 4.
Chrome for Android does a "good job" of making some of the most powerful chips
on the market look mediocre. I really resent Google for this, because it is
embarrassing a lot of chip makers and device makers. But whatever - they'll
probably fix it in Android 5.0 (I hope).

So Tegra 4 probably shouldn't be "2x faster" than the Nexus 10, but more like
1.5x-1.8x faster, depending on how much more they optimized the stock browser
compared to Chrome.

------
est
I am surprised no one mentioned the i500 SDR. Maybe it's hackable: run
GNU Radio on it and BAM, Tegra 4 becomes the ultimate portable software radio,
with 20MHz of bandwidth, both receive and transmit, and ARM NEON/VFP plus
GeForce GPU power to process the data. It's like a wet dream.

------
dharma1
Some cool stuff here. We've been shooting with RED Epic for a while now, and
the extended dynamic range of the HDR-X mode is really nice. It's about 18
stops of dynamic range. I can pull almost anything out of high contrast scenes
in post-production - we don't have to wait for the magic hour anymore and can
generally wave goodbye to blown out skies. Check out londonhelicam.co.uk -
there are a couple of HDR video examples in the reel.

I wonder if this Tegra 4 chip will have enough horsepower to encode the
extended dynamic range of HDR video into a usable 16-bit/32-bit video format
at good-quality bitrates. What codec would it use? H.265?

Sony has the best dynamic range in consumer sensors; I hope there will be
products combining all of this into a truly usable, user-configurable package
- either in a phone or a consumer camera.

~~~
mtgx
VP8 and H.264. H.265 and VP9 probably won't be ready until late this year or
next year.

------
cageface
It's really a pity that Android makes it so difficult to get down to the metal
on this hardware. If I want to take advantage of Neon intrinsics I have to use
the NDK and write them by hand. On iOS I just use the Accelerate framework
directly.

~~~
buster
You have to have some limitations; it's not like every Android device can do
NEON.

~~~
cageface
Sure, but that doesn't mean there couldn't be fewer hoops to jump through for
the many devices that do. I have an iOS app that does realtime audio FFT
without even breaking a sweat. I've considered porting it to Android, but the
degree of hassle puts me off.

~~~
buster
Sure, it could be made easier, but I wonder if it's this way on purpose.
Android encourages developers to stay away from the NDK unless it's really
necessary. Making it easier to build incompatible (i.e. NEON-only) apps would
surely mean more devs building incompatible apps even when it's not necessary,
hurting the Android ecosystem in general with more fragmentation.

That doesn't mean there shouldn't be some helpers in the SDK along the lines
of "if NEON is supported, execute <optimized code path/ARM assembly>, else
<slow stuff>".

~~~
cageface
Perhaps but the unfortunate consequence is that there are entire niches of
apps that flourish on iOS that don't really even exist on Android. Android is
a great utilitarian OS but the really exciting stuff on mobile (IMO) needs to
push the hardware hard.

~~~
buster
What niche apps would that be?

I mean, the same statement can be made about Android widget apps, homescreen
apps, and whatever else is not possible on iOS. I doubt the number of your
niche apps is more than a fraction of those :P

P.S.: My point is that there will always be differences in the available apps
simply because the APIs are not the same. I am wondering, though, what apps
can be done on iOS that can't be done on Android.

~~~
chipsy
Practically the entire audio field (recording, synthesis, processing,
sequencing, etc.) has stayed away from Android because latency is far too high
on average, ruining the UX; it also varies widely between manufacturers, which
doesn't help matters.

To some extent this also impacts all games, since high-latency playback hurts
the experience.

~~~
buster
I am wondering how true this still is with Android 4.1+, because audio
latency was one of the major goals for Android 4.1 (and more so in 4.2), which
was supposed to bring audio latency on par with iOS.

See: <https://developer.android.com/about/versions/jelly-bean.html> (Section
is called Low-latency Audio).

edit: So even if said apps were not possible in the past, now would be a very
good time to port your app (or those apps in general), because this is a niche
that is not yet filled and there is money to be made by being #1 in it :)

~~~
cageface
<http://code.google.com/p/android/issues/detail?id=3434>

There's some hope for 4.2 but needless to say there's no point targeting that
market yet. _None_ of the devices currently on the market have acceptable
audio latency. And Android still has no MIDI support.

Android has its strengths but it can't hold a candle to iOS for these kinds of
applications.

~~~
buster
Mh, I just tested on my phone (
[https://play.google.com/store/apps/details?id=de.darkbloodst...](https://play.google.com/store/apps/details?id=de.darkbloodstudios.dubstepdrumpads)
) and I don't see/hear any latency, certainly nothing above 100ms, so I don't
think your remark "NONE of the devices currently on the market..." is true. My
Galaxy Nexus is over a year old and works fine. At least I suppose this app
would show what you think is not working.

~~~
cageface
Latency for pro audio apps should be around 10ms. No Android phone gets close
to this, but all iOS devices do. You can use the free Caustic app on Android
to measure your exact audio latency, and then you will understand why none of
the pro audio app makers bother with Android despite the large install base.

~~~
buster
I see, I have no clue about "pro" music stuff, so let's hope they fix that...
(it says 40ms on my phone).

------
martythemaniak
Between Project Shield and the OUYA, it should be a great year for Android
gaming.

~~~
chucknelson
I feel like the OUYA is a bit out of luck considering how fast the mobile
landscape is moving. Tegra 3 is being surpassed (at least technically) before
the OUYA launches, assuming the OUYA even hits its "early 2013" release.

As mentioned here and elsewhere, maybe development will cater to the lowest
common denominator as far as SoCs go and the OUYA will be OK, but I feel like
the OUYA's audience won't particularly like being on a last-gen mobile SoC.

~~~
mtgx
I think OUYA would be smart to re-target it at kids under 14, and not really
at the more hardcore gamers - at least until they release a version with the
latest and most powerful ARM SoC.

------
zanny
Can't wait to see that chip in the Nexus 7 2, or however they brand it. Or
maybe a Nexus 10 2? A15 cores are such a noticeable boost over A9.

I'm really hoping, once this thing gets benchmarked, that Nvidia finally hits
it out of the park on the graphics side. For being the foremost GPU company of
the last decade, they sure screwed up ULP GeForce with its 7000-series GPU
design from a decade ago. I think these are Kepler cores, and those have
proven themselves fantastic on the desktop.

------
shmerl
Are they going to start supporting VP8 hardware decoding there? It's not
available in Tegra 3, at least with their Linux for Tegra release.

------
hayksaakian
This will also be great for emulators.

------
jitl
Nvidia will save us from Intel in the long run. Intel just doesn't get multi-
core processors.

~~~
ChuckMcM
Ok, that made me chuckle. Intel absolutely understands multi-core; what they
don't yet 'get' is third-party access to the inner workings. It's interesting
to watch nVidia because they have been on the receiving end of that sort of
squeeze (front-side bus patents and all that), so they tend to lean toward
that as a strategy, whereas other ARM processor houses don't seem quite so
focused on locking down all of the silicon around them.

The really smart bit here though is the LTE capability. If nVidia goes 'all
in' on building integrated CPU/GPU/Wireless cores then that puts even more
pressure on Intel to integrate or leave the market. Intel has not had a
stellar track record with regard to wireless unfortunately.

~~~
wmf
I would say Intel/Infineon is ahead of Nvidia/Icera. At least the Infineon
baseband has had some design wins.

~~~
ChuckMcM
That's an excellent point. Makes me wonder if there is an AMD ARM part with a
Radeon GPU and some wireless implementation in our future. Seems like all the
cool kids are building this particular kind of chip. It almost feels like the
old 8080 days, when everyone had kinda-sorta the same 8-bit MCU to throw at
the emerging PC market.

It is certainly going to be an _interesting_ decade.

------
mariusmg
The "72 cores" is marketing drivel. The stream processors found in GPUs are
not general-purpose and certainly not equivalent to a CPU core.

This number-of-cores talk is starting to remind me of the gigahertz race
between Intel, AMD, and Cyrix back in the day.

~~~
willvarfar
Take a look at the very first image right at the top of the article.

I never saw them mixing up the GPU and CPU cores. They have 4 CPU cores, and
they kept on saying so.

~~~
zanny
Well, 5 CPU cores, though one is invisible to the OS.

