
Spec analysis: XBox720 vs PS4 - Strom
http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis
======
acgourley
Why does the Xbox have HDMI in? My only guess is that they're getting into the
business of HDMI switching, so the Xbox can sit between any DVRs or other media
boxes you have and the TV. If they do this, they might have an interesting
remote control to unveil (either physical or app-layer).

~~~
roc
Would it be worthwhile just to get away from a shared bus (USB) for the
Kinect->Console connection?

~~~
arscan
There is a dedicated Kinect In, though.

It seems more likely that they're trying to push the Xbox toward being a media
hub, and to get people used to always consuming media through their Xbox.

------
Expez
Given how common multi-platform releases have become I think it makes sense to
just get behind x86.

Sony had quite a bit more powerful hardware in the last generation, but when I
had the opportunity to compare games side by side I didn't notice a difference
large enough to prefer one console over the other. Most likely the extra
headache for developers and the higher cost of developing their own hardware
(in partnership with IBM last time) are leading Sony in a different direction
this time around.

~~~
electrograv
The Xbox 360 is actually _dramatically_ faster than the PS3 [1] where it really
matters: the GPU.

    
    
    Triangle Setup
    Xbox 360 - 500 Million Triangles/sec
    PS3      - 250 Million Triangles/sec

    Pixel Shader Processing with 16 Filtered Texels Per Cycle (Pixel ALU x Clock)
    Xbox 360 - 24.0 Billion Pixels/sec
    PS3      - 16.0 Billion Pixels/sec
    

The only reason people think (to some extent correctly) that the PS3 is more
powerful than the Xbox 360 is the PS3's Cell processor. If you sum up the
PS3's total FLOPS, the Cell's SPEs do give it an overall advantage in FLOPS.
But this is very misleading: FLOPS are only useful if the architecture lets
you apply them to what you actually need.

Almost nobody used the SPEs to their potential initially, and for very good
reason [2]. The bottom line is that they were incredibly difficult to program
in any way useful for graphics. More than that, they are limited in very
specific ways [3]: they won't increase triangle throughput, and they don't
directly increase pixel throughput.

In some extremely rare cases, game engines found a way to use them for
graphics (like Battlefield 3's Frostbite 2 engine, for its deferred lighting
passes). But by the time engineers found a way to leverage this extremely
complex architecture, it was already too late. If you actually look at most
Xbox 360 vs. PS3 games side by side, the Xbox 360 often looks much better.

In this case, though (the next-gen Xbox and PlayStation), the situation is
reversed: Sony's GPU is legitimately faster, and by a large margin. Not only
that, but having a full 4GB of fast RAM is a HUGE advantage for Sony. High-
performance 3D rendering is inherently a bandwidth hog, because everything on
the screen _must_ be pushed through the pipes every single frame. A fast 32MB
cache doesn't really help if you want to consistently and smoothly render more
than 32MB of content. And just as the last-generation PS3 had an overly
complex architecture that made it hard to reach its full potential, it seems
the Xbox and PlayStation are swapping roles there this generation as well.
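
A quick back-of-the-envelope makes the point (the buffer layout below is
purely illustrative, not an actual Durango or Orbis configuration):

    // Rough per-frame bandwidth estimate for 1080p rendering. All sizes
    // here are illustrative assumptions, not leaked console specs.
    #include <cstdio>

    int main() {
        const double pixels          = 1920.0 * 1080.0; // 1080p
        const double bytes_per_pixel = 16;              // one fat render target
        const double targets         = 4;               // color, normals, depth, ...
        const double overdraw        = 2;               // each pixel touched ~2x
        const double fps             = 60;

        double frame_bytes = pixels * bytes_per_pixel * targets * overdraw;
        printf("render-target working set: %.0f MB/frame\n", frame_bytes / 1e6);
        printf("bandwidth for targets alone: %.1f GB/s\n", frame_bytes * fps / 1e9);
        // Note: even ONE 16-byte/pixel 1080p target is ~33MB, already larger
        // than a 32MB on-chip buffer -- before textures and geometry.
        return 0;
    }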

[1] http://forums.gametrailers.com/viewtopic.php?f=23&t=794700

[2] http://www.videogamer.com/ps3/saints_row_2/news/two_or_three_years_till_ps3_graphics_better_than_360.html

[3] http://stackoverflow.com/questions/1355827/what-does-programming-for-ps3s-cell-processor-entail

~~~
pandaman
>Almost nobody used "cell processors", and for very good reason.

Seeing that you edited in a citation from a Volition AP, I guess that if you
trust sources like that, there's no argument possible.

>In some extremely rare cases, some game engines found a way to use them for
graphics (like Battlefield 3's Frostbite 2 engine towards the deferred
lighting passes). But by the time engineers found a way to leverage this
extremely complex architecture, it was already too late.

You mean like 2007 Uncharted?

~~~
jerf
It's too late in this cycle to be a fanboy. It is very well established that
the Cell's SPEs are bandwidth-starved and hard to feed. Not _impossible_, but
legitimately hard. This is an architectural fact, not something to be debated
anymore.

The PS3's internal architecture was poorly balanced. This really hasn't been a
secret for a long time, for anyone who can take off the hype goggles. It
sounds like the PS4 will not have this problem. It will be interesting to see
if Microsoft makes the same mistake; it _really_ won't be a console-gaming win
to stick 8GB of slower RAM in the box, then suck away 2 or 3 GB for other
purposes. (Probably still won't be as poorly balanced as the PS3 though.)

~~~
teamonkey
Each console has its strengths and weaknesses. Working on multiplatform games,
the 360 version is, more often than not, CPU-bound while the PS3 is very often
GPU-bound. But in reality there's very little in it. They're both horribly
memory-bound and the 360's Achilles' heel is actually the lack of disc space.

But look at something like Halo 4. It's had all of Microsoft's drive behind
it. Pretty, yes. Significantly better than anything the PS3 could do? Not
really, no.

------
codex
It's a shame the PS3's Cell failed so badly. In terms of theoretical general-
purpose performance per transistor it was a great design, but it was stuck in
a no-man's-land between general-purpose CPUs and GPUs, each more specialized.
However, you can see some of the Cell's ideas live on in the Xeon Phi and
elsewhere.

~~~
jfb
Eh. It failed because it didn't solve problems devs actually had, and made
many formerly easy things harder. Most technical decisions made for political
reasons end up in the same bucket.

~~~
codex
Agreed, but the point was not to make developers' lives easier. Were it, they
would have put a monster multicore x86 CPU in the PS3. The idea was to get
maximum performance for minimum hardware cost (that is, a minimum number of
transistors) even if it made developers' lives harder: the console market
typically involves an initial subsidy of the hardware by the manufacturer, and
fewer transistors also means less cooling hardware in the machine.

And that is why it was so hard to program--things that the hardware used to do
for you, you now had to do in software (manual DMA, caching, coherency). If
you were smart, you could do it better than the hardware implementations
(special purpose vs. general purpose), or at least as well. In this respect it
was like a GPU--but people are more familiar with how GPUs work, and they
dominate their restrictive niche quite well already. There was no niche to
fill between hard-to-program, special-purpose, graphics-specialized GPUs and
easy to program, general-purpose, transistor-wasting traditional CPUs outside
of, say, physics engines (and supercomputers). The Cell was neither special
purpose nor general purpose.
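
For a concrete sense of what "manual DMA" meant, here is a minimal sketch of
the pattern SPU code followed, using the MFC intrinsics from IBM's Cell SDK
(spu_mfcio.h); the chunk size and process_chunk are hypothetical stand-ins:

    // SPU-side loop: there is no cache, so data is explicitly DMA'd from
    // main memory into the SPU's 256KB local store, processed, and DMA'd
    // back out, with completion tracked by tag.
    #include <spu_mfcio.h>

    #define CHUNK 16384  // 16KB: the largest single MFC transfer

    static float buf[CHUNK / sizeof(float)] __attribute__((aligned(128)));

    static void process_chunk(float *p, int n) { (void)p; (void)n; /* work */ }

    void work(unsigned long long ea, int chunks) {
        const unsigned int tag = 1;
        for (int i = 0; i < chunks; ++i) {
            unsigned long long src = ea + (unsigned long long)i * CHUNK;

            mfc_get(buf, src, CHUNK, tag, 0, 0);  // pull chunk into local store
            mfc_write_tag_mask(1 << tag);
            mfc_read_tag_status_all();            // stall until the DMA lands

            process_chunk(buf, CHUNK / sizeof(float));

            mfc_put(buf, src, CHUNK, tag, 0, 0);  // push results back out
            mfc_write_tag_mask(1 << tag);
            mfc_read_tag_status_all();
        }
    }

Real code double-buffered so the next chunk's DMA overlapped the current
chunk's compute; getting that (plus alignment and the 16KB transfer limit)
right everywhere is a big part of why the SPEs were hard to use.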

Sony bet that super-smart game developers would create great things out of
hard-to-program machines like they always had in the past (e.g. the PS1 and
PS2)--after all, the machine had a ten year lifecycle, so developers would
eventually have the skillz. However, the return wasn't worth it, for the most
part--gamers care about graphics, and that wasn't where the Cell could add a
lot of value due to the unbalanced architecture of the PS3 and the amazing
rate of progress made by GPUs. So the Cell was used for tasks that a general-
purpose CPU could do better (if less efficiently), but it was still harder to
program than a general-purpose CPU, and it was overkill for CPU tasks (i.e.
the game engine), which don't require a lot of horsepower in a game and don't
expose much parallelism.

However, Intel is targeting the Xeon Phi at areas the Cell was good at (lots
of parallelism, lots of unpredictable branching--mainly supercomputing),
albeit with hardware that is slightly easier to program. I think Sony missed
the mark slightly, trading a bit too much software complexity for hardware
savings. There is definitely a sweet spot. They would have been better off
with a more powerful CPU and fewer SPEs, or SPEs that were slower but had a
real cache.

------
andrewcooke
how did amd get both of these? is it because they can offer cpu/gpu combos,
which is more attractive than trying to combine solutions from intel and
nvidia? that would be good; but what worries me is that they are so desperate
that they have offered an unsustainable price. presumably there's background
to this i've not seen. what are the rumours?

~~~
maximilianburke
I recall reading that when Microsoft selected IBM and AMD to design the CPU
and GPU respectively for the Xbox 360, they wanted (and received) the IP so
they could produce the chips themselves. The original Xbox depended on an
Intel processor they did not have the same rights to, so they could not, for
example, move it to a smaller or cheaper process as the console's life went
on.

I'd imagine that this, combined with the fact that the market for powerful
number-crunching processors is somewhat narrower now that IBM has taken
itself out of the consumer processor game, meant they did not have many
vendors to choose from for the CPU/GPU. ARM processors are coming along, but
I think they are still a ways out from the speculated performance of what is
going into the next generation of consoles.

~~~
MBCook
I thought the problem with the original Xbox was the nVidia GPU, and that it
was nVidia that refused to do a process shrink.

~~~
wmf
It is rumored that Nvidia felt burned by the contract on the Xbox and they
didn't want to bear the risk of a shrink.

------
tpurves
what's interesting is that, by desktop x86 standards, CPU power in these
consoles is going to be rather anemic. AMD's "Jaguar" cores are optimized for
power, not performance, at only 1.6GHz; that means max single-threaded
performance (which is still important in games) will probably fall short of
even what today's base-model 11" MacBook Air is capable of (a 1.7GHz i5, and
the i5 core has much higher IPC than any of AMD's cores). And that's at
launch; how will these CPUs compare to laptop/desktop CPUs just a few years
after launch? not well.
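
as a crude model (the IPC figures below are made-up placeholders, only there
to illustrate the argument), single-threaded throughput scales roughly with
IPC x clock:

    // perf ~ IPC * clock. Both IPC values are hypothetical assumptions,
    // not measurements of Jaguar or Ivy Bridge.
    #include <cstdio>

    int main() {
        double jaguar_ipc = 1.0, jaguar_ghz = 1.6; // assumed console core
        double i5_ipc     = 2.0, i5_ghz     = 1.7; // assumed 11" Air i5

        printf("i5 single-thread advantage: ~%.1fx\n",
               (i5_ipc * i5_ghz) / (jaguar_ipc * jaguar_ghz));
        return 0;
    }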

But the design will allow for small, sleek and quiet boxes.

It's clear that the competition they are really targeting here, from a
hardware perspective, is tablets, Apple TVs and anything ARM-based. Those AMD
cores will still wipe the floor with any current ARM designs, especially when
coupled with an AMD GPU. The designers of both platforms have apparently
decided that PC gaming isn't worth competing with; it's device-based gaming
(and Google and Apple's game/app stores) that they are targeting with this
form factor.

Meanwhile the gap between what a PC can render and what a console can render
is only going to get a lot wider this generation.

Or maybe we all just need nVidia's grid solution to come to reality and
disrupt the whole model. (e.g. end the hardware race by just putting huge
racks of GPUs and CPUs near the edge of the cloud and run your games as VMs on
any handy screen or device)

~~~
mtgx
Aren't those AMD cores even slower than Atom? How will it "wipe the floor with
ARM chips" when Cortex A15 is significantly faster than Atom?

~~~
wmf
Brazos is already faster than Atom, so Jaguar should be much faster than Atom.

------
meaty
Which one allows me to sell my games when done?

That's the only spec I care about.

~~~
danellis
Performance specs don't sell games. Look at the Wii.

~~~
kristofferR
The Wii U is, so far, a huge failure, probably for many different reasons (the
name is a big one), but one of the main reasons is likely the poor specs, it's
just slightly faster than the old Xbox 360 and Playstation 3.

If it were a "real" next-gen console with specs similar to those of the
upcoming Xbox and Playstation, instead of just essentially being a last-gen
console with a fancy controller, a ton of people would care.

~~~
bluedanieru
What the fuck are they thinking with that name. Wii as well, for that matter.

------
Symmetry
I'm sort of sad they didn't use heterogeneous cores for these. 2 Steamroller
threads plus 4 Jaguar threads would be pretty interesting. Given that game
developers know the core count, they could just lock threads to the
appropriate cores, which they do anyway. Same shared memory space as normal,
unlike the Cell's SPEs, but you get throughput when you want it and relatively
high single-threaded performance when you want that.
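
A minimal sketch of what "lock threads to the appropriate core" looks like,
using Linux pthreads as a stand-in (an actual console SDK would expose its
own affinity call, and the core number here is arbitrary):

    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>

    static void *worker(void *arg) {
        (void)arg;
        printf("worker running on core %d\n", sched_getcpu());
        return NULL;
    }

    int main(void) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(3, &set);  // pin to core 3, e.g. a designated "throughput" core

        pthread_attr_t attr;
        pthread_attr_init(&attr);
        pthread_attr_setaffinity_np(&attr, sizeof(set), &set);

        pthread_t t;
        pthread_create(&t, &attr, worker, NULL);  // starts already pinned
        pthread_join(t, NULL);
        return 0;
    }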

------
batgaijin
Weird that they don't mention Gaikai... I think that's going to be the real
edge for the PS4.

Of course it depends on a fictional and amazing future network, but
considering all relevant information I think that it will most likely be the
winning edge for the PS4 in the next decade.

Sony is going to fucking print money when they have a subscription for a
streaming gaming service.

~~~
Yhippa
I'm really disappointed that these consoles are still fighting an arms race. I
was hoping these boxen would really take to the cloud and finally figure out
streaming, but I guess we're just not there yet.

~~~
jsnell
Why? It seems like a completely ridiculous idea (latency-sensitive, incredible
waste of bandwidth). I honestly can't figure out what the purported benefit
is.
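
The latency budget is easy to sketch (every number below is an assumption for
the sake of argument):

    // Illustrative input-to-photon budget for streamed vs. local frames.
    #include <cstdio>

    int main() {
        double to_server = 30;  // ms: input upstream, incl. jitter buffer
        double render    = 16;  // ms: one 60fps frame server-side
        double encode    = 5;   // ms: hardware video encode
        double to_client = 30;  // ms: compressed video downstream
        double decode    = 5;   // ms: decode + present

        printf("streamed: ~%.0f ms input-to-photon\n",
               to_server + render + encode + to_client + decode);
        printf("local:    ~%.0f ms (render + display)\n", 16.0 + 16.0);
        return 0;
    }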

~~~
mynameisvlad
But it's _the cloud_! /s

------
lucian1900
Game developers will almost always take more memory over faster memory, as
long as the slower one's bandwidth is still sufficient.

Bandwidth of storage is abysmal by comparison.

~~~
KVFinn
Not really:

<http://forum.beyond3d.com/showthread.php?t=62108>

>Usable memory amount is very much tied to available memory bandwidth. More
bandwidth allows the games to access more memory. So it's kind of
counterintuitive to swap faster smaller memory to a slower larger one. More
available memory means that I want to access more memory, but in reality the
slower bandwidth allows me to access less. So the percentage of accessible
memory drops radically.

In this case, if the next Xbox really is at 60 GB/s and the PS4 at 200+,
that's a much larger disparity than in the current generation.
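
Putting numbers on that: at a fixed frame rate, bandwidth caps how much memory
one frame can even touch once (using the 60 and 200+ GB/s figures from above):

    // Max bytes touchable per frame at 60fps for a given bandwidth.
    #include <cstdio>

    int main() {
        const double fps = 60;
        const double bw_gbs[] = { 60.0, 200.0 };  // figures quoted above

        for (double bw : bw_gbs)
            printf("%5.0f GB/s -> at most %.2f GB touched per frame\n",
                   bw, bw / fps);
        // A big pool behind a thin pipe is partly dead weight: you can fill
        // 8GB, but each frame can only stream through a fraction of it.
        return 0;
    }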

------
mtgx
I thought Sony was going to use the full OpenGL API for PS4?

~~~
kevingadd
IIRC they've exposed access to OpenGL on the PS3 but nobody uses it due to
performance/feature set issues.

~~~
mjn
There doesn't seem to be anything official about how it's implemented under
the hood, but the widespread assumption is that OpenGL on the PS3 is an
emulation layer on top of the native graphics API, so performance tweaks are
easier if you use the native API directly.

~~~
pjmlp
If I am not mistaken, even the engine provided by Sony in their SDK does not
use it.

FireEngine if memory does not fail me.

~~~
toksaitov
PhyreEngine

~~~
pjmlp
Thanks!

------
WizardlySquid
Microsoft should advertise their next system to the audience they have, not
the audience they want. They should start by calling the system the Xbox 420
instead of the Xbox 720.

