
The x86 PlayStation 4 could signal a sea-change in the console industry - zoowar
http://arstechnica.com/gaming/news/2012/04/the-x86-playstation-4-signals-a-sea-change-in-the-console-industry.ars
======
Osiris
The rumor that they are attempting to block the used game market through some
sort of DRM scheme is disappointing. While I've only ever bought one or two
used games in the past, the move itself is a huge "F* YOU" to consumers and
leaves a bad taste in the mouth.

No one likes to feel controlled or compelled to act in a certain way, which is
why pretty much all DRM schemes have been broken and why the MP3 became so
incredibly popular.

I feel like game producers are making the same argument for the second-hand
game market as the MPAA makes for pirating movies: Every used game sale is a
lost new game sale and blocking used games is better than innovating or
changing the pricing/sales structure.

~~~
christoph
I honestly lost my faith in Sony respecting their customers when they removed
a feature I paid for from my hardware (PS3 Linux compatibility), then leaked
all my personal data through poor (non-existent?) security.

~~~
ewillbefull
Add attempts at suing reverse engineers for posting decryption keys
(mathematically derived, no less) to that list.

------
gfodor
The article fails to mention the audience this is good news for: PC gamers. If
next generation consoles are x86 based, expect to see future games being more
widely available on PC, and, better yet, expect the "best" versions of those
games (in terms of graphics, features, etc.) to be the PC versions. The only
catch is if the game developers hold back on PC releases due to fears of
piracy, but on the whole this probably will still mean many more releases on
PC.

That said, most flagship titles (Halo, Metal Gear, Final Fantasy, etc.) will
probably stick to a single console due to contractual obligations.

~~~
FooBarWidget
What? The instruction set matters very little. Pretty much all games are
written in C++ and last time I checked there are C++ compilers targeting
pretty much any architecture in existence. The graphics API is more important
than the instruction set.
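
To make that concrete, here's a minimal C++ sketch (all names here are
hypothetical, not any real engine's or SDK's API) of why the ISA is invisible
to game code: the game loop is plain C++ that compiles unchanged for any
instruction set, and only a thin graphics wrapper changes per platform.

    #include <cstdio>
    #include <vector>

    // Thin platform layer: only this function cares which graphics API sits
    // underneath. PLATFORM_D3D is a hypothetical build flag, not a real macro.
    static void GfxDrawIndexed(int indexCount) {
    #if defined(PLATFORM_D3D)
        std::printf("Direct3D draw: %d indices\n", indexCount);
    #else
        std::printf("GL/console draw: %d indices\n", indexCount);
    #endif
    }

    struct Mesh { int indexCount; };

    int main() {
        std::vector<Mesh> meshes;
        Mesh cube = { 36 };
        meshes.push_back(cube);
        // The same loop compiles for x86, PowerPC, ARM -- anything with a C++
        // compiler. The instruction set never shows up at this level.
        for (size_t i = 0; i < meshes.size(); ++i)
            GfxDrawIndexed(meshes[i].indexCount);
        return 0;
    }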

~~~
Arelius
Additionally, current-generation console hardware has a big-endian byte
order, which can make importing resources a pain at times.

~~~
figglesonrails
Surely, /surely/ this is the least of their concerns. Since when did we get so
sloppy as to assume byte ordering in files? When was the last time your PNG
loader failed on x86 because the file has big-endian fields?

~~~
NickPollard
In the console programming world, cycles are precious. You don't tend to
waste time doing things 'just in case'; you're much more likely to assume a
best-case scenario and engineer things that way.

In terms of endianness, it's not a huge problem - the toolchain normally copes
with this, as assets are built individually for each target platform. This is
what we did last time I worked on a cross-platform game anyway.
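
For the curious, here's a minimal sketch of the kind of thing an asset baker
does (assuming a little-endian x86 build machine targeting a big-endian
PowerPC console; illustrative only, not any studio's actual pipeline):

    #include <stdint.h>
    #include <cstdio>

    // Reverse the bytes of a 32-bit value. On a little-endian (x86) build
    // machine this produces the big-endian layout a PowerPC console expects,
    // so the console can load the blob with a plain memcpy and no runtime
    // per-field swapping.
    static uint32_t ByteSwap32(uint32_t v) {
        return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
               ((v << 8) & 0x00FF0000u) | (v << 24);
    }

    int main() {
        uint32_t vertexCount = 4096;               // value as the tool sees it
        uint32_t baked = ByteSwap32(vertexCount);  // value as written to disk
        std::printf("host: 0x%08X  baked for big-endian target: 0x%08X\n",
                    vertexCount, baked);
        return 0;
    }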

------
nextparadigms
The article fails to mention that the graphics quality of games made for
these consoles could be a lot more advanced than what's available on the PC
at the time, because developers get to write games directly for that specific
hardware. John Carmack has said that the DirectX/OpenGL layers can slow
performance by 4x-10x, for example.

~~~
intsunny
You'll have to forgive my skepticism, but can you link to where John Carmack
has ever made such a statement?

What I don't understand is: if the performance is being slowed by 4x to 10x,
what is it being compared to? I doubt anyone is actually coding GPU assembler
for the complex 3D scenes all the big games require. If there are no
alternatives to these APIs for letting game devs create the results they
need, then it's a lot like comparing apples to oranges.

~~~
AndrewDucker
John Carmack: So we don't work directly with DX 11 but from the people that I
talk with that are working with that, they (say) it might [have] some
improvements, but it is still quite a thick layer of stuff between you and the
hardware. NVIDIA has done some direct hardware address implementations where
you can bypass most of the OpenGL overhead, and other ways to bypass some of
the hidden state of OpenGL. Those things are good and useful, but what I most
want to see is direct surfacing of the memory. It’s all memory there at some
point, and the worst thing that kills Rage on the PC is texture updates. Where
on the consoles we just say “we are going to update this one pixel here,” we
just store it there as a pointer. On the PC it has to go through the massive
texture update routine, and it takes tens of thousands of times [longer] if
you just want to update one little piece. You start to amortize that overhead
when you start to update larger blocks of textures, and AMD actually went and
implemented a multi-texture update specifically for id Tech 5 so you can batch
up and eliminate some of the overhead by saying “I need to update these 50
small things here,” but still it’s very inefficient. So I’m hoping that as we
look forward, especially with Intel integrated graphics [where] it is the main
memory, there is no reason we shouldn't be looking at that. With AMD and
NVIDIA there's still issues of different memory banking arrangements and
complicated things that they hide in their drivers, but we are moving towards
integrated memory on a lot of things. I hope we wind up being able to say
"give me a pointer, give me a pitch, give me a swizzle format," and let me do
things managing it with fences myself and we'll be able to do a better job.

<http://www.pcper.com/reviews/Editorial/John-Carmack-Interview-GPU-Race-Intel-Graphics-Ray-Tracing-Voxels-and-more/Intervi>
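
To illustrate the gap he's describing, here's a sketch (not id's actual code;
the direct-mapped version is hypothetical and assumes a linear texture
layout, where real console textures are often tiled/swizzled):

    #include <GL/gl.h>

    // PC path: even a one-pixel change is mediated by the driver, which may
    // copy, stall, or re-swizzle internally -- the "massive texture update
    // routine" overhead.
    void UpdatePixelPC(GLuint tex, int x, int y, const unsigned char rgba[4]) {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, x, y, 1, 1,
                        GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    }

    // Console-style path: given "a pointer, a pitch, a swizzle format", the
    // update is just a store through memory the game already owns.
    void UpdatePixelDirect(unsigned char* base, int pitchBytes,
                           int x, int y, const unsigned char rgba[4]) {
        unsigned char* p = base + y * pitchBytes + x * 4;  // 4 bytes per texel
        p[0] = rgba[0]; p[1] = rgba[1]; p[2] = rgba[2]; p[3] = rgba[3];
    }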

~~~
malkia
Yes, in layman's terms: on a console you talk to the hardware like a buddy,
you trust each other, and you get business done faster.

On an OS there is no such friendship. The kernel does not trust you by
default, and there is no friendly way to get things done quickly the way
there is on consoles.

You can't schedule a DMA transfer from user space. You can't batch too many
draw calls into one submission. You can't inspect in detail what the system
is doing. You can't talk to any of the devices directly; you have to delegate
through the OS.

Granted, consoles have become more closed and "less friendly" in that
respect. But then again, security is much more important these days,
especially with online games...

------
lancewiggs
Where are the pixels?

TV manufacturers are currently delivering 1080p sets in volume and at low
prices. Nobody can really escape a commodity play, and simply making bigger
HDTVs is kind of pointless.

Apple is already delivering 1080p content, and the iPad 3 delivers better
than that in our hands. (3D is a red herring.)

So what's next? Higher resolution TV screens. IMAX in the home. iPad
functionality on the walls.

We are, in my opinion, going to continue to see a steady rise in the
resolution of our TV screens (and change in what they do), and the content
resolution will need to increase to match. That content will increasingly be
delivered over IP.

Apple is rumoured to be entering the TV market, and we know it won't be with a
me-too commodity product. So why wouldn't they launch with a higher resolution
screen, just as they did with the iPhone and iPad 3? They can control the
delivery of media through the Apple TV, and with the iPad already in the
lounge, gaming on the higher-resolution TV is essentially ready to go.

So for me, the next-gen gaming devices had better launch alongside new TVs
(Sony can do this), and with stunningly detailed graphics, or we will rightly
yawn at their arrival and stick with our computers, iPads and iPhones.

Heck - if Sony, Nintendo & Microsoft continue with their very slow release
cycles for the gaming machines, then the next generation may well be the last
- and we'll be driving big screen games using iPads and other tablets.

~~~
padobson
The future of the living room is absolutely rooted in some form of forthcoming
disruption. However, no one has meaningfully disrupted TV since the 80s when
cable, video tapes and game consoles all hit mass market appeal at the same
time.

If the rumors here are to be believed, innovation among console makers is
waning, so you're right in assuming the inevitable TV disruption is coming
from elsewhere.

It could come from Apple, but my money is on a startup.

TVs, ultimately, are just monitors. They process a signal that comes from an
external source. The first product that meaningfully augments the signal,
regardless of source (antenna, cable, satellite, streaming, game console) is
the true disruptor.

Tailoring your TV watching experience to your web browsing, social, taste,
and purchase histories is where TV will ultimately be disrupted. When you
walk into the living room to watch Mad Men, the TV should know it and adjust
the in-screen Twitter feed accordingly. It should hear you laughing at It's
Always Sunny in Philadelphia and suggest other shows enjoyed by people who
laughed at the same joke. No man should ever watch a commercial for feminine
hygiene again. No woman should watch a beer commercial that objectifies women
again. And when I see Don Draper wearing a slick hat, I should be able to
pause the show so I can buy it.

As far as I can tell, the only thing stopping this gazillion-dollar
disruption is that we can't get signal providers to play well with device
makers. Apple has made this work with cell phones, so there's every reason to
believe they could do the same for TV, but I think a startup that figures out
how to augment the signal without the provider detecting and blocking it has
a chance to become the next Apple.

~~~
untog
_When you walk into the living room to watch Mad Men, the TV should know it
and adjust the in-screen Twitter feed accordingly. It should hear you
laughing at It's Always Sunny in Philadelphia and suggest other shows enjoyed
by people who laughed at the same joke._

That sounds _awful_. The issue at hand, compared to video games, is that
television and film are not interactive; people have tried time and time
again to make them so, but I honestly think that is a mistake. I don't want
to read instant Twitter reactions to Don Draper's latest verbal beat-down of
Peggy; I want to watch it. There is nothing wrong with television and film
being a one-way experience.

 _It could come from Apple, but my money is on a startup._

I doubt it, simply because television is an extremely expensive medium. The
existing players make it more expensive than it should be, but creating TV
shows will always cost a lot. That's why the moves by the likes of Netflix
into original programming are particularly fascinating.

~~~
padobson
You're absolutely right that the passive TV peg shouldn't be forced into the
active entertainment hole.

However, when I'm watching a sporting event, I find myself looking down at my
phone or tablet and looking for reactions from my favorite sports writers -
and then I miss a play and I get frustrated. I can't be the only one having
this experience.

Also, GetGlue is proving that people want to make their entertainment social -
they want to scream what they're watching to all of their friends.

I'm not saying it shouldn't be passive, but it should be doing a lot more
than it is. When something on the screen elicits a reaction from me - be it a
need to hear someone's opinion on it, a laugh, or a desire to make a purchase
- the TV should immediately provide an outlet for that reaction without
getting in the way of the experience.

------
julianpye
This is a clear departure from the old Sony. Sony used to be all about
locking developers and content providers into proprietary platforms, raising
platform value through exclusives and discouraging cross-platform
development. They started off with cash cows based on standards such as
Trinitron and the Walkman, and got cocky after the big success of going it
alone with the PS1 and PS2. However, where they were only looking at
professional content, Apple one-upped them by lowering barriers for
developers, opening the floodgates and lowering content prices. Is this the
next step of Sony reinventing themselves under Hirai-san? Whereas other
Japanese CE companies are rapidly exiting the consumer business, Sony
certainly is putting up a fight.

------
dustin
Wonder what this means for a Steam console, particularly if PC games get more
love as a result of consoles being x86. I suppose it's good news in the end,
with less time wasted on cross-platform shenanigans and more time spent on
making games.

~~~
noobface
I wonder if Sony is basing this decision on negotiations with Valve.

Think about that possibility for a second. What would be the motivating
factors? Sony has gotten the shit beat out of them relentlessly in terms of
multiplayer/online gaming. PSN has been shuttered for weeks at a time.
Partially outsourcing this component to groups like Valve would fix many of
their problems, but in return Valve would probably want their whole portfolio
of content to be accessible.

If Sony goes x86, it'd be a shame to make such a dramatic shift and not
capitalize on a content-neutral platform.

~~~
apowers
Sony's already been dabbling in Steam support. This move would not surprise
me.

------
pagekalisedown
Sony has always had terrible developer tools and developer support.
Hopefully this will enable some of the tools available on PCs to become
available on the PlayStation.

~~~
TwoBit
Huh? Maybe you're thinking of the PS2 days or earlier. I'm a PS3 developer,
and Sony's tools and developer support are better than Microsoft's or
Nintendo's. Enough so that we have asked Microsoft if they could replicate
some aspects of what Sony does for the next generation.

~~~
hythloday
Sony's tools support got a _lot_ better after they bought SN in 2005, but it
wasn't that long ago that you couldn't do things like source-level debugging
on an SPU (I assume you can now). Compare that to something like PIX/remote
source-level debugging, which the 360 has had since day 1, and the difference
is quite clear.

~~~
maximilianburke
SPU debugging has been available since at least 2007 going by archived email
conversations I have on the topic, and I recall it working even before that.

Sony's tools stopped being annoying around 2007 and have been excellent the
last few years. In some ways they surpass what's available on the Xbox 360 --
Tuner and GPAD are more flexible than PIX and the debugger isn't tied to the
massive anchor that is Visual Studio.

------
jessriedel
Will next generation consoles still use optical disks, or will games be
delivered by flash?

~~~
vetler
Why do we need physical media? I'd prefer to download them myself. I usually
buy games in the Playstation store if they are available there. Yes, of
course, this would mean that you probably can't sell your games after you're
done with them, but that's not a big issue for me personally.

~~~
pyre
You're making several anecdotal assumptions based on your personal situation
and extrapolating it to the entire target market for the Playstation 4:

1\. Everyone has a fast internet connection to the point where downloading a
full BluRay disc of content (~25GB) is a non-issue.

2\. Everyone has an Internet connection where downloading up to 25GB of
content for a single game will not go over their transfer caps, possibly
incurring usurious overage charges.

3\. Everyone that will buy a Playstation 4 will be well-off enough that they
don't need to resell their games after they have played them in order to
afford the next game that they want to play (effectively only paying the
difference between the original cost and the resale amount).

~~~
GlennS
It works well enough for Steam.

Consider also that we're talking 2013 onwards. Connections continue to
improve.

~~~
zanny
Not in rural, or even a lot of suburban, America. Telecom monopolies and all
that. My mother has 300 kbps DSL, which she has had for a decade, and that is
from Verizon, because they have an internet monopoly in her area.

Unless we get either reform on who can lay fiber or some national fiber
program in the States, only big cities stand a chance of seeing meaningful
increases in bandwidth this decade.
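
To put numbers on pyre's point above, a quick back-of-the-envelope sketch
(idealized link, ignoring protocol overhead and caps): at the 300 kbps
described here, a 25GB download takes more than a week.

    #include <cstdio>

    int main() {
        const double gameBits = 25.0 * 8e9;  // a ~25GB game, in bits
        // 300 kbps rural DSL up to a fast 50 Mbps cable/fiber link.
        const double linkBps[] = { 300e3, 1.5e6, 10e6, 50e6 };
        for (int i = 0; i < 4; ++i) {
            double hours = gameBits / linkBps[i] / 3600.0;
            std::printf("%6.0f kbps: %8.1f hours (%4.1f days)\n",
                        linkBps[i] / 1e3, hours, hours / 24.0);
        }
        return 0;
    }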

------
rkalla
From the business perspective, we all understand the rumored incremental
hardware updates here from Microsoft and Sony; Nintendo ate everyone's lunch
for 5 years with last-gen, 3-year-old tech[1] that cost little to make, while
Microsoft and Sony hemorrhaged money until what seems like last year... not
to mention the huge technical hurdle for devs and middleware companies trying
to reach parity on each platform.

WIN: If Microsoft goes the _incremental_ route with the same Xbox OS, à la
Apple's iOS evolution, they keep middleware compatibility with larger
graphical budgets for teams to play with... EVERYONE likes this, and every
dev team that has shipped an Xbox 360 game hits the ground running
immediately -- possibly opening the doors to _dual_ Xbox 360 and Xbox 720
releases of their games (one with lower-quality textures and rougher
geometric assets, the other with the higher-end business) -- I am sweeping my
hands across a fairly complex problem here, but you get the point. With
compatibility intact, there are interesting things that can happen here that
we've seen work in the iOS ecosystem (maybe not for triple-A devs, but for
budget-minded games this could work wonderfully, especially as digital
downloads and indie devs grow).

LOSS: Sony has to make a clean cut and scuttle the last 8 years, moving to an
x86 design, a new OS, new middleware, all-new QA cycles, and retraining all
the PS3 devs.

WIN: Sony now matches the same architecture as the Xbox-next (same CPU
optimizations apply, same GPU family) and we can stop getting these damn
port-style games that never take advantage of the extra Blu-ray room or Cell
processing power.

LOSS: No revolutionary boost in performance if the hardware rumors are true.
I have no doubt the bump in specialized/tuned hardware will give us real-time
versions of the Unreal Engine 4 demos (of that smoking guy who beats up
mechs), BUT at 4K resolution and 120Hz for 3D? Not for every game. I imagine
we will see something akin to what we have now, with _most_ games running at
720p and the few rare tuned ones actually running at 1080p. I would guess
next-gen will offer 4K resolution on some titles and 3D on others while
running at 1080p, or some such combination. Obviously if it is a simpler game
like a Wave Racer, we'll get both, but for Gears of War 15 and Mass Effect
32, I doubt it.

WIN: Faster game dev cycles mean we get more impressive titles on the
improved hardware/experience sooner. We won't have to wait 2 years
post-launch for the first really _good_ game.

LOSS: Next-generation consoles are going to be engineered to be all-in-one
entertainment hubs... TV cable tuning, DVR, streaming, apps, social, games,
mobile-tie-in, etc. etc... this will make jumping between consoles harder as
you stake your claim in the Microsoft or Sony (or possibly Nintendo -- yet to
be seen if they can pull off a Network) camp and build up your life/existence
within their walled garden. This generation it was nothing more than rep
points; next generation (especially Microsoft) will be throwing every
Facebook-esque psychologically engineered trick in the book to keep you ON
their platform, consuming content through their channel with your credit card
in tow, sharing your pictures with friends, and building your character's rep
from game to game, movie to movie, and app to app.

I sincerely doubt we (the harder-core folks) will all have all 3 next-gen
systems this time around; most of us will have just one (probably Xbox-next),
as the cost of flipping between them will be bigger than just powering on
Console X to play game Y for a few hours.

LOSS x2: Double the resolution on the Kinect-next, rumored eye-tracking,
multi-person tracking and more accurate voice controls? I can't even imagine
what is in store (e.g. "You don't seem to be enjoying this episode of Lost,
would you like to skip to the next episode or can we recommend Battlestar
Galactica?" -- say "Start Mass Effect, invite Scotty32, Soldier load-out" --
experience {episode of Law and Order, McDonald's Billboard in background...
notification popup...} "Just emailed you a buy-1-get-1-free BigMac coupon for
McDonalds! The nearest one is 2 blocks from you... yummy! Say 'Pause' to pause
what you are watching and go grab lunch!")

You get the idea...

[1] <http://en.wikipedia.org/wiki/Wii#Technical_specifications>

~~~
kooshball
I disagree with you on LOSS x2: Kinect v.next.

Future generations of Kinect will bring astonishing changes to the world of
gaming. You assume the new capabilities will be used for "malicious" ends,
which is possible, but that's a bad way to judge a technology. That's like
saying next-gen Intel CPUs will be 2x faster, so people will use them to spam
twice as much or hack passwords twice as fast.

The fact of the matter is that Kinect will continue to shape the future of
gaming. The hardware is just not good enough for hard-core gamers currently.
Once that barrier is gone, I think it can create a much more immersive gaming
experience that would benefit all gamers.

~~~
int3rnaut
"Once that barrier is gone I think it can create a much more immersive gaming
experience that would benefit all gamers."

I wonder about this. I think the lack of tactile feedback, as well as the
higher-intensity play style, will never sit right with all games or gamers,
but I do think there will be genres that become synonymous with the Kinect
and become better than they ever were--and more than dancing and fitness.
What I hope is that developers recognize this and don't just shove Kinect
down our throats, but use it when it makes sense to do so.

------
hef19898
Seems like good news for PC gamers, but bad news for NVIDIA. I don't even see
the risk of the fast-moving hardware development cycles we had in the past,
since PCs won't need to catch up with consoles in terms of performance.

The only hope I have now is that PC gamers won't have to cope with console
titles whose PC adaptations compromise gameplay. Maybe that's due more to
market share than to architecture, but hope dies last!

------
InclinedPlane
I have to say I'm rather disappointed in Ars Technica right now. This article
is nothing more than a riff on a rumor (read the first 2 paragraphs
carefully). I think that's wholly irresponsible on their part.

------
willvarfar
Classic 1st April :)

