
The rise and fall of the PlayStation supercomputers - lelf
https://www.theverge.com/2019/12/3/20984028/playstation-supercomputer-ps3-umass-dartmouth-astrophysics-25th-anniversary
======
DCKing
The PlayStation 1, 2 and 3 are each, in their own way, Sony's take on an
alternate history of computer graphics. Sony's intent with all three
consoles was to put the "special effects" (which in 2019 are performed using
shaders) _in the CPU_ , and not the GPU.

As a result, the PS1, and especially the PS2 and PS3, were engineered to have
amazing vector capabilities in the CPU. The PS2's Emotion Engine had extremely
high-performance vector processing units, and was so optimized for compute
throughput that it used two different floating point formats, neither of which
is compatible with the IEEE 754 standard that every other piece of hardware
you've ever touched uses. The PS3 was the culmination of that philosophy, with
the Cell explicitly developed to put as much raw number-crunching hardware as
possible into something that can still be called a CPU. It had just a single
unimpressive POWER-compatible CPU core, and 60% of its die area went to
specifically engineered number-crunching units. This is what made it
"supercomputer grade".
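The IEEE 754 point can be made concrete. Standard single-precision floats reserve bit patterns for infinity and NaN, which the PS2's vector-unit formats reportedly dropped, clamping to the largest finite value on overflow instead. A minimal Python sketch of the standard's bit patterns (the PS2 clamping behaviour is an assumption based on common reporting, not something testable here):

```python
import math
import struct

# IEEE 754 single precision reserves exponent 0xFF for +/-inf and NaN.
inf_bits = struct.unpack('<I', struct.pack('<f', math.inf))[0]
print(hex(inf_bits))  # 0x7f800000: the +inf bit pattern

# The largest finite float32: the value the PS2's vector units reportedly
# clamped to on overflow, instead of producing infinity.
max_finite = struct.unpack('<f', struct.pack('<I', 0x7f7fffff))[0]
print(max_finite)  # roughly 3.4e38
```

Code written against either convention silently disagrees with the other at the extremes, which is why "not IEEE 754" was such a headache for emulator and middleware authors.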

There's a persistent rumor that the Cell was so powerful that Sony had
intended for the Playstation 3 to be equipped with _two_ Cell chips, and no
true GPU at all. But it seems Sony realized during development that their
philosophy had no future. Sony compromised and introduced a standard GeForce
GPU. Compared to the more conventional architecture of the Xbox 360, the
compromised design still had slightly more potential but it was a lot more
difficult to work with for games, and more expensive to produce as well.

Starting with the last generation of consoles, homegrown graphics
architectures aren't tried anymore. You can only get performance, and
especially good developer buy-in, on commodity hardware. As a result, the PS4
and Xbox One use PC hardware. Their successors will too. The Nintendo Switch
shares a chip with the Nvidia Shield TV and Google Pixel C tablet. Console
hardware is not really interesting anymore, if you ask me.

~~~
dehrmann
> As a result, the PS4 and Xbox One have PC hardware.

Does this mean the Xbox was pretty revolutionary for showing that you really
can make a console with commodity hardware?

~~~
Narishma
Didn't they have to kill it only a couple of years in because it wasn't
cost-effective? IIRC it was sold at a huge loss that they never managed to
recoup.

~~~
wmf
The original Xbox had some contractual problems that prevented it from being
cost-reduced. I don't think MS cared if they broke even since they were in it
for the long haul. Everybody knows Xbox 3.11 for Workgroups was the first good
Xbox anyway.

~~~
llampx
Very different from the Windows Phone philosophy, then.

~~~
TMWNN
Not necessarily. PS2 dominated its generation; Xbox did OK, but only a company
Microsoft's size could have sustained it (and even Microsoft superseded the
console in four years, unusually short for that generation). Xbox 360,
Microsoft's second try, was hugely successful, beating PS3 everywhere outside
Japan. Had it only been as successful as Xbox was, though, Microsoft might not
have tried again.

No Windows Phone iteration ever had more than a few percentage points of
share. iOS and Android both clearly destroyed it in the market. Had WP seen a
few years of sustained success it would still be around today, sort of like
how BlackBerry the brand is still around, albeit for Android-based devices.

~~~
Narishma
The Xbox 360 only beat the PS3 in English-speaking countries. By the end of
the generation the PS3 had outsold it world-wide, if only barely.

------
rvz
This was the moment Sony could have made something useful of the situation,
by allowing Linux to be used on a popular games console and improving the
accessibility of installing Linux on a computing machine (or even basing
GameOS on Linux).

Instead they cowardly removed it due to a 'hack' by Geohot and locked the
console down for 'security purposes', and potential researchers, scientists
and those without access to supercomputers are the ones who suffer from this
decision.

Sony chose to appeal to their gaming community over the science community, so
that's that.

Also, it's 'OtherOS', not 'OpenOS'.

~~~
grogenaut
(I'm sure I'll lose karma for explaining Sony's side, since some people just
want to be mad)

Linux support was not removed due to Geohot. It was a driver and business
priority issue. Geohot just sealed the deal.

Having worked there during the kerfuffle, I can confirm their reason for not
supporting Linux going forward was valid. It was a non-trivial effort, and
they wanted those engineers working on the new hardware turns instead. It was
a company-wide goal to hit break-even at retail pricing at that time (Sony
was amazing at running hardware cost forecasting internally, and IIRC hit
their 7-year target by a week). This required the new hardware. So they
dropped Linux. That part is real.

Then they effed up by not messaging it well and by dropping promised back
support.

And then a bunch of the community responded as petulant children, attacked
Sony's main business, and released products that allowed others to infringe
Sony's and other game studios' property with impunity. These actions killed
any chance Sony would engage with this community for the next decade or so.

Sony was also backed into a corner technically, as I recall: the thing that
got hacked (by realizing there was a null crypto key on the one processor),
and thus allowed Linux, was also the thing that did the DRM. Sony had to
choose between another year or so of DRM or Linux support.

As in many cases, both parties are at fault. Contracts can be and are broken.
Sony paid multiple prices for it. So did the Linux/hacker community.

~~~
hyperman1
I am taken aback a bit by this characterisation:

> And then a bunch of the community responded as petulant children

Sony supported Linux in public, people bought their hardware for it, and then,
without warning, Sony yanked the rug from under them. Of course people tried
to take back the hardware they had paid for. The blame here is 100% on
Sony.

Sony has a massive internal conflict of interest by being both a content
creator and a hardware provider. They sabotage their own products to the
detriment of their customers again and again.

Sony worked with the community for a few years, and had the longest-unhacked
console to show for it. Then they slapped the community in the face and paid
dearly for it, as they should. Sony should beg the community for another
chance.

~~~
grogenaut
When you have a legitimate beef or problem with someone, you don't take
illegitimate and illegal actions against them. I'm all for reverse
engineering. That's fine.

Extorting a company by threatening to release an exploit if you don't get
your way is not legitimate; that's blackmail or extortion.

All of the hackers attacking Sony for months, disabling PSN, is not
legitimate, and should never have been supported by anyone.

Just let the courts handle it like they should and did.

~~~
monksy
> illegal actions against them

What's illegal about reverse engineering a physical product you sell to
someone?

> Disabling psn

Are you kidding me? The PSN was notoriously insecure. Sony didn't give a crap
until they were embarrassed.

~~~
grogenaut
Read my post.

Reverse engineering is legal and, to me, moral.

Extorting a company with the threat of releasing a thing to get what you
want: that's extortion, and illegal. This was done by people around Geohot.

Agreed: PSN was insecure. Hacking is still illegal.

By taking these actions we lost the moral high ground as hackers. The class
action by itself would have been just fine.

~~~
monksy
The class action only happened because it was hacked. Sony just threw up its
hands and said "we don't care" until they were forced to.

------
capableweb
When reading through this I was sure it was about Folding@home, but it turns
out this was a different project! Strange that Folding@home isn't even
mentioned; if I recall correctly, you could easily install Folding@home via
the store on the PS3, and a normal user could contribute. The Verge article
is more focused on customized machines (rooted to run Linux and housed in one
location) rather than using the standard OS and letting contributors keep the
machines at home. Folding@home seems to still be running to this day as well!
Plenty of information about it here:
[https://en.wikipedia.org/wiki/Folding@home#PlayStation_3](https://en.wikipedia.org/wiki/Folding@home#PlayStation_3)

~~~
vagab0nd
Off-topic, what's the current state of protein folding simulation? Is it still
an unsolved problem? It was mentioned pretty often as a hard problem but I
stopped hearing about it a couple years ago.

------
gok
The entire interest in this came from the fact that Sony was selling PS3s at a
loss, hoping to make up for it with game sales. They were willing to put up
with people buying them for compute for a little while, for the marketing
value, but it was never going to be a sustainable system.

------
needle0
"Supercomputer projects needed the original PS3, not the PS3 Slim, because
Sony had removed Linux support from the console in response to hacks — which
later led to a class-action settlement. This article originally stated that it
was because the PS3 Slim was less powerful. We regret the error."

Wonder what made the writer originally assume that? A mid-generation console
refresh is the one occurrence where great care is taken to keep performance
not faster, not slower, but completely identical to the original hardware. I
remember reading somewhere that the Xbox 360 refreshes had some hardware to
intentionally add latency to inter-chip communication to keep performance
identical.

------
aorth
Seriously surprised that the article didn't mention Iraq buying 4,000
PlayStation 2s in the year 2000. I think Saddam was doing some
supercomputing.

[https://www.theregister.co.uk/2000/12/19/iraq_buys_4000_play...](https://www.theregister.co.uk/2000/12/19/iraq_buys_4000_playstation_2s/)

------
j1vms
> "The Air Force had to convince Sony to..."

To be a fly on the wall during that discussion. I'm sure either money wasn't
an issue for the Air Force, or Sony had some balls of steel.

~~~
callalex
Are you implying that the Air Force is in the business of making “offers that
cannot be refused” especially to foreign owned companies?

~~~
redis_mlc
Well, actually yes.

Notice how there are very few military aircraft companies in the USA?

That's because the little fish got a visit from Washington saying, "that was
your last contract unless you merge."

Northrop tried to resist, for a while anyway.

Regarding foreign companies, the Avro Arrow was cut up after the US offered
Canada a northern missile defense network. The Arrow was an early Mach 2
interceptor - think of the foreign sales possibilities.

Many of the designers ended up at NASA.

~~~
rasz
Actually: [https://www.nytimes.com/2019/11/20/opinion/military-right-to-repair.html](https://www.nytimes.com/2019/11/20/opinion/military-right-to-repair.html)

------
clay_the_ripper
> the PS4 can’t easily be turned into a cog for a supercomputing machine.
> “There’s nothing novel about the PlayStation 4, it’s just a regular old PC,”
> Khanna says. “We weren’t really motivated to do anything with the
> PlayStation 4.”

The article doesn’t really explain in detail what made the PS3 an attractive
buy for building supercomputers over regular PCs. Can someone elaborate?

~~~
capableweb
From Wikipedia about another, similar project (Folding@Home)

> At the time of its inception, its main streaming Cell processor delivered a
> 20 times speed increase over PCs for some calculations, processing power
> which could not be found on other systems such as the Xbox 360.[36][190] The
> PS3's high speed and efficiency introduced other opportunities for
> worthwhile optimizations according to Amdahl's law

> The PS3's uniform console environment made technical support easier and made
> Folding@home more user friendly.[36] The PS3 also had the ability to stream
> data quickly to its GPU, which was used for real-time atomic-level
> visualizing of the current protein dynamics

[https://en.wikipedia.org/wiki/Folding@home#PlayStation_3](https://en.wikipedia.org/wiki/Folding@home#PlayStation_3)
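The Amdahl's law reference in the quote can be sketched numerically: even the quoted 20x speedup on the parallelizable part is bounded by whatever fraction of the job stays serial. A minimal illustration (the 90% split below is a hypothetical figure, not from the article):

```python
def amdahl_speedup(parallel_fraction, factor):
    """Overall speedup when only `parallel_fraction` of the work
    runs `factor` times faster (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / factor)

# If 90% of a folding workload gets the quoted 20x Cell speedup,
# the job as a whole only speeds up about 6.9x: the serial 10% dominates.
print(round(amdahl_speedup(0.90, 20.0), 1))  # 6.9
```

Which is why the Folding@home team cared about squeezing more of the pipeline onto the SPEs: shrinking the serial fraction matters more than accelerating the parallel part further.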

~~~
agumonkey
Ironically, this is the kind of massive research effort that got Sony into
trouble, because it cost a ton and was hard for developers to master (Sega
felt this hard a decade prior), hence the Nintendo U-turn and the
Sony/Microsoft 90%-mainstream hardware follow-up.

------
moonbug
It was a cute way to get a Cell processor running Linux for an order of
magnitude less than any other option, during the brief window where that
processor looked like it was going to be the future of high-performance CPUs.
(Anyone else here have the misfortune to work with LLNL's Roadrunner?). But
then IBM killed it, and Nvidia came along and ate the market.

~~~
dreamcompiler
LANL had the Roadrunner. LLNL had BlueGene/L at the time.

~~~
moonbug
quite so. time and autocorrect.

------
walrus01
That wire rack shelf (sans wheels) is still how some low budget
colocation/dedicated server operators set up their hardware. Minitower or mid-
tower microatx/ATX sized cases with low cost motherboards, CPU, power
supplies. A cheap microatx sized tower case with one rear exhaust fan is $35.

Each system with one or two cat5e cables and a single power cable. In really
low cost colocation facilities the limiting factor is often total kW of
electricity and cooling available, not square footage of floor space.

It's been a common thing in the datacenter/dedicated server business for about
18-20 years.

The same general idea was adopted by Bitcoin miners using ASIC hardware, and
later on with Ethereum mining again with systems using two or four GPUs
connected to a low-cost ATX motherboard.

~~~
lozaning
If you're not running 7+ cards on a mining rig you're probably doing it wrong.
The higher the number of cards you run in a rig the more you can distribute
the cost of the non mining hardware (HD, mobo, cpu, memory) across cards.

A mobo that supports 8 cards might be double the cost of a board that supports
4 cards, but it prevents you from having to buy a second cpu, a second hard
drive, second psu, and more memory to put in your second cheapo mobo.

Now, if you can buy random cheap ATX boxes for less than $27 per PCI-E slot,
you are better off initially, but then management across all your various
junk boxes becomes a PITA quick.
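The amortization argument above can be put in numbers. A quick sketch (all prices hypothetical, just to show the per-slot math):

```python
def cost_per_slot(board_cost, shared_parts_cost, num_gpu_slots):
    """Non-GPU overhead (board + CPU/RAM/storage/PSU) spread across
    the GPU slots it supports."""
    return (board_cost + shared_parts_cost) / num_gpu_slots

# Hypothetical prices: the 8-slot board costs twice as much as the
# 4-slot board, yet still cuts per-GPU overhead by sharing one
# CPU, drive, and memory kit across more cards.
print(cost_per_slot(100, 200, 4))  # 75.0 dollars of overhead per GPU
print(cost_per_slot(200, 200, 8))  # 50.0 dollars of overhead per GPU
```

The crossover point depends entirely on local prices for boards and salvage boxes, which is the comparison the comment is gesturing at.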

~~~
walrus01
The last time I spent 15 minutes doing a possible ROI calculation on a GPU
based Ethereum mining system, about a year ago, I came to the conclusion that
the upfront cost of buying the GPUs alone, plus the electricity (entirely
ignoring the cost of a dedicated mining purpose motherboard and cpu/ram, power
supply, etc) would not have a reasonable payback period. Even if the
electricity was 1.5 cents a kWh, and cooling was free.
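That kind of back-of-the-envelope payback check looks roughly like this (all figures hypothetical, in the spirit of the 1.5 c/kWh scenario above):

```python
def payback_days(hardware_cost, daily_revenue, watts, price_per_kwh):
    """Days until mining revenue, net of electricity, recoups the hardware.
    Returns infinity if power costs exceed revenue."""
    daily_power_cost = watts / 1000.0 * 24.0 * price_per_kwh
    net_daily = daily_revenue - daily_power_cost
    return float('inf') if net_daily <= 0 else hardware_cost / net_daily

# Hypothetical rig: $3000 of GPUs, $2.00/day of coin, 900 W at $0.015/kWh.
# Comes out around 1790 days, i.e. nearly five years to break even.
print(round(payback_days(3000, 2.00, 900, 0.015)))
```

Even with essentially free electricity, the GPU capital cost alone stretches the payback period past any horizon where the hardware (or the coin's difficulty curve) stays relevant.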

I've seen the various motherboards from Taiwanese board manufacturers which
are designed for mining, with many small pci-express 3.0 slots intended for
use with riser ribbon/interface cards, such as for use as you describe with 8
GPUs attached to one motherboard.

[https://www.techradar.com/news/best-mining-
motherboards](https://www.techradar.com/news/best-mining-motherboards)

~~~
OldHand2018
I did some Ethereum mining for a little while and simply couldn't believe that
people were buying/building new systems. PCI 3 is backwards compatible with
PCI 2, which has far more bandwidth than a mining GPU will ever need.

I bought surplus Core2Duo systems for $10 and slapped in a cheap SSD and a
1060 6GB. The CPU would sit there at 99% idle and the power supply never broke
a sweat.

~~~
walrus01
That is a very good point. The calculations I did were well after the drop in
Ethereum's value (early 2018), and assumed that the case, motherboard, CPU,
RAM and power supply cost 0 dollars: just the cost of electricity and several
thousand dollars of GPUs. Even then it didn't work out.

------
krilly
I would expect this to be more viable than ever, now that the cost of console
hardware is so heavily subsidised by online subscription services.

~~~
masklinn
The CPUs of modern consoles are pretty standard general-purpose CPUs: the PS4
and XB1 use customised AMD APUs, and the Switch uses a Tegra X1. That's not
useful for supercomputers.

The PS3 used a pretty weird CPU, which importantly had a single general-
purpose core but 6 SIMD co-processors, making it _very_ useful for compute
tasks on supercomputers.

These days you'd use off-the-shelf compute-oriented FPGAs or stream / GPGPU
units paired with general-purpose CPUs, e.g. OLCF-4 has 9216 general-purpose
CPUs driving 27648 Tesla V100s.
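The kind of workload those SIMD co-processors (and today's GPGPU nodes) excel at is bulk data-parallel arithmetic. A toy sketch, using NumPy's vectorized operations as a stand-in for SIMD hardware:

```python
import numpy as np

# One million float32 samples: the kind of bulk array the Cell's
# SPEs were built to chew through.
x = np.linspace(0.0, 1.0, 1_000_000, dtype=np.float32)

scalar_result = [2.0 * v + 1.0 for v in x]  # one element at a time
vector_result = 2.0 * x + 1.0               # whole array per operation

print(np.allclose(scalar_result, vector_result))  # True
```

The two paths compute the same thing; the difference is that the vectorized form maps onto wide parallel execution units, which is the property that made a console CPU attractive to supercomputer builders in the first place.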

------
Finnucane
I have a recollection that, at the time this sort of thing became possible,
there was some worry that unfriendly nations could use it to get around
technology export restrictions ("what, no, we're not buying a supercomputer,
we're just buying 500 game consoles for our weapons designers").

------
WrtCdEvrydy
I love the plug for Person of Interest and was surprised that the original
ones from the Air Force ended up there.

------
gautamcgoel
It would be awesome if you could install Linux on modern Xbox or PS4 consoles
and use them as PCs...

~~~
ascagnel_
I’m not sure how much of a benefit that’d be at this point — the PS3 was some
custom hardware that was unique to its era, while the current consoles are
5-year-old x86 chips running custom OSes. Putting Linux into that mix only
gets you a stable, but outdated, hardware platform.

~~~
gautamcgoel
I was really thinking more about the upcoming PS5 and Xbox Scarlett, which
both should have Zen 2 CPUs and Navi GPUs, along with a generous helping of
GDDR6 memory and SSDs.

~~~
ascagnel_
At launch, those systems will probably be very competitive with off-the-shelf
PCs. By 2022, they won't look as good.

One thing these new consoles are doing that seems interesting is that they use
a huge pool of unified RAM (rather than two independent pools of main+video
memory).

------
investologia
The article neglects to mention that the PS3 supercomputer was popular
because Sony sold the consoles at a loss.

No hardware manufacturer does this any more.

