I was out of school for quite a bit by the time the PS3 came out, but I remember installing Linux on mine and being pretty excited to program on it.
I did a little bit, but it ended up being a super PITA because I didn't want to put my PS3 on my desk and give up playing games on it on my TV.
Because I wanted to keep it with my TV, I tried to use it with a Bluetooth keyboard and mouse so I could use it from the couch.
And, well, typical Linux: you always end up spending time on the most mundane thing. Bluetooth on the PS3 under Linux at that time didn't seem to work right.
That to me is peak Linux right there. At some point I said to myself, "I wanted to play with the Cell and instead I'm working on Bluetooth," and I lost interest because of course I had real work too. Then eventually Sony disabled the whole thing.
It's really not 'peak Linux', it's just that many manufacturers refuse to even provide specifications for using their product (basically its API) without charging money and/or requiring an NDA, etc.
This makes any third party trying to support them face nightmare after nightmare of unknowns, and a complex device like a computer of any sort is nearly always a complex web of interlinked mini-computers, each doing its own thing. Like the damned Bluetooth chip, which, worse, probably has government-regulated transmission-compliance stuff baked into its software rather than hardware. That adds a whole other nightmare of firmware blobs and driver support, most often seen in other WiFi-adjacent projects such as OpenWRT (and its supported-hardware lists).
It would be very nice if some law enforced a free 'edge technical specification' for all products, available to anyone who buys them or a product containing them. That would help create a level and fair market for competitors, as well as a right and ability to repair for consumers / device owners. That sounds a lot like the effects desired by most (many?) Libre software licenses, like the GPL (and LGPL), and Creative Commons.
I built a cluster of 20,000 PS5 APUs. The problem was that they didn't have ECC memory, so it was pretty hard to repurpose them for any real work. I tried really hard to find some use cases, but unfortunately getting funding to run them was going to take longer than the time we had. It all got shut down.
Oh right... I think somewhere around the 1st gen they were cool with that. But didn't they end up getting killed in the market because they had to sell it for less than Five Hundred and Ninety Nine U.S. Dollars, effectively at a loss, while people were making compute centres out of them?
For the most part, the cluster-of-PS3s supercomputer was a myth used to hype up how powerful the PS3 was. The amount of RAM per compute node was so low that it limited the types of workloads it was good at. Then server CPUs became more powerful while the PS3's processor stayed the same.
However, PS3 clusters did find niche success as supercomputers. There are a handful of notable examples. The fastest one ever built was by the US Air Force: it consisted of 1,760 PS3s and was the 33rd-fastest supercomputer at the time. It was used for satellite image processing.
When I was paying attention during the '90s and 2000s, I remember this being the hype in magazines / on the web for pretty much EVERY upcoming console: namely, that it was so outrageously powerful it might be considered a supercomputer in its own right. Needless to say, I was hyped up and obsessing over them myself.
Let's not forget the Japanese government applying export restrictions on the PS2 because it might be powerful enough to be used for military purposes [1]. It even went so far as claims that Iraq ordered a bunch of them for that purpose [2]. That's some serious hype marketing haha.
I always thought it was a bit odd that we went from super-compute clusters built out of PS3 to the PS4/Xbone generation consoles being meh-grade laptop chips in a fancy case.
As one goes back through console history, achieving visually stunning results for the time required exotic architectures plus hand-tuned code.
Over time, state-of-the-art PC + GPU hardware improved to the point where there wasn't a functional difference versus exotic architectures. Additionally, the cost of developing leading-edge custom architectures exploded.
Microsoft really pushed this with the original "PC in a box" Xbox, but you saw the console OEMs outsource increasing parts of their chip solutions after that.
Why keep / buy entire chip design teams, when AMD will do it for you?
Especially if the performance difference is negligible.
The article also notes that the cost of developing a AAA "HD" title is so high that it pretty much needs to be released on multiple platforms to be profitable.
The PS3 was popular for compute clusters because it was sold at a heavily subsidized price at launch. That's typical for a console, where the platform owner gets a slice of all software sales, but it makes no sense for a general computing platform. The performance/$ was unbeatable because the economics were perverted.
The PS4 was sold at a break-even price and couldn't run unsigned code anyway.
OtherOS was a way for Sony to work around higher tariffs: general computing devices (which OtherOS qualified the PS3 as) fell under a lower tariff schedule than game consoles.
It was retroactively removed when people started getting close to enabling full GPU access, which had been deliberately crippled under OtherOS to prevent games from being developed without the PS3 SDK.
> It was retroactively removed when people started getting close to enabling full GPU access
It was retroactively removed when Sony started worrying about "security" concerns - aka people running unlicensed commercial games, which is usually the killer app for any firmware that supports customization and programming.
It was also advertised on the original PS3 box.
Ultimately there was a lawsuit and some PS3 owners got $10 apiece.
As a PS2Linux owner, I think that was a reaction to the way the Linux community handled PS2Linux.
Instead of following in the Yaroze's footsteps and being a way for a new generation of indies to learn how to create console games, most people used PS2Linux for classical UNIX stuff and to play games on emulators.
Note that PS2Linux had GPU access via PSGL, plus an easier high-level API for the Emotion Engine.
>>About the 33rd largest supercomputer in the world right now is the US Air Force Research Laboratory's (AFRL) newest system, which has a core made of 1,760 Sony PlayStation 3 (PS3) consoles. In addition to its large capacity, the so-called "Condor Cluster" is capable of performing 500 trillion floating point operations per second (TFLOPS), making it the fastest interactive computer in the entire US Defense Department.
>>It will be used by Air Force centers across the country for tasks such as radar enhancement, pattern recognition, satellite imagery processing, and artificial intelligence research.
On release, and for a few years after, installing Linux was fully supported (Yellow Dog was probably the most popular distro). After George Hotz demonstrated a jailbreak that would have made piracy possible, Sony released an update that permanently disabled the ability to install or run Linux.