This is one of those headlines that initially reads like a joke until you think of the economics of it. "cost is about 5-10% of the cost of an equivalent system built with off-the-shelf computer parts."
I wonder if this is due to the fact that the Playstation hardware is (was?) competitively priced to encourage revenue generation through games? Or was Sony simply very good at mass-producing these units?
Sony subsidizes the cost of the hardware in order to sell games. They were not happy with the Air Force doing this, and it probably played a role in Sony pushing out an update that disabled Linux on all PS3s.
They were very happy about the air force doing this.
The whole point of OtherOS (and the official Linux port to the PS2 before this https://en.m.wikipedia.org/wiki/Linux_for_PlayStation_2 ) was to get the system classified as a general purpose computer rather than as a game console because that gave them import tax benefits in quite a few jurisdictions.
That's interesting. There must be other products with "odd features" for tax reasons. There was a case where Ford installed seats in a particular model of van so that it could be imported to the US as a passenger vehicle, but they removed the seats before selling. They ended up getting fined of course, but probably wouldn't have if they left the seats in and allowed the customers to remove them...
Probably the best known example is Converse. "The felt lining on the bottom of the sneakers allows Converse to classify their product as slippers, so the company benefits from a much lower tariff rate." https://blogs.pugetsound.edu/econ/2019/02/18/tariff-engineer...
Similarly, to my understanding, some cameras are perfectly capable of taking video, but disable the feature altogether (or impose a 30-minute recording time limit) to be classified as a still camera rather than a video camera for tariff reasons.
> it probably played a role in Sony pushing out an update that disabled Linux on all PS3s
Pretty sure that was due to it being used to hack the PS3.
> Blame for the latest culling has been pinned on computer hacker George Hotz, who was originally infamous for unlocking Apple's iPhone. In January of this year Hotz claimed that he had successfully hacked Sony's PS3 by exploiting Linux, gaining "read/write access to the entire system memory, and HV [hyper-visor] level access to the processor".
> Hotz released this to the public on 26 January, boasting, "Sony may have difficulty patching the exploit". He may well have been right, since Sony's latest response has been to completely lock off the required 'Install Other OS' feature. Shame on pirates, shame on Sony.
The death of otherOS was entirely to do with the fact it facilitated a lot of the reverse engineering efforts on the PS3, nothing else. As others correctly note, this was largely in response to George Hotz's hacking which required otherOS. Sony actually exposed themselves to legal action in many countries by selling a customer a product with an advertised feature then removing it after the fact, as such behaviour can often fall foul of sale of goods legislation.
If you lived through this period and followed the company, you'd know Sony absolutely adored that supercomputer project; the idea that the Cell processor was a supercomputer for the living room and that we'd all be using our PS3s for media editing etc. was genuinely something Sony believed back then. It all fed into much of the (at times ridiculous) marketing for the Cell chip. Sony had plans for more Cell-based devices that never materialized, too.
"First, the company will manufacture a high-end workstation using the Cell CPU. Planned for release at the end of 2004... the Cell workstations will be marketed directly to the game and special-effects industries. The labor in their creation will be divided between Sony and IBM. SCE will develop middleware and other tools for game development and film effects. The Cell chips themselves will be manufactured by IBM, who will also work on the OS."
It's a shame they couldn't make it work out anyway. If the cost really was 5%-10% of a similar off-the-shelf cluster, even without the subsidy it would be, what, 10%-20%? That still seems like a steal.
And Sony gets to sell their gaming console as "US Airforce proven" or "a supercomputer in a box" or whatever marketing spin they want to put on it.
And it is only a couple thousand PS3s anyway, so it is a drop in the bucket.
I wonder if there was some behind the scenes stuff going on, maybe IBM worried or angry that this would devalue Cell processors somehow.
I don't feel Sony should have had to worry about it. That many consoles would have cost around $1M at the time, a minor amount compared to Sony's total sales. It's probably a good marketing/PR budget, and messaging like "the PS3 is so good that even the military uses it instead of other computers" sounds like it could make it more appealing to customers.
The main risk would have been more companies buying subsidized consoles without ever buying games. But due to the highly specialized architecture, I'm not sure there was ever a real risk of that happening.
You think Sony didn't know the Air Force was doing this? It's not like they walked into Walmart and purchased 1,760 PS3s. It had to have been a direct purchase from Sony.
Walmart certainly had 1,760+ PS3s in their warehouses at one point. So the Air Force could have just direct purchased from Walmart (or Target, etc.) if Sony didn't want to sell to them.
2010: 2882 US Super Centers + 608 Sam's Club + 1578 Mexico stores + 321 Canada stores = 5389 stores
People underappreciated the scale of big box national retail. ;)
The article states that Sony disabled OtherOS before the USAF got their cluster built, and Sony was recalling and warehousing the PS3s that had the OtherOS feature. The USAF had to negotiate with Sony to acquire these older PS3s that still had OtherOS capabilities.
Almost 90 million PS3s have been sold, so it seems entirely plausible that they could have directly bought 2000 of them through some distributor. Doesn't the military industrial complex prefer to go through its inner circle of buddies anyway?
Having worked briefly in military procurement, I can tell you the system is set up in a very bureaucratic way.
Certainly there are companies that the military wants to buy from. For all the shit about the F-35, Lockheed Martin probably employs some of the greatest engineering teams on the planet.
The C-130, for example, is probably one of if not the greatest aircraft ever designed.
Anyways, there are certain companies that make things militaries want to buy, but for more mundane things like computers and pens and chairs, either there's a negotiated standing offer that legally has to be the first point of procurement, or it goes out to contracts. Unfortunately winning government contracts is a bit of a skill in and of itself and some firms have that skill and others don't.
> The C-130, for example, is probably one of if not the greatest aircraft ever designed.
One of the benefits of not being a "sexy" project is that you don't have everyone and their mother trying to be part of the design process. You can tell the team that designed the C-130 was given two numbers: range and payload, and every other aspect of the design was determined by the engineers.
I'm sure if you google around you'll find some articles but, from my perspective as an aerospace engineer who used to work on C-130s: It's an absolute workhorse.
If you've ever been up close to a museum fighter plane, they're in good shape. The leading edges are all smooth and polished, everything is sleek and in good condition. Line Hercs are not that. They're usually dented and covered in carbon from the exhausts. The leading edge of the wing is like three feet thick. It's a Mack truck with wings held aloft by furious amounts of horsepower.
It's dependable, reliable, and versatile. These things survive being shot at, being landed on gravel, ingesting birds into the intakes, ingesting sand into the intakes. You can start a Herc by putting another Herc in front of it and running the engines up so that the prop wash buddy-starts the aircraft behind, like bump-starting a car rolling downhill.
There are dozens of variants from the gunships to the EC-130 Compass Call and friends which carry serious business ELINT gear for secret squirrels to do secret squirrel shit with. You can put RATO pods on it. You can use it for SAR. You can drop bombs from it (and not even by throwing them out the ramp, which you could also do). You can use it to refuel fighters and helicopters aerially. You can put skis on it and land it in the snow. You can parachute from it. It's not a jet, but despite being a draggy brick of an aircraft it'll still pull almost 0.6 of Mach while carrying two Humvees. Also, those Humvees can parachute from the aircraft.
There's a reason it's so widely-used [0] and that reason is because the Herc is groovy. It's the unsung hero of nearly every military operation carried out by NATO and friends since the 1960s.
The Air Force maintains many recreational centers. I have a buddy that ran these for the Navy. Lots of game consoles, etc. I'm sure they could have explained the purchase that way.
The PS3 probably was the cheapest way to play with it in practice. There is a reason why modern supercomputers share GPU architectures with video gamers.
It's not about technical advancement, as much as economics. There are two groups of people who want TFlops of SIMD compute. Supercomputer groups, and video gamers.
Cell / PS3 was one attempt at making one device work with both groups, sharing research and economic investment.
NVidia over the next 15 years would execute these economics better however.
I mean, it’s also true that not all supercomputer groups want TFLOPS of SIMD. Some video gamers are happy with consoles made in the 1990s. Why single out the crypto people?
I thought PlayStations and Xboxes were subsidized by Sony and Microsoft respectively, as they make most of their money off games and network subscriptions. Nintendo is the only one that does not do that.
The PS3 was an extraordinarily expensive console, between the cell processor and the bluray drive (iirc this alone was several hundred dollars) and in early models also including a whole embedded ps2 implementation. So PS3 was sold at a loss, and not just a small one, but a heavy loss, like several hundred dollars per console, and even still the PS3 was derided for being far too expensive. It was a financial disaster for sony really, moves like ripping out the embedded ps2 make complete sense in that context, and they absolutely changed their business model for subsequent consoles.
Since then, consoles moved away from the exotic POWER/cell/etc custom hardware towards commodity x86 hardware based on integrated x86 APUs and haven't really been sold at a loss outside of maybe a small window at launch. PS5 moved into hardware profitability about 9 months after launch; microsoft said that the xbox series is still sold at a loss but I don't believe them because the xbox shouldn't be monumentally more expensive to build than the PS5. This is in the context of them trying to argue during the apple app store lawsuit that their lock-in on xbox store was different from the lock-in on the app store, so they have a financial incentive to make sure they "run a loss". It's either not much of a loss, or it's hollywood accounting and the money is going into their other pocket somewhere, like making the xbox division pay a parent holding company big licensing fees on every console sold.
(again, I don't agree with the "finally" spin here, this article was roughly a year after launch and they may have been turning a profit for a while before disclosing it... the consoles themselves become profitable pretty quickly.)
However, this mindset that "consoles are sold at an initial loss" still persists. They're not; Sony has said they're selling the PS5 at a profit. Previous generations also reached profitability pretty quickly after launch. It's not 2005 anymore and the PS3 is gone. Slapping some GDDR5/GDDR6 on a semi-custom APU is dirt cheap.
Even during the launch window when they do lose money it's much smaller, nobody is losing a couple hundred dollars on each console anymore like on the PS3, that model is gone.
>The PS3 was an extraordinarily expensive console, between the cell processor and the bluray drive (iirc this alone was several hundred dollars)
Sony singlehandedly did a lot toward adoption of both DVD and Blu-ray, by the PS2 and PS3. The former's ability to play DVDs out of the box was a huge differentiator between it and GameCube/Xbox. My understanding is that PS3 at launch was not only the cheapest Blu-ray player available by several hundred dollars, but also excellent quality.
>It was a financial disaster for sony really, moves like ripping out the embedded ps2 make complete sense in that context
You can see that in the history of PS3 variants. It's normal for consoles to get a major redesign to cut costs late in their history, typically just before or after the successor is out. PS3 redesigns were a) many and b) very early in the console's lifetime, relatively speaking.
I'm fully willing to believe the Xbox/PS5 is sold at a loss today, the margins are so thin that tiny changes in component prices could have enormous implications on how much money each hardware unit delivers. Neither of these consoles are iPhones, they don't have profit margins of 40% (or probably any double-digit percentage, for that matter). Transitioning from esoteric hardware has pretty much nothing to do with it, anyways: the N64, Gamecube and Wii all used non-standard architectures while being ludicrously profitable. The only truly significant advantage to using x86 in a home console is how easy it is to port/develop titles for it, not a single current-gen console uses commodity hardware besides the Nintendo Switch (since the Tegra board is commercially attainable).
Nintendo definitely has a different business strategy than Sony or Microsoft.
Anyway—the cost of esoteric / more custom hardware got higher, that’s why the console manufacturers moved away from it. It would make sense to shove a lot of custom hardware in your 3D video game console in the mid-1990s, because there is simply no other way to do good real-time 3D, and you have SGI who’s willing to sell you chip designs.
As time went on, the approach of shoving big custom ASICs in your console starts to look worse and worse. Most of the CPU vendors that previously sold you all sorts of architectures like 68K, MIPS, POWER, Cell, etc. stop trying to compete with x86 hegemony. Meanwhile, you’re making life more difficult for console developers, because these custom designs are just so different from everything else on the market.
So you get the PS3, which is expensive to manufacture, and requires a lot of specialized work to program the SPEs (painful for developers). That’s two generations after the N64, and the world has changed.
I would also be less likely to call the Gamecube/Wii architecture exotic, at least compared to the PS3.
> I'm fully willing to believe the Xbox/PS5 is sold at a loss today,
Given that sony themselves have said they're turning a profit, this sounds like a personal exercise in making yourself believe a counterfactual. Some people are into that though, like the flat earth stuff, or the people who think finland exists. See what you can talk yourself into believing, even when the facts are right there ;)
Anyway, your personal belief or disbelief or willingness to believe or disbelieve is kinda irrelevant here, given that sony has said it themselves.
> Transitioning from esoteric hardware has pretty much nothing to do with it, anyways
Yeah actually commodity hardware does have a big role in bringing down costs. Semi-custom APUs are commodity hardware compared to the standards of esoteric Cell/POWER stuff, and actually some variants are available off-the-shelf as well (see the Ryzen 4700S, which is a PS5 APU with its GPU disabled).
The fact that some custom systems were sold at a profit in the past is kinda irrelevant. The era of "commodity x86 APU with a wide gpu and GDDR memory" is qualitatively different from the era of cell, power, MIPS (PS2), and worst of all sega saturn. Nobody does the "our console is actually eight different processors in a trenchcoat segmented in three buses that you have to juggle in realtime to keep everything fed" anymore like the sega saturn or cell. And no, that's not an exaggeration: the Saturn had eight different processors that all needed to be juggled... two cpus, a sound controller, a sound processor, two video display processors, a coprocessor dedicated to managing loads off the cd-rom, and a system controller, all with different capabilities and bus access. Same for cell with its weird-ass processing element model with a ring and no direct access to system memory, etc. Those are far far different from the way x86 chips (even semi-custom APUs with different buses etc) are designed and the cognitive load was huge for developers.
Microsoft was ahead of the curve in the sense xbox was a semi-custom intel processor and an nvidia gpu, and xbox 360 was a semi-custom power processor and an ATI GPU, but Sony kept at it far too long. They bet everything on cell, the original idea was that cell could also be a gpu on the same chip but it performed so badly they had to add a commodity GPU at the last minute to try and fix it, but that left them with a cpu with a super-weird programming model and completely undocumented opaque hardware that was a nightmare even to bring up a hello world application on. Then they went "never again" and went commodity x86 SOC with everything integrated, alongside microsoft. That brought costs down a ton and fully aligned them with what was happening in the PC space.
The overall trend was clearly from the arcade/sega saturn era of highly custom, arcane architectures with lots of individual weird chips towards "CPU+GPU" arrangements and then finally just integrated APUs. And that's also the exact same time when they stopped selling things at a loss - the move towards integrated, semi-custom commodity architectures was a major part of that. Xbox One and PS4 and their refresh consoles and pro versions both moved into hardware profitability very quickly.
Also, both of your examples of profitable custom hardware were nintendo and they have always been notorious for going really cheap on their hardware. The few times they haven't, they've gotten burned.
This is my understanding as well. A big part of this was the HD-DVD v Bluray war around that time - Sony winning with Bluray would be worth millions of dollars more, so they were willing to take a loss to get a Bluray player in each person's home.
I bought my first Xbox because the Xbox One S was at one time one of the very best 4K Blu-ray players on the market at any price and also one of the less expensive ones. The fact it could also play games was merely a bonus to me at the time.
When it comes to network subscriptions, at the time of this article Playstation's online services were pretty much all free. It was one of the differentiators compared to Xbox's paid online services at the time.
That said though, the original PS3 hardware was definitely sold at a negative margin and wouldn't be profitable until four years into the console's life.
This is largely not true these days. The Switch sold at a loss at launch, but slowly turned a tiny profit as shipping prices went down. The modern revisions (Switch Lite and OLED) are also priced similarly, so any profits they're making off the hardware itself are incredibly marginal.
>I wonder if this is due to the fact that the Playstation hardware is (was?) competitively priced to encourage revenue generation through games? Or was Sony simply very good at mass-producing these units?
More likely game controllers over the past few decades have evolved to be really optimized for controlling the position and orientation of an entity in 3D space which happens to be the same problem for characters in videogames and many objects in the real world, and their ubiquity makes them both cheap and easy to integrate.
> The PS3s are the older, larger variety, since the newer slim models don't allow for the installation of Linux.
Big caveat there. Sony later removed the ability to install "OtherOS" (a.k.a. Linux or FreeBSD) on PS3s from those models. [0] I wonder if that firmware update basically neutered the supercomputer or if they were able to keep them from updating automatically.
1. No one updates supercomputers automatically. Instead you firewall the compute nodes and have a team dedicated to maintenance and support to do periodic upgrades.
2. I'm sure they had a support contract, which may have included running an entirely different firmware. I have no specific knowledge, but based on other supercomputers I've worked with, these are often very custom machines, despite the "commodity" hardware. The odds that they were running the consumer version of the firmware are low to nil.
> Rome Lab asked the Department of Defense for $2.5 million to assemble its supercomputer. By the time money to buy that many was approved in 2009, PlayStation 3s were hard to find. Rome Lab bought as many as they could — 1,700.
I can't imagine Sony really gave them custom firmware for just 2k units, right? Especially since all the money from a console is from games sales. Seems like a lot of work for nothing. Maybe they hacked something in on their own though
UGH! All this infrastructure and all this work into a software platform, and it's something that will be obsoleted in a few years by off-the-shelf GPUs built on a standard development environment backed by industry.
That's the problem of building something super-cutting-edge on a grand scale - you run the risk of making a super evolved version of an evolutionary dead end.
Sure it sounds like a cool idea at the time, the Cell optimized SETI At Home demo around the PS3 launch ripped through work units far more quickly than any Intel system.
I wonder what became of the cluster once it was decommissioned? Military surplus PS3s anyone?
The challenge was not rigging this up, but making things usable & efficient. The Cell Broadband Engine was a notoriously hard architecture to program for & needed lots of optimization.
Part of the reason why the PS4 & Xbox later pivoted towards the x86-64 ISA.
If you were building a supercomputer the CBE’s peculiarities (specifically the SPEs) were what you wanted.
And the 360 didn’t use SPEs; it had a pretty standard 3-core CPU built out of the PPE. The weird-ass structure of the CBE was not why it switched (although PPC might be).
In my naive understanding, instead of modern SIMD-type execution they were using some form of token passing between cores to execute instructions async. That made programming such multicore APUs pretty hard for the programmer. They had an SDK for mitigating some of these pain points. I could be somewhat wrong on the specifics (and the fact that I last touched any architecture-related topic was about a decade ago). Please correct me if I am wrong.
I remember in college we had a lab full of PS3s because they were cheap and powerful and there was a course you could take to learn to develop on them. The lab later expanded after a big donation from nvidia of high end gaming machines to teach students CUDA.
The courses sucked though, because everyone would take them just to get access to the labs.
In the PS2 era I considered buying one and using the "linux kit" instead of buying a desktop. I was prepared for the differences in performance, but then I learned that Sony didn't release all the drivers or specs and graphics acceleration was not available. I just gave up on the idea.
The PS3 could be used as a computer, which allowed Sony to pay lower taxes in the EU. Since it is such a closed platform, the "install other OS" feature could be disabled remotely by the vendor and, I think, without any user intervention. When it became economically "better" for Sony, they disabled the feature.
These are good examples of the problems with such closed systems.
I worked on a DARPA project a few years before this, where we were using the CBE as the core for a polymorphic processor (one with an FPGA attached to every IO). We were also gutting PS3s to make mission computers for early unmanned systems - running Ubuntu on top. The USAF wasn't the only one - not only were there commercial supply challenges with the PS3, various components were being hoarded by various nation states. We were pretty sure they didn't even know what to do with the parts, but it was a basic attempt to prevent projects like this from getting off the ground.
The Cell's SPEs were such an inspired design - if you look at CPU architecture, 99% of the complexity comes from pretending there's a flat memory space - data hazards, cache coherency, latency hiding, prediction etc.
If you make a beefy processor that works like a microcontroller - reading and writing everything from SRAM, and making all main memory DMA explicit - you get all the speed at a fraction of the logic budget.
If you look at a modern CPU pipeline, it's 20ish stages, with most of them about handling the aforementioned complexities, with fetch-decode-execute taking up at most like 6 of them. In MCUs (and SPEs) it's basically all there is.
Unfortunately they never did figure out how to write programs for such a curiosity in an accessible manner.
I'm curious if it's worth taking another crack at this architecture, this time with better tooling.
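Roughly, here's what that model looked like on the SPU side. This is a from-memory sketch using the old IBM Cell SDK intrinsics (spu_mfcio.h); the exact flags and surrounding boilerplate may be off, so treat it as an illustration of the explicit-DMA, local-store style rather than working reference code:

    /* SPU-side kernel sketch: pull a chunk of main memory into the 256 KB
       local store, crunch it, push it back. No caches, no coherency. */
    #include <spu_mfcio.h>

    #define CHUNK 4096
    static char buf[CHUNK] __attribute__((aligned(128)));

    void process_chunk(unsigned long long ea)  /* effective address in main RAM */
    {
        const unsigned int tag = 1;

        mfc_get(buf, ea, CHUNK, tag, 0, 0);    /* DMA main memory -> local store */
        mfc_write_tag_mask(1 << tag);
        mfc_read_tag_status_all();             /* block until the DMA completes  */

        /* ... SIMD work on buf here ... */

        mfc_put(buf, ea, CHUNK, tag, 0, 0);    /* DMA local store -> main memory */
        mfc_write_tag_mask(1 << tag);
        mfc_read_tag_status_all();
    }

Everything a conventional core spends transistors hiding (cache misses, coherency, prefetch) becomes the programmer's problem, which is exactly why the logic budget shrinks and why so few teams enjoyed programming it.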
I remember reading about the air force buying a lot of ps3's back in the 2000s, and that just cemented to me that information is power and any government will do all they can to own as much of it as possible. IBM just published some quantum breakthrough[1] and my first thought was that it will be used by the military or intelligence agencies. Hopefully by the time this technology trickles down to more corrupt agencies the people will have caught up.
You might as well be talking about magic when it comes to quantum computing and me, I have no idea if that new openssh standard really does protect against a quantum computer trying to break it.
Wouldn't seeing the Air Force clamoring for commodity gaming hardware usually destined for a kid's bedroom be an inspiring example of equal access to technology?
Is there any evidence of the resulting “supercomputer” being used? It’s one thing to buy two thousand PS3s. Another to deploy them into a data center. And yet another to build processes and software that uses it. The challenges involved seem discouraging.
The most surprising aspect of this story to me is that the Air Force was allowed to exercise this level of resourcefulness. I just assumed that the procurement process would stipulate 10 year support contracts or something similar.
If some officer can cobble together a proof of concept from COTS parts and show that it is just as reliable and effective as the boutique stuff from Raytheon, et al., at a way cheaper price, you bet that it will be met with considerable attention if not approval. Most ground-based military drones these days are controlled with Xbox controllers because those controllers are cheap and ubiquitous. In the 90s, Doom (with mods) was pressed into service as a simulator to help teach Marines fireteam tactics.
About the most stiff-necked branch of the U.S. military is the Navy. I'd be more surprised if the Navy approved a PS3 supercomputer, especially if it were to be run aboard ship.
I was there, but not on that project. It worked well. The reasons it was attractive were the cost savings, but more importantly the unique aspects of the Cell processor in the PS3s.
What's the best lengthy write-up of the Air Force's PS3 cluster (if any exist)? From the headlines, I had always thought it to be some marketing gimmick.
Why is that? What was it about the Cell processor at the time that was desirable?
Afaik PS3s were sold at a loss for Sony, so it seems likely that they were very beefy computationally, but I am sure the USAF could have gotten an incredible deal with Intel, IBM or AMD, so what is it?
IBM, Sony, and Toshiba (the "STI" alliance) worked together to develop the Cell architecture behind the PS3. What was really unique about the Cell was its heterogeneous design: a general-purpose PowerPC core (the PPE) paired with 8 specialized streamlined cores, the SPEs (Synergistic Processing Elements).
Michiel van der Leeuw of Guerrilla Games said in an interview [1] “Even desktop chips nowadays, the fastest Intel stuff you can buy, is not by far as powerful as the Cell CPU, but it’s very difficult to get power out of the Cell. I think it was ahead of its age, because it was a little bit more like how GPUs work nowadays, but it was maybe not balanced nicely and it was too hard to use. It overshot a little bit in power and undershot in usability, but it was definitely visionary.”
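For a sense of how that PPE-plus-SPEs split surfaced to programmers, here's a rough PPE-side sketch of farming work out to the SPEs with IBM's libspe2. It's written from memory, and the SPU binary name ("spu_kernel.elf") is made up, so treat the details as assumptions rather than a faithful excerpt from any real project:

    /* PPE-side sketch: create one context per SPE, load the same SPU ELF
       into each, and run them on host threads. */
    #include <libspe2.h>
    #include <pthread.h>

    #define NUM_SPES 6   /* the PS3 exposed 6 SPEs to software under OtherOS */

    static void *run_spe(void *arg)
    {
        spe_context_ptr_t ctx = (spe_context_ptr_t)arg;
        unsigned int entry = SPE_DEFAULT_ENTRY;
        spe_context_run(ctx, &entry, 0, NULL, NULL, NULL);  /* blocks until the SPU stops */
        return NULL;
    }

    int main(void)
    {
        spe_program_handle_t *prog = spe_image_open("spu_kernel.elf");  /* hypothetical SPU binary */
        spe_context_ptr_t ctx[NUM_SPES];
        pthread_t tid[NUM_SPES];

        for (int i = 0; i < NUM_SPES; i++) {
            ctx[i] = spe_context_create(0, NULL);   /* one context per SPE     */
            spe_program_load(ctx[i], prog);         /* load the SPU ELF image  */
            pthread_create(&tid[i], NULL, run_spe, ctx[i]);
        }
        for (int i = 0; i < NUM_SPES; i++) {
            pthread_join(tid[i], NULL);
            spe_context_destroy(ctx[i]);
        }
        spe_image_close(prog);
        return 0;
    }

The PPE side is ordinary threaded C; the pain lived inside the SPU programs themselves, which had to manage their own 256 KB local stores with explicit DMA.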
Sony actually officially supported it at the time... and then realized that since PS3 was sold at a massive loss, selling thousands of units to bulk customers who would never buy any games was costing them millions.
Subsequent consoles dropped the "sell the hardware at a loss, make it up on games" model; they only lose money during a brief window at launch, and then economies of scale take over and for most of the console's life it's profitable. It's likely that other kinds of hardware also share this model, but we just don't hear about it - the Steam Deck is running tight margins, and day-1 units cost more to build than the "average" unit sold mid-gen or late-gen, so similarly they are probably selling day-1 units at a loss even if they make a profit on the expected average cost.
Of course... even though sony is selling the hardware at a profit, they still maintain the lock-in, just like apple/etc. The linux feature never came back even though the reason for the removal went away.
> I believe they had to jailbreak these to get Linux installed on them.
OtherOS was a fully supported feature on the PS3, that enabled you to install Linux on a secondary partition on the built in HDD. At least until George Hotz used it to get hypervisor access and exploit the PS3.
This led to the feature being removed and him being sued by Sony.
Nope. PS3s were initially released with a built-in feature to install other OSes. This Air Force project got a ton of publicity. It directly led to Sony deciding to lock down the hardware and firmware on future releases.