Architecture of the Playstation 2 (copetti.org)
292 points by biwasa on Feb 23, 2021 | hide | past | favorite | 56 comments



I remember when Toshiba first showed us the R5900, which was a real screamer for its time. They couldn’t tell us at first whom they were designing it for. This was perhaps 2-3 years before the PS2 was publicly announced (at, IIRC, E3) and even before it had been discussed with any 3P developers.

In fact the secrecy before it was released was insane. Toshiba and Sony kept their technology very close to their chest and each would tell us stuff under NDA that prevented us from talking to the other about it.

I sometimes wondered if we were the only folks who could see both sides. On a couple of occasions we had to delicately get permission / forgiveness for slightly bending the NDA to make sure that some misunderstanding could be resolved (think of two companies building a bridge without looking across the river to see where the other guys are aiming).

Ken Kutaragi was (still is I imagine) an inspired thinker and leader and good supporter. He built a great team, some of whom are still friends.

One of the great, if grueling experiences of my life.


It may have been 4 years! I'm pretty sure the R5900 deal was just done when I joined in December '95, and the PS2 wasn't announced until 1999.


> Some time ago, it was discovered that the BIOS of this console could be upgraded using the Memory Card, this function was never used in practice, but neither removed (at least for most of the console’s lifespan).

The author is correct that official uses for the MagicGate-encrypted system updates were few and far between, effectively "never", but there were a couple!

There was a DVD Player installer/updater for early console models that shipped before that software was finalized, and in Japan there was an entire "Browser 2.0" update that enabled a full HDD OSD unlike the bare-bones HDD Utility Disc we got in US/Europe regions! https://youtu.be/dJnLJxVljGw?t=509

The first revision of slim PS2 still has the 40-pin ATA header on the motherboard, and it works if you solder on a connector, so who knows what could have been.


I have a... heavily modified Browser 2 install on an insanely large (for a PS2) 1TB HDD. Between that and some other tools you can install full games to the console and run them. I even figured out how to make custom Browser icons for them, although in practice selecting one game out of even a decently-sized PS2 collection is ungodly slow. It's still cool as a retro hobby project, if you're willing to jump through the hoops to get a working setup.

AFAIK the official purpose of Browser 2 was to support PSBBN and distributing software through it. They never launched that service outside of Japan. Hell, most of the games that supported the HDD got that functionality stripped out in the US. PS2 games could install themselves to the hard drive for faster load times, much like the later install features on Xbox 360 and PS3. However, as far as I am aware this was exclusively for online service games in the US and Europe. On the original Xbox, Microsoft was already trying DLC and digital distribution (no, seriously, it had it, but you needed a special Arcade disc to run games). So I have to wonder how Sony wound up sleeping on something they were already doing for a whole generation.


There were a couple of games released in Japan that got the HDD support stripped out in the North American release. Two such examples are Star Ocean 3 and Xenosaga.


The PS2 OS is truly a work of art, such beautiful sound/graphics and ui design.


The PlayStation 2 is actually one of the original reasons I got interested in the computer engineering space. The EE chip is a sort of who's-who of optimization techniques - hardware decoding, vector processing, VLIW architecture (for the VUs), memory bandwidth tailored to keeping the huge amount of functional units busy with a constant stream of math to do.

At some point I got a port of Debian (BlackRhino GNU/Linux) running on my PS2 - slowly, off a USB 1.1 thumbdrive, as I lacked the hard drive / network card adapter. Never got too far with it owing mostly to lack of the network, but in spite of being a 1999 console, it ran X11 at 1080i just fine.


Indeed awesome hardware to code for. It was amusing when the Sony guys joked that people were writing their VU assembly in Excel, to make it easier to mentally keep track of the instructions in each pipe.


This is incredibly validating. I once wrote an Excel spreadsheet to keep track of register allocation in assembly and have felt ashamed ever since


There is a work-in-progress port of the newer Linux 5.x kernel to the PS2: https://github.com/frno7/linux


I got the official PS2 Linux kit on launch and even with the hard drive and all the accessories, it was still slow as molasses. There just wasn't nearly enough RAM to make a usable desktop environment.


Nice. I have a Red Hat Linux port for PS2, but haven't booted it in almost 20 years thanks to missing a sync-on-green monitor.


Edit: why am I talking about the PS3 cell? This article is about the PS2. :facepalm: please ignore, too late to delete?

Unmentioned is perhaps the coolest fact, about how it all went down:

The leading lady of chip design Lisa Su (presently President & CEO of AMD) parlayed her chip-design experience, first doing silicon-on-insulator research at MIT, then spending some years fixing emerging copper-interconnect problems at Texas Instruments, before joining IBM, where she launched an Emerging Products Division (2001)[1] & focused on low-power chip design among other things.

She became IBM's chief point of contact with Sony in their collaboration to develop what became the PS3 (2006) Cell chip. And is largely credited with the design.

Afterwards she became VP of Semiconductor R&D at IBM before leaving for a CTO position at Freescale in 2007 (bought by NXP in 2015, in turn almost bought by Qualcomm, but China did not sign off fast enough). She joined AMD as SVP & GM in 2012, & became CEO in 2014.

Thank you Lisa. You continue to light the way. On a personal note, it is so nice to see some absolutely badass engineers given a role in companies, allowed to lead the way. It feels like this world rarely knows what it's doing, but you have the deep background, the in-the-trenches engineering chops, & the wide swath of experience that makes you a true & genuine veteran of this field, & your work & developments have been ongoingly leading for the industry, each time. Thank you again Lisa.

[1] https://en.wikipedia.org/wiki/Lisa_Su#2000%E2%80%932007:_IBM...


Cell was a disappointment in the PS3 though. It was such an underperformer that the SPEs couldn’t be used for their intended role as a GPU and got shanghaied into being used like SMT cores, while a whole separate GPU chip had to be added into the design at the last minute to compensate. The Cell’s weak CPU performance, strange programming model, and weak developer tooling were a problem for pretty much the whole generation.

The much more traditional XB360 tri-core plus GPU design was widely considered the better design and Sony reverted to a similarly traditional design the next generation as well.

It was a highly heterogeneous design, and yes the world has adapted to those much better now than they did in 2007 (and maybe it would have been more successful with better tooling), but there are also prior heterogeneous designs like Saturn that were also widely considered a mess apart from rare cases where developers were skilled enough to effectively utilize them. The tasks where it succeeded (like the oft-referenced air force supercomputer) tended to be embarrassingly parallel workloads (matrix math, password cracking, etc) that didn't have a high degree of dependencies in their processing workflow. That is pretty much the lowest-hanging fruit for a heterogeneous throughput-oriented system, if it couldn't do even that it would be fully useless, but merely doing that doesn't make it good.

I guess maybe think of it like a GPGPU - but a GPGPU would have been a terrible choice to use as your control CPU for a console even if it did allow higher degrees of processing in some narrow use-cases. Just not how that type of programming is usually done. Maybe at best you can say it was misunderstood and misapplied by Sony but it really just wasn't as great a chip for most things as people thought at the time. Wasn't a great GPU, wasn't a great CPU, wasn't good enough to be the first "APU" to bridge to having both on one chip as originally intended.

It clearly was designed for HPC/supercomputing situations where someone understood the architecture. Probably still was held back by the tooling and documentation anyway. I know it was in some supercomputers, but there probably is a reason that 10 years later nobody is designing follow-on processors or inspired by that particular architecture.

People at the time jerked off heavily about it, both in HPC and the PS3, and I get that it powered a lot of people's fond childhood memories, but I think in hindsight it pretty much was a failed design and an architectural dead end. You can say that maybe APUs took some inspiration from it but the cost-savings of an integrated CPU+GPU are pretty much an obvious goal, and later processors used a much more standard CPU and GPU that happen to live on the same chip and have some coherent memory access protocols. They share nothing in common with the Cell's SPE/ring concept. Nobody ever used that in a design ever again, for a reason.


Sony wanted high-performance integrated graphics a full generation before such things were really feasible. I think that's ultimately why the Cell was such a failure - it was trying to do way too many things, too ineffectively.

Also, building a GPU out of multiple tiny CPU cores on a ring bus just seems incredibly wrong. Even back then, the way you built a GPU was to have one instruction stream controlling many ALUs and register files. Intel made the same mistake with Larrabee too - I guess everyone just assumed all the fixed-function bits would disappear eventually? (Meanwhile, ray tracing means even more fixed-function units...)
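For contrast, here's a toy model (names and ops invented purely for illustration, not real GPU code) of that one-stream-many-lanes execution style, versus Cell/Larrabee-style independent cores on a bus:

```python
# Toy SIMT-style model: ONE decoded instruction stream drives every
# lane in lockstep; lanes differ only in their register contents.
def simt_exec(program, lanes):
    """program: list of (op, dst, src_a, src_b); lanes: per-lane register files."""
    for op, dst, a, b in program:
        for regs in lanes:          # same instruction, applied to every lane
            if op == "add":
                regs[dst] = regs[a] + regs[b]
            elif op == "mul":
                regs[dst] = regs[a] * regs[b]
    return lanes

# Four "ALUs" executing one shared stream:
lanes = [{"x": float(i), "y": 2.0} for i in range(4)]
simt_exec([("mul", "z", "x", "y")], lanes)
print([r["z"] for r in lanes])  # [0.0, 2.0, 4.0, 6.0]
```

The point of the sketch: there is exactly one fetch/decode loop, so control hardware is shared across lanes - the opposite of giving each SPE its own program counter and local store.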


Cell's problem was the end of Dennard scaling hit them as hard as it hit the Pentium 4. It was designed in an era that was expecting to be able to get 5Ghz+ easily out of (albeit premium) consumer electronics chips.

https://en.wikipedia.org/wiki/Dennard_scaling#Breakdown_of_D...

By the time they figured out that they weren't going to be able to cash the metaphorical checks the process side had been writing up until that point, it was too late to go back to square one. The design unfortunately had to be pigeonholed out of its designed niche, and they had to strap that anemic Nvidia GPU to the side of it just to have a shippable product. Turned out really well given all those factors working against it.


There's a good On the Metal podcast with Jonathan Blow as the guest, where he talks about his experience with the Cell processor. He also states that in his experience, the idea didn't really pan out.

https://open.spotify.com/episode/66a1dd3c2CbXLgTtMmaXUt?si=5...


I had no idea she had such a career..


Note this is part of a series: https://www.copetti.org/writings/consoles/


Nice! I loved the Mega Drive one [0]. This series is so cool! It's full of details on sprites and architecture. I admire the work that was put into this.

[0] https://www.copetti.org/writings/consoles/mega-drive-genesis...


Ok, the internet is finished. No more need to put more work into it. This is the best thing ever.


For homebrew exploits, there was also a new one discovered and implemented last year, called FreeDVDBoot. The newest model of the PS2 Slim fixed the memory card exploit, and this new exploit doesn't require a modchip or any other hardware modification.

https://github.com/CTurt/FreeDVDBoot


There also exists the "Fortuna" exploit (recently reimplemented and documented as the open source "OpenTuna"), which uses a buffer overflow in the memory card icon reading code, providing a disc-less exploit for the later models of PS2 (including the "PS2 TV", aka the Sony Bravia KDL-22PX300 television set with a built-in PS2 console).

With FreeMCBoot and Fortuna, all PS2 consoles can be exploited from boot time. FreeDVDBoot provides a convenient entrypoint to install the aforementioned exploits.


This reminds me of my homebrew days making custom Guitar Hero 2 tracks. There was a great modding community (scorehero.com) around. This was before DLC was a thing, so if I wanted more content, I had to make it myself.


People still hack GH2 a little bit. Clone Hero is also a pretty big thing for ease of use.


I remember reading some story way back with some ... Sony exec? Someone saying that there were people writing engine code in Excel to be able to track the co processors and get the timing right.

Does anyone know of the truthiness of such a story? I'm learning FPGA design and even there, when timing is pretty controlled, people end up recommending design patterns where you gate stuff similar to how you would do it in general software design (expecting one branch to go faster than the other, etc)


I bet they were writing Vector Unit assembly in Excel. The VLIW design meant that you literally packed 2 assembly instructions together into a 64-bit word. Each side had its own subset of execution units it could utilize and would have its own separate latency for each operation. So, to keep track of everything going on you pretty much had to write out two columns of code containing your own no-op notation to represent the delay cycles between instructions. That would be much more convenient to do in Excel than in a linear text editor.
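A rough sketch of what such a spreadsheet effectively computes (instruction names and latencies here are invented for illustration, not actual VU timings): given each instruction's result latency, find the cycle each one can issue and where the stalls land.

```python
# Hypothetical single-pipe stall model: an instruction stalls until every
# source register's producing instruction has finished its latency.
LATENCY = {"MUL": 4, "ADD": 4, "DIV": 7}  # illustrative numbers only

def schedule(instrs):
    """instrs: list of (op, dest_reg, src_regs). Returns issue cycle per instr."""
    ready = {}            # register -> cycle its value becomes available
    cycle = 0
    issue_cycles = []
    for op, dest, srcs in instrs:
        # stall until all source registers are available
        cycle = max([cycle] + [ready.get(r, 0) for r in srcs])
        issue_cycles.append(cycle)
        ready[dest] = cycle + LATENCY[op]
        cycle += 1        # single-cycle throughput once issued
    return issue_cycles

prog = [
    ("MUL", "vf1", ["vf2", "vf3"]),
    ("ADD", "vf4", ["vf1", "vf5"]),   # depends on vf1 -> stalls
    ("MUL", "vf6", ["vf7", "vf8"]),   # independent -> issues right after
]
print(schedule(prog))  # [0, 4, 5]
```

A spreadsheet does the same bookkeeping visually across two columns, one per pipe, which is exactly the part that's miserable in a linear text editor.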


The N64 RSP is similar. I've been playing with an N64 RSP assembler that'll understand how instructions can dual issue, and give warnings when two instructions on the same line can't dual issue, so I can for sure see myself using Excel back in the day to help manage the complexity.


Is that the bass assembler, or something else? I've been playing around with the default assembler setup in libdragon, but I'd be curious about trying something more intuitive.


It's an assembler I'm writing as a Rust proc_macro. Lets you do fun stuff like share constants between Rust running on the main CPU and the RSP, and handles some runtime code generation pretty nicely


Not to pry too much, but I'd love to learn more about it. Is there a chance you have it on Git(ea/Lab/Hub/etc.) somewhere?


Hopefully the Vector Unit assembly didn't require more than 16,384 lines of code if they were using Excel 95.


The bigger VU only had 16k of memory so that's not an issue, lol.


The story I heard was of a very, very heavily macro-based excel sheet that would automatically highlight possible problems including pipeline stalls in one of the VLIW units.

Been looking for confirmation or just a screenshot of that monster since :D


I found a comment from me speculating on how such a spreadsheet might be set up: https://news.ycombinator.com/item?id=12294472

I can't find any evidence of actual spreadsheets. They probably would have been set up by each programmer to their liking and maybe shared between programmers on the private Sony SDK message board.

Or maybe I have an overactive imagination.

But what I have heard is that the leaked PlayStation 2 SDK from 2005 does contain a Windows program called "VUEditor.exe". It's a custom IDE for writing VU assembly and it kind of acts more like a spreadsheet than a text editor, allowing the programmer to shift highlighted columns of instructions up and down.

It does show stalls and highlight issues. I couldn't find any screenshots, but one could find the sdk and try it out if they wanted.


Interestingly enough, it could be that said VUEditor embeds Excel. These days it seems a forgotten trick, but all of MS Office on Windows provides components that allow you to embed its features inside your own programs.


It might have made things simpler. From memory there was also a VU compiler that would take a single stream of instructions and then try to interleave the 'left/right' instructions of the VLIW in the most efficient order. It never was quite as good as hand coding - which is kind of satisfying once you've memorized the stats of the entire ISA.

One half of the instruction word was mostly devoted to vector instructions, which would each have a 4-cycle latency and a single-cycle throughput, to, say, multiply one 4-word vector of floats with another. The other half had the kitchen sink, like transcendental functions (sin/cos/tan/etc), divide (x/y, 1/x), etc. However, an instruction would stall if it depended on a register whose calculation was still in flight.

So if an instruction has a latency of 4 cycles, you could interleave 4 similar calculations on more of the dataset in parallel. That way, in 4 cycles you could start 16 float calculations, and by the 8th cycle they would be available. Though likely you'd be looping with something like Duff's device to maximize throughput. It would also mean you might use the vector calculations with a Taylor series, in preference to the sine instruction, which had a latency of something like 15 cycles.
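As a back-of-the-envelope model of that interleaving (illustrative numbers, not exact VU timings): with a 4-cycle latency and 1-cycle throughput, a chain of dependent ops eats the full latency every time, while independent interleaved ops pay it only once at the end.

```python
# Toy cycle-count model: each op can issue every cycle, but its result
# is only ready LATENCY cycles after issue.
LATENCY = 4

def cycles_dependent(n):
    # each op must wait for the previous result: n * LATENCY cycles total
    return n * LATENCY

def cycles_interleaved(n):
    # n independent ops issue back-to-back at cycles 0..n-1;
    # the last result lands at (n - 1) + LATENCY
    return (n - 1) + LATENCY

# 16 vector ops: 64 cycles if serialized, 19 if latency is hidden
print(cycles_dependent(16), cycles_interleaved(16))
```

Same arithmetic as the parent comment: keep >= 4 independent chains in flight and the multiply pipe never sits idle.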


Excel as in microsoft excel!?


I can totally see it working well for writing in columns, and being able to do rudimentary automated analysis!


I find it fun that the essence of the PS2's Vector Unit 1 is reborn in DX12/Vulkan's "Mesh Shaders" https://www.geeks3d.com/20200519/introduction-to-mesh-shader...

TLDR: Where the GPU's traditional geometry API mostly worked 1 independent vertex at a time (with a few unsuccessful extensions), Mesh Shaders are a more free-form, compute-shader-like approach where you can load as much arbitrarily-structured data as you can fit in a fixed SRAM then compute however you like and output triangles directly to the rasterizer. Pretty much exactly like the VU1, but with Shader Model 6 syntax.


Don't forget that there are (homebrew and/or abandonware) SDKs out there for the PSP, PSX and other consoles.


The GS (graphics chip) in the PS2 had a great hardware bug whereby if you turned off the z-test, it would never come back on again. You had to change the z mode to always pass to get around it.


>In later revisions of this console, the IOP was replaced with a PowerPC 401 ‘Deckard’ and 4 MB of SDRAM (2 MB more than before), backwards compatibility persisted but through software instead.

Devwiki says it's a PPC 405, and this interview of Tom Reeves, IBM’s VP of semiconductor and technology services, says it's a PPC 440, which is more realistic for a 60 fps emulation of the first PlayStation.

Last question:

https://web.archive.org/web/20060806090854/http://www.reed-e...


Despite its advanced and complicated design, it was outperformed pretty easily by more general-purpose CPUs that were not necessarily much newer.


Including the Xbox and GameCube. Well, the Xbox was built out of significantly more "general purpose" parts than the GameCube, but both still handily outperformed the PS2.


I'm really surprised how the Gamecube was able to run circles around the PS2, despite being cheaper and using just a standard CPU/GPU combo.


I think with the Emotion Engine (PS2) and Cell (PS3) Sony got a bit obsessed with things like vector coprocessors that are theoretically very powerful, but basically require a programmer to meditate under a waterfall for 5 years to actually master. I don't think they had a reputation for providing the best support libraries etc. either.

Of course Xbox and Gamecube also released at least a year later than the PS2, and graphics tech was moving fast at the time, so that was an advantage for them as well.


I think the GameCube would have been much more of a success had Nintendo not opted to go with 1.8GB mini DVDs in an era where huge open world blockbuster games were filling up 9GB dual-layer DVDs on other consoles.

Extra silly when you consider that the N64 had already been hamstrung by the decision to use <= 64MB cartridges over CDs.


Historically, though, it's in line with Nintendo's policies around guiding content.

After the CD-I debacle, Nintendo was not at all on-board with the idea of disc-based storage. One of those reasons was loading times.

Another was likely that they had concerns about games that were more 'FMV' than game; This really -was- a legit problem in the 90s. By keeping things constrained, it limits the temptation for a publisher to push out a crappy title filled with cutscenes.


It wasn't though. The cube had a few extras built in, particularly in texturing (compression and a texture register combiner), but in terms of flexibility and straight line speed the ps2 is definitely ahead. Burnout 3 and revenge, for example, were not possible on gamecube at the same level of performance. Gamecube has no vertex shader equivalent and has less than 1/2 the SIMD throughput per clock.


I ran across an interesting factoid a while back where a line was drawn between the Amiga CD32 and PS2 - that somehow Sony picked up where Commodore left off - but I’ve been unable to find a reference since and I’m certain at this stage I’ve got the names and dates mixed up ... anyone heard anything like this?


I suppose you might argue that the Amiga paved the way for the architecture of the PS2 - the Copper chip ran a rudimentary display list, the blitter hardware accelerated graphics operations. But I wouldn't say there is a direct relationship... wait, you're not talking about the CDTV, are you?


Maybe it was blitter etc

Nope not CDTV, CD32 was a later project and really quite good but C= couldn’t sustain it. The story I think I heard was that the similarity in the controllers was some sort of homage. Maybe it was PS1 rather than PS2 ... thinking about the timeline that all lines up, and Sony would have been looking for a jumping in point rather than starting from scratch after being burnt by Nintendo.


>CD32 was a later project and really quite good

CD32 was a repackaged Amiga 1200. It was the worst fifth generation game system. It featured such gems as Wing Commander running at 1 frame per second.


It was a fine machine, and though it was weaker than its competitors it was ahead of the curve. You say “repackaged Amiga 1200” like that’s a bad thing. The Amiga was the premium home personal computer of the early 90s and only ever died out because of mismanagement at Commodore.


[flagged]


Nobody asked.



