This same guy - Randy Linden - got DOOM to run on SNES. I don't know where I read the original article - but the amount of skill it took for him to accomplish this was just incredible:
"DOOM on SNES happened thanks to the genius and determination of a single man: Randy Linden. The man had an admiration for the game and decided to port it to a mass-market machine so more players could enjoy it. Randy never had access to the source code or the assets from either the PC or the console version. He started from nothing."
"After developing a full prototype, he later showcased it to his employer, Sculptured Software, which helped him finish the development. In the interview, Linden expressed a wish that he could have added the missing levels; however, the game, already the largest possible size for a Super FX 2 game at 16 megabits, only has roughly 16 bytes of free space. Linden also added support for the Super Scope light gun device, the Super NES mouse, and the XBAND modem for multiplayer."
I was lucky enough to work with Randy at Microsoft, an incredible developer all around.
Strangely enough, for the majority of the time we were working with him, none of us knew he was a programming legend. I think eventually someone was Googling team member names and realized who we had in our midst!
What was he working on at MS? I watched a video about his SNES port of DOOM, amazing stuff (and slightly depressing knowing I'll never be anywhere near that level).
Microsoft Band! He did a variety of stuff on the team, including some parts of the underlying runtime and also our build system. He added 3rd party compiler support to Visual Studio long before they officially supported it. :)
> (and slightly depressing knowing I'll never be anywhere near that level).
Randy is very persistent. That Quake demo was 2 years in the making, and that was after he'd already spent over a decade writing low level code.
Highly optimized code is a matter of time, time, and more time. It is a thousand little improvements that get you towards your goal.
If you want to learn to write that type of low-level code, and it is very rewarding, pick up an embedded board like an ESP32 and go have at it.
What kind of projects might those boards be useful for? I actually owned a Tiva LaunchPad and went through a whole book writing embedded code (not really very practical though, just textbook exercises) but didn't see any constraints there. I was about to write some code to drive an LCD screen for my "prototype Game Boy", but it didn't go well because I failed to write the driver. Maybe I should pick it up again.
He also wrote the commercial PlayStation emulator "bleem!". That is, he single-handedly wrote an emulator of a then-current console that ran well enough that people actually paid for it.
An absolutely incredible achievement. Though to clarify - this is not Quake running on Game Boy, but rather a custom engine hand-written in assembly that can run Quake levels (or, I think, just the first level anyways). Which, in my mind, is even more impressive.
Even Doom itself had the engine rewritten to work on some of the consoles it was actually released on. Most people consider "it runs Doom" to be "it runs the PC version of Doom, modified" but there were other engines.
Randy talked about the DOOM SNES port (among other things) during FOSDEM 2021. I can't remember if he mentioned Quake, but it was a very interesting interview.
To be fair, ports of games for that class of console would often be complete rewrites. Sometimes even the assets would be redrawn just from looking at another version of the game running. Remember, despite coming out later, the Game Boy Advance would be in the space of the SNES/Genesis if it were a home console.
This was often by necessity. I worked on some of these ports, and we would just get the game in whatever form it came (cart, disk, arcade machine) and then literally reverse engineer it as best we could - which usually meant just playing it a lot, videotaping, and making notes. Then we would re-implement / draw / compose a work-alike version on the target platform.
As a kid, I always thought the developers of these ports were simply not very good.
But once I started writing my own code and working in the industry, I realized that you guys were insanely talented people working under ridiculously punishing deadlines and (as you said) zero access to the original source code, assets, etc.
Not familiar with your work in particular but in general, what a heroic and underappreciated game dev niche. Much respect for all that the unheralded port programmers managed to accomplish against the odds.
The Genesis is arguably a 32-bit console as well. The Genesis has only a 16-bit bus to memory, but the GBA has only a 16-bit bus to its cart and most RAM as well, leaving only on-die RAM and MMIO with 32-bit bus accesses.
Additionally, its GPU is very much derived from the NES->SNES family of PPUs. VRAM can be used as a linear framebuffer, but that's generally not the best use of the system.
I'd argue it's about halfway between a stock SNES/Genesis and an SNES/Genesis with a cart add-on chip like the SVP or Super FX.
The Genesis has a 24-bit address bus, but the 64-bit CPU I'm typing this on only has a 39-bit physical address bus (48-bit virtual), and we still call it a 64-bit CPU.
The re-releases of Doom do use the original Doom, alongside Unity. Here's where we get into the Ship of Theseus: the game logic is still original Doom's, but Unity handles the video output, sound, and input, allowing for easier ports given the differences between console and phone graphics APIs and input options.
Compilers are pretty good most of the time. If it was a platformer game, it'd have been fine in vanilla C.
But this engine was desperate to squeeze absolute maximum performance (including hacky hacks to get extra registers), and you can't get that from C.
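For a flavor of what "hacks to get extra registers" can look like from C (not what this engine does - it's all assembly - but the same family of trick), GCC has a global register variable extension. A hypothetical sketch:

    /* GCC extension: pin a global to a physical register for the whole
     * compilation unit. Build every file with -ffixed-r9 so code that
     * doesn't see this declaration never allocates r9 for anything else. */
    register unsigned int mix_cursor asm("r9"); /* hypothetical mixer position */

    short mix_buffer[304];

    void mix_sample(short s) {
        /* The cursor lives permanently in r9: no loads or stores to update it. */
        mix_buffer[mix_cursor] = s;
        if (++mix_cursor == 304)
            mix_cursor = 0;
    }

You pay for it across the whole program, though, since the compiler loses that register everywhere else - exactly the kind of trade-off that's easier to reason about in hand-written assembly.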
The GBA has some quirks, like on-die SRAM that isn't a cache but a small, explicitly addressable chunk of fast memory. If you want functions "cached", you have to do it yourself. It's doable in C with some hacks, but it's an area where code size matters. Getting C code to be both small and maximally fast is a https://en.wikipedia.org/wiki/Full-employment_theorem
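For the curious, the usual C-side hack looks something like this - a sketch assuming a devkitARM-style linker script that maps a .iwram section to the fast on-die RAM (the stock GBA toolchains ship macros along these lines):

    /* Place a hot function in the GBA's 32 KB of on-die IWRAM. ROM sits
     * behind a slow 16-bit bus (so Thumb code usually wins there), while
     * IWRAM has a zero-wait 32-bit bus, so hot loops get compiled as ARM
     * code and copied into IWRAM by the startup code. */
    #define IWRAM_CODE __attribute__((section(".iwram"), long_call, target("arm")))

    IWRAM_CODE void blit_span(volatile unsigned short *dst,
                              const unsigned short *src, int n)
    {
        while (n--)
            *dst++ = *src++;   /* runs from fast RAM, not gamepak ROM */
    }

The catch is that those 32 KB have to hold your stack, hot data, and hot code all at once, which is why code size matters so much.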
I've been looking into the SGDK for the Genesis / MegaDrive and from what I've seen performance seems to be pretty competitive.
There's been a resurgence of homebrew commercial releases for the console, and I believe the vast majority of them use SGDK. They seem to perform quite well.
I'm not sure if this is a matter of GCC itself being smart enough in general to generate performant 68000 code. There may be a lot of custom work within SGDK itself to ensure that the generated code is performance competitive.
Apparently this guy came up with some pretty goofy optimizations, like switching in and out of Thumb mode to make use of the extra registers, along with more "normal" stuff like self-modifying code. I don't think there are any compilers that do that (and I think most people wouldn't want them to).
You can totally write GBA games in just C though. In fact I learned C writing GBA homebrew in my teens.
If you want to be the best, you will have to write some of your code in ASM. Not all of it, but the fastest paths will need it. And it really depends on how old your hardware is. For example, on the SNES you are not getting away with a compiler for more than the menus and other less taxing parts of the game if you want to be the best. Of course you can always use a compiler for every part of your game if it's something like Pong or Tetris.
Yes, the idea that wall-to-wall assembly is somehow superior is definitely flawed. High-level language compilers have many optimization passes and can transform code in ways that the user might not have imagined. Writing high-level code that causes the toolchain to produce the machine code the author wants is its own skill, though, and a habitual assembly programmer might not have that particular skill.
Yes, but Randy also had to develop an uber-expert-level understanding of the hardware to reach such performance... It would be hard to do that without writing copious assembly specific to the hardware somewhere along the way, regardless of compiler quality, so his "100% assembly" strategy served both goals.
I agree with this. A smart coder is going to write it all in a high-level language and then profile it to see which tiny bits might be better off in assembler and then profile it again to test their theory. Modern compilers know a lot of tricks and can often find optimizations that would be hard to do manually in machine code.
I’ve spent a bunch of time learning the GBA hardware and mapping out how a software renderer could perhaps work. There are so many cool tricks possible on the GBA, with its 32-bit ARM CPU and various RAM areas running at different speeds. But going through and actually implementing a Quake renderer is a huge amount of work. Wow.
But that's just a polygon model rendered on what looks like a few 2D layers in isometric view. A fully 3D environment is a whole different league. I remember Doom for the GBA, which isn't even polygons but a 2.5D column renderer, and it ran notably slower.
As far as I've ever been able to tell, there are no polygons.
- Player and enemy ships + missiles etc are flat 2D scaled sprites
- Some backgrounds are pure FMV (I think some warping is applied as the player ship moves, to give a more 3D feel)
- Some backgrounds are "SNES Mode 7" style ceilings and floors
Probably the definitive example of "faking 3D without using any polygons" IMO. Even more impressive than Sega's super-scaler games like Space Harrier, Galaxy Force, etc.
I'm seeing a pre-rendered background loop with a little skewing applied as the player moves from one side to the other, with a lot of scaled sprites on top of it. Nicely done.
The big tell on the background loop for me is that the repeating background stuff never changes during a level.
Not sure about Iridion 2, but some levels in Iridion 3D are not purely pre-rendered background loops - this level is a Mode 7-ish flat "floor" (the ocean) with a perspective-warping sky BG.
It's not super advanced technically, maybe, but it's very well done IMO
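Roughly, a Mode 7-style floor on the GBA is an affine background whose scale and origin get rewritten every scanline from the HBlank interrupt. A back-of-the-envelope sketch (untested; register addresses per GBATEK, the constants and names are made up):

    #include <stdint.h>

    #define REG_VCOUNT (*(volatile uint16_t *)0x04000006) /* current scanline */
    #define REG_BG2PA  (*(volatile uint16_t *)0x04000020) /* 8.8 fixed x-scale */
    #define REG_BG2X   (*(volatile uint32_t *)0x04000028) /* 20.8 fixed origin x */
    #define REG_BG2Y   (*(volatile uint32_t *)0x0400002C) /* 20.8 fixed origin y */

    #define HORIZON 40                /* scanline where the floor begins */
    #define FOCAL   160               /* assumed focal length, in pixels */
    int32_t cam_x, cam_z;             /* camera position, 20.8 fixed point */
    int32_t cam_h = 64 << 8;          /* camera height above the floor, 8.8 */

    /* Called from the HBlank interrupt: lines further down the screen are
     * closer to the camera, so the texels-per-pixel step shrinks as
     * lambda = cam_h / (line - horizon). */
    void hblank_floor(void)
    {
        uint32_t line = REG_VCOUNT;
        if (line < HORIZON || line >= 160)
            return;
        int32_t lambda = cam_h / (int32_t)(line - HORIZON + 1);
        REG_BG2PA = (uint16_t)lambda;       /* horizontal scale for this line */
        REG_BG2X  = cam_x - lambda * 120;   /* center on the 240-px screen */
        REG_BG2Y  = cam_z + lambda * FOCAL; /* depth into the floor texture */
    }

The fun part is that the divide is too slow to do 120 times a frame on a CPU with no hardware divider, so real implementations precompute a lookup table of 1/(line - horizon).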
Is there any place where I can find technical optimization details on porting graphically intensive games (such as DOOM/Quake) to platforms significantly less powerful than their intended ones?
Of course, for many of the "it runs Doom" the reality is that whatever the device is (fridge, toaster, printer) it has a much MORE powerful computer than a 1990s 386.
Saw it yesterday, and as someone already mentioned in the comments, it's crazy to think the Game Boy Advance only lasted three years before it got replaced with the Nintendo DS. This demo blows my mind, and even after watching the video on it I still can't fathom it.
The first few generations of the DS were fully backwards compatible with the GBA though, so the transition wasn't that bad. The Game Boy line in general had an amazing run of backwards compatibility through its lifetime. The GBA could play original Game Boy games going back all the way to 1989.
The one annoyance of the GBA at release and through its original lifetime was the screen. It wasn't backlit and was very difficult to see without perfect, bright lighting in your room. They eventually fixed this with the backlit GBA SP some years later, but gaming on the original GBA was rough. It's easy to forget just how bad things were without backlit screens; we've been spoiled by decades of amazing smartphone screens in the years since.
I know, every Game Boy Advance variant (except for the Micro) was able to play GB/GBC games. Also, on the backlight: only the SP AGS-101 and the GB Micro were backlit.
Quite early on in the GBA's lifecycle, there were third-party backlighting solutions. I fitted a few back in the day for myself and my friends. Even now you can get decent IPS LCD screen replacements from places like AliExpress.
The GBA was still sold for four+ years after the DS came out, and games were still popular.
The DS being able to play GBA games was a huge selling point, as people forget how "radical" the dual screens were and how uncertain it was that people would adapt, so having the GBA fallback was nice.
Depends on the GBA. The original model, sure. With the backlit SP it's less clear. The DS is more comfortable but less pocketable, can't play GBC or GB games and doesn't have a link cable port so no multiplayer and no trading Pokemon.
IMO a backlit SP is the best stock GBA you can get.
I remember arguing with my friends about this as a kid. A lot of the target audience was naïve enough to believe it, even when it was clear first-party games had moved to DS.
I disagree! I think Nintendo wanted both consoles to coexist in order to provide an additional revenue stream. When you want to play traditional games, you grab your Gameboy Micro or the still-highly-pocketable GBA SP. When you want a stylus and dual screens to play Nintendogs, you grab your bulkier Nintendo DS.
It didn’t really work and Nintendo pivoted—possibly also to compete with the PSP—but I see the Gameboy Micro as evidence of their strategy. Compare an original model DS and the Gameboy Micro side-by-side—the size difference isn’t entirely unlike that of an iPhone versus an iPad Mini.
It’s not the first time Nintendo tried to create a third category between console and handheld—see also the Virtual Boy.
This absolutely rips. So impressed by the 3D achieved here. The other examples of 3D GBA games I’ve seen play some pretty obvious tricks to stay performant; I can't spot any of those tricks here.
"DOOM on SNES happened thanks to the genius and determination of a single man: Randy Linden. The man had an admiration for the game and decided to port it to a mass-market machine so more players could enjoy it. Randy never had access to the source code or the assets from either the PC or the console version. He started from nothing."