The games industry is so stinking bad at archival that it hurts. Very little consideration is paid to using architectures and build tools that can be replicated later. Source code becomes unavailable (as was the case here). Game logic is written using highly platform-specific code that's difficult to emulate. Source textures and other raw resources get thrown out.
And not to diminish the accomplishment of the team here, because it must have been a huge amount of work, but even with that amount of work this recreation ended up having problems with collision detection that changed the entire feel of the game. I don't think that's because the devs are bad, but because the task they were trying to undertake was so unbelievably impossible -- of course they missed stuff.
If you're in the games industry, please put some effort into preserving your work for history. It does actually matter.
Let me hand you some binoculars, it might be hard to see the details from the vantage point of giants' shoulders:
* A lot of the things "missed" are what further research proved to be "obvious" conclusions of exactly the groundwork these games laid.
* Between 80h/week, a personal life, and the permanent question of whether your game will sell at all, considerations of what to do in case your game becomes cultural heritage are moot.
I have been part of this circus 15 years ago and while I definitely am proud of what I did, I do not think any lessons I learned are directly relevant today. Not only that, but I cannot imagine revisiting my old code for any amount of money. Yes, my build system increased build speed by a factor of 200 over industry standards, but no, I am not going through those 2.5k lines of GNU Make ever again. Or even documenting the tricks we used to deliver an 80k jar, including assets.
A lot of the stuff was done to deal with hardware limitations that are irrelevant today. Why go through a lot of pain when the resources today are sufficient to play 100 instances of an unoptimized rewrite in an emulator?
So I'll say up front, you're right to say I don't have 15 years of experience to back this up. And if your takeaway is, "this idiot doesn't know what they're talking about", fine, I'll take that. That's a completely reasonable reaction.
But, you're wrong.
The top rated post on Hacker News right now is a bunch of people gushing about how cool it is to see the source code for the Apple II LOGO language[0]. Your code has value beyond whether or not someone is going to pick up a new game-dev trick for their next title. It's a window into how games used to be built. It is a tiny composite piece of gaming history, and history is not just about how Mario was built; it's also about how small games were built, and what trends across the entire industry looked like.
Your code has value beyond just publicly releasing the source. Maybe you would never revisit it in a million years for a million dollars, but the overall gaming industry is interested in remakes and remasters right now, and if you've ever worked for a publisher or sold the IP rights to one of your titles, it is not out of the question that the publisher might want to do a remake without you. That's the entire premise of this article, and if you have the source code, even if it's not portable, even if it's using hardware-specific hacks, even if it's written in an obscure language, your publisher will be in a better position than if you didn't.
To be kind of blunt, this attitude is exactly what I was getting at above. Game devs don't understand why they should even care about history. If your takeaway is, "I don't care about old games unless they make my new PS4 game run faster" you have missed the point. If your takeaway is, "archival only matters if my game is successful, and I don't know if it'll be successful yet", you have missed the point.
I completely, 100% understand that game dev is stressful. Zoom out a bit from that; working in the games industry in general is kind of hellish right now. If you're putting 80 hours a week into building a game for (on average) less pay than you'd get as a general software dev, and you're not doing it because you genuinely care about what you're making... that's a stupid career choice. And if you do genuinely care about what you're making, then try to make sure it lives longer than you do.
Bear in mind, what I'm asking for doesn't have to take much work. You don't have to go crazy building out your source-code in a special way if you don't have time. Change nothing at all about your dev process -- just take 3 minutes when your game ships, stick your raw source code, assets, and a DRM free binary on a flash drive, and then mail it to Jason Scott.
That's it. Just by doing that, you're now ahead of most other companies in the games industry right now.
> So I'll say up front, you're right to say I don't have 15 years of experience to back this up. And if your takeaway is, "this idiot doesn't know what they're talking about", fine, I'll take that. That's a completely reasonable reaction.
My takeaway is that you did not even read what I wrote:
a) I said I quit the circus 15 years ago. I did not claim 15 years of game dev experience.
> If your takeaway is, "I don't care about old games unless they make my new PS4 game run faster" you have missed the point. If your takeaway is, "archival only matters if my game is successful, and I don't know if it'll be successful yet", you have missed the point.
b) Neither of those are points I tried to make, or that someone could reasonably read into my statements. So I am not going to address them directly, nor go into your ideas of why I might have worked as a game developer.
You are missing the point: Just because you would like to have an introduction to resource-conscious development based on the games you enjoyed as a kid, this is neither the easiest nor in any way an appropriate way to study the topic. Go read and understand some programs written in the late 60s and early 70s and you'll be 80% there - for 10% of the reading time. The other 20% are obsolete anyway.
c) I completely, 100% understand that game dev is stressful. Zoom out a bit from that
I did that, 15 years ago in a radical way - it is called "leaving the industry".
> Bear in mind, what I'm asking for doesn't have to take much work. You don't have to go crazy building out your source-code in a special way if you don't have time. Change nothing at all about your dev process -- just take 3 minutes when your game ships, stick your raw source code, assets, and a DRM free binary on a flash drive, and then mail it to Jason Scott.
> My takeaway is that you did not even read what I wrote:
> a) I said I quit the circus 15 years ago. I did not claim 15 years of game dev experience.
Just a note on this misunderstanding: you wrote "I have been part of this circus 15 years ago", which was ambiguous because 'have been' is normally used for an action or state of affairs that is ongoing (or has just finished). So the reader either skipped over the 'ago' and read the sentence as 'I have been part of this circus [for] 15 years', or noticed that you'd made an error and had to guess whether it was 'ago' or 'have been'.
> You are missing the point: Just because you would like to have an introduction to resource-conscious development based on the games you enjoyed as a kid, this is neither the easiest nor in any way an appropriate way to study the topic. Go read and understand some programs written in the late 60s and early 70s and you'll be 80% there.
The point of preserving history is not to get good at making modern games. This is like claiming that we don't need any silent films because any modern film techniques class will already teach you how to do a long shot or a dutch angle anyway. Hence my original statement: If your takeaway is, "I don't care about old games unless they make my new PS4 game run faster" you have missed the point. We're not preserving history because we're worried we'll forget how to build games.
> d) That's called IP theft.
Of course IP owners and publishers are allowed to send code to the Internet Archive. It doesn't have to be (nor should it be) an individual dev stealing from the company; I'm not sure where you would get that idea. What I meant was that if Sony and Naughty Dog as companies don't have the resources to keep content around on their servers, there are 3rd party nonprofits that will happily do it for them for free.
This is not a new idea; the UK has gone so far as to mandate that every published book be submitted to a national vault for preservation[0]. The US tends to avoid legislative solutions like this, but I know digital archivists who would like to see laws like this passed for games.
I read all that and still think, "you don't know what you're talking about".
This is 100% a problem of dependency management, i.e. a problem common to software everywhere at all times that has never gone away. Whether more work survives is ultimately a function of whether it's easy to make it survive, not of the "blame them for lacking effort" angle. Some folks will take the extra time, some won't.
The software that still has its dependencies, we can do something about. For most games that equates to "emulate the release platform, and we get the playable release back" - which is the de facto standard, since it's relatively easy to get copies off a ROM or CD, and it's the single artifact most representative of creative intent. It absolutely makes sense that almost all the effort would go towards that. We like seeing Leonardo da Vinci's plans and studies, but he was focused on results, and we mostly know him for those results.
The source is more specialized, and not as easy to subject to the emulator model. A lot of shops "back in the day" did not even have good backup or version control practices, the game's build system was often a whole custom software project of its own, and only one person knew how to make it go. Getting it even to the point where all the code and assets were accounted for would incur a person-weeks expense, and studios then as now would finish a release with layoffs to shave off salary expense, which in turn gave people an incentive to jump ship before finishing to dodge the layoff, leaving certain knowledge about the project in limbo.

And employees are often not trusted enough by management to let them bring in an external drive and dump the project: the optimism of "preserve the art for the future" goes right up against the pessimism of "I'm building a business, not art", and the psychopathy of "if they can't dump the project, they can't steal it". (Yes, it happens - on both ends.) There are numerous barriers in those crucial weeks around the ship date that can cut off this aspect of archival right at the start, and many a preserved game is the result of some hero who went against the direct wishes of their bosses and made copies in secret.
Nowadays, project tooling tends to fall a little more along standardized lines, and if you start with the goal of minimizing dependencies, you can feasibly have a smooth path to restoring the build without incurring the same overheads that make managers give up on archival to save a dime. So while commercial conflicts of interest remain, it's gotten easier, relatively, if your project scope is not so large that you need to bring in the custom tooling again (which, of course, is what happens in AAA). It's not "sneeze and you've preserved history" easy, but we can work on that.
And what if we're talking about an online game? That's the preservation nightmare taking place today. All the MMOs, mobile games, and so on - those are living things. You can't reconstruct a userbase. So even running code and assets give you only part of the experience; it will be empty, fossilized. You will have to fall back on video footage to see it in contemporary context. That said, there is the occasional revival effort, such as with Habitat[1].
tl;dr: if you want better archival, support game labor, and support projects that improve basic software infrastructure.
Everything you're talking about is a legitimate problem. I agree with you completely. However, source access helps with the majority of it -- even with making better emulators.
But even disregarding source code entirely, the game industry is also not particularly great at preserving final releases. If you're a big, popular title, sure, that'll probably be available. If you're a smaller title, or you saw a limited regional release, or you're a demo for an unreleased game, it's not a given in our industry that you actually end up in a vault somewhere. And that's the first step -- you can't run a game in an emulator unless you have the game to run.
The easiest question is, "do Sony, Microsoft, and Nintendo have an official archive where they preserve the release binaries of every single game that is released on any of their platforms?" The second easiest question is, "how about source code?" And then after that things start to get complicated and all of the concerns you're talking about come into focus.
In the film industry, there are cross-studio efforts to make sure that raw footage isn't lost. That's the result of prominent directors and studios taking the industry to task for many years, often after they had been personally bitten by film degradation. Nothing like that exists in the games industry. On the contrary, if you read over the comments on this article, you'll find multiple instances of devs arguing that preserving history is less important than shipping the next title, a kind of weird throwback to the same philosophy that led to old film studios destroying film reels from unsuccessful movies so they could be reused on the next project.
Of course, that's not a problem with Crash. It is, luckily, famous, so lots of copies exist. But apparently very few of the raw assets do. And it seems to me that if Sony can't even solve that problem, when Sony seems to have a clear financial incentive to solve it, they're not really in much of a position to start tackling MMO community preservation yet.
> And it seems to me that if Sony can't even solve that problem, when Sony seems to have a clear financial incentive to solve it, they're not really in much of a position to start tackling MMO community preservation yet.
There's a problem with that argument, though. Sony et al have an incentive _now_ to want the content for older games. But it wasn't until much later that the idea of rereleasing games, much less archiving bits of them for future use, was a common thought.
I'm not arguing that the industry has gotten particularly better at that in the interim, just that there wasn't an incentive for it historically, and concluding that they can't do it when incentivized to do so doesn't follow.
IIRC from developer stories from this era, things had to be done "creatively" to save space, and ensure optimal runtime execution. Certain platforms may well have had undocumented bugs and platform-specific optimizations that you simply had to adapt to.
Archiving wasn't a consideration - launching a playable game that would pay the salaries and lead to future projects was.
We take for granted that websites built in '95 are still more or less viewable in their original format today, but that's because things were designed that way. Gaming platforms were more or less abandoned entirely and rewritten - I might be wrong here, but games from the PlayStation 1 are not playable on a PlayStation 4, are they?
Yes and no. I don't really have a justification for saying this because I was never working in that environment, but I don't think the industry spent enough time thinking about how their games would be played in the future and I suspect that a lot of the "well, the platforms required us to do this" is an excuse.
Yes, there are platform-specific quirks and bugs and stuff to work around, but say you're taking advantage of a console specific hack -- well, is that hack encapsulated? Is that hack mentioned anywhere in your internal documentation? Do you have internal documentation? I think you're mostly right that the thought was, "why would we care about building a game that runs on PS2, we'll just make a new game for that console."
You can do a lot with even really bad source code if the other stuff is good -- if you still have uncompressed assets, or your build process isn't completely reliant on some weird third-party proprietary dev kit. Bear in mind that with this project, they didn't even have access to the source code, and they had to manually decode 3D mesh data. So platform-specific logic was the least of their problems.
I suspect that part of that is that the games industry is just really young. And maybe some people were yelling about archival back then, but there weren't any real tangible consequences. Now that remasters are a big thing, you've got situations like this, where companies want to leverage old IPs and are suddenly realizing that doing that is twice as complicated as it should be.
It just sounds like you underestimate the difficulty of building games and, especially, overestimate the certainty that they will be successful.
It’s easy to preach how software should be written in the hindsight of a successful project that you had zero stake in. I doubt the developers of games that flop (so, most of them) wish they’d spent more time and money ensuring their unwanted flop of a game was easier to archive.
Architecture is the last step in this process. The low hanging fruit is "do you still have your source code and original assets? Is any of it documented?"
Nobody will ever know if the original code for Crash Bandicoot was written well or not, because in all likelihood nobody has it anymore. And I don't think you can chalk that up to, "well, developing games is hard." We can debate whether or not it's possible to release a game with readable source code, but if that was the only problem the games industry had with archival, I honestly wouldn't have much to complain about.
The sort of second problem I have with this is the idea that flops aren't worth archiving. I want flopped games to be archivable. If you're a studio and you're saying, "we don't have time to worry about this", fine, that's your choice. What I'm advocating is that preserving history is important, not just for Mario, but also for games like ET. I want people in the future to be able to play your failed games.
Of course caring about archival makes developing games harder, just like caring about accessibility does, and just like caring about framerate does, and just like caring about localization does. These are all different concerns that we try to balance, and widely the games industry has decided that archival is not a concern that it cares about balancing.
Well I know, because I wrote (approximately) half of it. :) It was written well in that it was a technically innovative and beautiful game that pushed the boundaries of the PS1 hardware. It was written well in that the shipped code didn't crash or have horrible show-stopping problems. (This was important because there were no streamed patches back in the days of physical media.)
But it was absolutely not written well from the standpoint of maintainability, abstraction, documentation, testability, etc. This is partly due to the insane time constraints we were under, partly due to the primitive nature of the development tools at the time, and partly due to the intrinsically low-abstraction methods required to achieve the necessary performance -- e.g., the entire renderer and collision detection system were written in MIPS assembly (by me).
> The sort of second problem I have with this is the idea that flops aren't worth archiving.
So many game companies go out of business after a flopped game (hell, even seemingly successful companies like Telltale Games are going out of business) that I doubt anyone has the energy to care about archival in this instance.
> Of course caring about archival makes developing games harder, just like caring about accessibility does, and just like caring about framerate does, and just like caring about localization does.
The difference is that these other things help them meet business goals while archival only becomes important years down the line, when the company may not even exist anymore. The games industry is already notorious for incredibly long hours and crunch time, how can you expect these people to give up even more time?
Besides, there are often no entities who can legally keep archives of the work done at a studio when it is closed down.
Publishers should institute a rule for their own sake that says that the deliverable must not only be the full game but also all source files in editable format, including proprietary tools, as well as an image of a build server capable of producing a running product when following a documented sequence of instructions. But this will likely cause legal trouble when third-party engines and tools are involved. I guess this also needs a cultural change in the whole industry.
>I suspect that part of that is that the games industry is just really young.
And ignoring archival happened before in another young industry. The majority of silent films are lost. From Wikipedia: "Martin Scorsese's Film Foundation estimates that more than 90% of American films made before 1929 are lost, and the Library of Congress estimates that 75% of all silent films are lost forever." I don't think it's surprising that the game industry acts the same way.
> I worked at Insomniac on Spyro while Naughty Dog was working on the Crash series. We had a friendly competition squeezing out the maximum possible performance from the PlayStation, and that turned into an even friendlier coopetition on our respective PS2 engines when we actually started sharing some code. Based on what I saw of both code bases, Naughty Dog did them a huge favor by sharing so little. It would have been more work to rehab than recreate.
Highly recommend reading the rest of the comment, too!
Fun fact, when Metal Gear Solid (1) was ported to PC, the work was done by a handful of people, and they didn't have access to the original resources so they found some of them on fan websites. Some of the audio files used in the release are 30 second loops of much longer pieces of music from the original game.
Huh, that's weird. PS1 assets were remarkably transparent. Generally the audio was just CD-A tracks or ADPCM files, and the rest of the data was a ISO-9660 filesystem that showed up when you stuck the disc in a computer.
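In fact the format is simple enough to poke at by hand. A minimal sketch (assuming a plain 2048-byte-per-sector image; a raw 2352-byte/sector rip would need the sector headers stripped first, and the function name here is my own):

```python
def read_iso_volume_label(path):
    """Read the volume label out of an ISO-9660 image.

    The primary volume descriptor lives at sector 16 (2048-byte sectors):
    byte 0 is the descriptor type, bytes 1-5 the "CD001" magic, and
    bytes 40-71 the volume identifier, padded with spaces.
    """
    with open(path, "rb") as f:
        f.seek(16 * 2048)
        pvd = f.read(2048)
    if pvd[1:6] != b"CD001":
        raise ValueError("not an ISO-9660 image (or a raw 2352-byte/sector rip)")
    return pvd[40:72].decode("ascii").rstrip()
```

Anything fancier, like walking the directory tree and pulling out individual files, is a job for a proper tool, but even this much tells you whether a dump is intact.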
They mentioned it in the commentary on an MGS1 speedrun video on Youtube. They use the PC version because it contains features (such as first person mode) which fundamentally break the game, as well as other bugs. I'll try and find it.
It is a problem that game development shares with other forms of mass-marketed art: the film industry, TV, music... they are all pretty bad at conservation, and were downright atrocious until very recently (I'm sure you've heard how the BBC lost the first few seasons of Dr Who -- and they are among the best in the field when it comes to conservation).
Simply put, the emphasis in creative industries is to continuously create (eh) new stuff, not revisiting the old. Couple that with inevitable financial pressures, add the cultural distortion of software development culture at large (where everything changes every year or so, and if you are not on the latest and greatest you are dead in the water) and you have a situation where conservation is the very last priority.
The systems we run on make it very hard. With emulation, the best way to run an old game is a final binary.
I wrote an iOS 3 game. I have a lovely git repository of code. It's fairly useless, as Apple won't give me either the OS or the compiler I need to run the code.
Not really. The GOOL code was used for character logic and things like the load/save screen. The majority of the code was written in C and MIPS assembly.
It was used for all of the gameplay code. The C and ASM were just the engine level primitives. On the systems I've seen, the gameplay code can outnumber engine code 5 to 1.
I'm not convinced it'd be impossible to use the GOOL code though. It should in theory have been plausible to port a GOOL interpreter, and then simply use the original code in the new interpreter. That said, there are some possible complications with the approach, depending on how tightly integrated the old GOOL system was with the rest of Naughty Dog's very custom engine.
Of course they'd need to actually have the GOOL code to start with. It sounds like all they got from the original games was the original meshes and some other non-code assets, which ... I mean, that's some incredible reverse engineering work. I'm much, much more forgiving of the tiny little inconsistencies here and there given that context; the team did a fantastic job if they were primarily going off reference footage.
IIRC the GOOL code was compiled by a compiler written in Common Lisp. If that compiler code survived, it's probable that it would still work today with minimal, perhaps zero, modifications. (No idea what non-standard CL features it might have used.) Of course the output wouldn't be useful for anything but a PS1, but you could change that, and never have to touch the GOOL itself, as you say.
I would think the lesson from this is, "don't build an in-house custom Lisp variant for all of your games unless you're willing to distribute the documentation, compiler, and dev tools with your source code."
If it's intrinsically doing any code modification (which as I understand it does in order to dynamically load and link code), even with all of that they might be up shit creek. Some of these platforms don't allow anyone, even the kernel, to make pages executable without providing signatures.
Well, if your platform can't support implementation of typical Lisp/dynamic idioms, then it's frankly a shit platform and doesn't deserve to be called "general-purpose computer".
> even with that amount of work this recreation ended up having problems with collision detection that changed the entire feel of the game.
You know, I did 100% minus (non-required) relics on all three games, and I didn't really feel like it was any different. Sure I didn't compare them side-by-side, and sure I did explicitly ignore the speedrunning part of the games where minor differences would be easier to notice.
I'm curious if they ever patched this. It's certainly subtle enough that it got past their testing, so I'll give you that "changed the entire feel of the game" is probably an over-exaggeration.
The only method I’ve seen work semi-reliably for archiving console game source is to grab the physical machine that made the final build, spray paint “Do Not Dispose! Build archive!” on its side and stick it in a back room. Even that has a 2% chance/year to end up in a dump.
Otherwise, console game build systems were far too finicky to repro later (at least up to the 360/PS3 generation). You would need a huge collection of confidential tools from the manufacturer, and those tools sometimes had OS/driver/hardware dependencies that can't be reliably reproduced in a VM. Think USB driver for the GameCube debugging connection. IIRC, the PS1 and N64 required add-in cards to connect a PC to them. Then there were the proprietary test disc/cartridge burners.
All true, yet for PS1 there was the net yaroze dev system that was distributed for some time. I actually had a copy at one point.
Seems inconceivable that it wouldn't just have been easier to use a software emulator and the original data from the CD. But then I don't know if there is a good software emu for the PS1?
It's likely that Vicarious Visions was unable to get the source code from Naughty Dog because Activision couldn't get it via Sony, though I'm not 100% sure about that, since VV mentioned in a blog post that the majority of the original code is virtually unusable.
This is buried in a comment but it's worth mentioning.
Naughty Dog used Lisp to write Crash Bandicoot. They had something they called GOOL -- Game Oriented Object Lisp -- a DSL, compiled by a compiler written in Common Lisp, that they used for the gameplay code. Its PS2-era successor, GOAL (Game Oriented Assembly Lisp), went further and let you write assembly language using Lisp syntax and call it from Lisp functions.
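To give a flavor of the idea -- this is an illustrative toy, not GOOL or GOAL, and the pseudo-MIPS opcodes and register conventions are simplified -- the core of a "Lisp syntax in, assembly out" compiler can be remarkably small:

```python
import itertools

def compile_expr(expr, dest, temps):
    """Compile a Lisp-style (op a b) tree into pseudo-MIPS, result in `dest`."""
    if isinstance(expr, int):
        return [f"li   {dest}, {expr}"]            # load immediate
    op, lhs, rhs = expr
    opcode = {"+": "add", "-": "sub", "*": "mul"}[op]
    temp = f"$t{next(temps)}"                      # fresh temporary register
    code = compile_expr(lhs, temp, temps)          # left operand -> temp
    code += compile_expr(rhs, dest, temps)         # right operand -> dest
    code.append(f"{opcode}  {dest}, {temp}, {dest}")
    return code

def compile_program(expr):
    return compile_expr(expr, "$v0", itertools.count())

# (+ 1 (* 2 3)), written as nested tuples standing in for s-expressions:
for line in compile_program(("+", 1, ("*", 2, 3))):
    print(line)
```

The real systems of course had to handle control flow, register spilling, object state, and so on, but the appeal is visible even at this scale: the whole language is a tree walk the studio fully controls.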
Don't you mean that the other way around? "Any C++, Java, Python, or <insert your lang here> program will eventually be written in a Lisp-based DSL that generates that language"? Seems far more accurate to me than claiming that all Lisp functionality is derivable in a (more rigid, less flexible) language. What prompted you to say this in the first place? If Lisp is the mother language, of course all the rest are but a subset.
They rewrote Crash Bandicoot. It was originally written in Lisp and is now written, I'm guessing, not in Lisp.
The parent poster was factually correct.
Your statement, while still true, isn't the opposite of what the parent said - you've replaced "rewritten" with "written". If you said "Any C++, Java, Python, or <insert your lang here> program will eventually be rewritten in a Lisp-based DSL that generates that language", you would be factually wrong on this account.
Funny story about Crash: everybody around 30 years old in Slovakia knows Crash Bandicoot. In the late 90s and early 00s there was a very popular TV show where you could call in and play the game with your phone (basically just left/right/jump). I spent hours watching it: https://youtu.be/v3RYappcWZg
Like even though the players die a good deal, it still feels like they can play it pretty smoothly.
I'd love to know the system that got built out for this. I imagine it was detecting tones, but would love any more insight into all of that. An RPG would have been a good way to overcome any timing issues games may introduce.
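For what it's worth, the textbook way to recognize phone keypad presses is the Goertzel algorithm: each DTMF key transmits a pair of sine tones, so you measure the signal's energy at the eight candidate frequencies and pick the key whose pair is strongest. The frequency table below is the standard DTMF assignment; everything else is an illustrative sketch, not anything known about the actual show:

```python
import math

# Standard DTMF table: each key is a (low, high) frequency pair in Hz.
DTMF_FREQS = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def goertzel_power(samples, sample_rate, freq):
    """Energy of one target frequency in a block of samples (Goertzel algorithm)."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq / sample_rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def detect_key(samples, sample_rate=8000):
    """Return the DTMF key whose two frequencies carry the most energy."""
    def score(key):
        lo, hi = DTMF_FREQS[key]
        return (goertzel_power(samples, sample_rate, lo)
                + goertzel_power(samples, sample_rate, hi))
    return max(DTMF_FREQS, key=score)

# Synthesize 50 ms of the "5" tone (770 Hz + 1336 Hz) and recover the key.
rate = 8000
tone = [math.sin(2 * math.pi * 770 * t / rate)
        + math.sin(2 * math.pi * 1336 * t / rate)
        for t in range(int(0.05 * rate))]
print(detect_key(tone, rate))  # "5"
```

A real decoder would also threshold the energies and debounce across blocks so line noise doesn't register as input, but the core really is just this.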
I think it was a regular domestic call, and you could win a PlayStation at the end of the month. But since it was super popular, it was crazy hard to get through - I tried, but the lines were always busy :)
They rebuilt a game in Unreal 4.
So yeah, almost everything from the PS1 era had to be redone from scratch. I fail to see the problem here. Did players expect them to package the optimized Lisp virtual machine they had back then into UE4?
Reusing portions of the gameplay code in a remaster can be feasible. Other remastering projects have managed to go further, although they had more favorable starting conditions. The remastered Full Throttle is running the original SCUMM game logic without any changes; they instead extended the VM to perform on-the-fly asset swaps for the remastered artwork and sound.
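The asset-swap approach is conceptually simple: the game logic keeps requesting assets under their original names, and only the loader decides which file to actually serve. A sketch of the idea (illustrative only -- not Double Fine's actual implementation, and the directory layout is made up):

```python
from pathlib import Path

class AssetSwapper:
    """Serve a remastered file when one exists, else fall back to the original.

    The game logic never changes: it keeps asking for assets by their
    original names, and only this loader knows two asset trees exist.
    """

    def __init__(self, original_dir, remastered_dir, use_remastered=True):
        self.original_dir = Path(original_dir)
        self.remastered_dir = Path(remastered_dir)
        # Flipping this at runtime gives you a "classic mode" toggle for free.
        self.use_remastered = use_remastered

    def resolve(self, asset_name):
        """Return the path the engine should actually load for `asset_name`."""
        remastered = self.remastered_dir / asset_name
        if self.use_remastered and remastered.exists():
            return remastered
        return self.original_dir / asset_name
```

Full Throttle Remastered even lets you switch between original and remastered presentation at runtime, which is exactly the kind of feature this structure makes nearly free.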
> Did players expect them to package the optimized Lisp virtual machine they had back then
Well, yeah? It's a remaster, not a remake. A remaster implies taking an existing project and updating it. And it makes sense to keep the old game logic - why write from scratch if you already have the whole game in a working state?
It's a remake, not a remaster in the sense you're thinking of. Vicarious Visions did not use any of the code, likely because it was either unusable or they never had the source at all.
N. Sane was done in Alchemy Engine, not Unreal Engine 4 (unlike Reignited), although VV is likely following suit with their upcoming project.