I'm the current owner of the Catacomb games, and have been working on improving and porting them in my spare time (not publicly available yet, but will be When It's Done). Fabien's Game Engine Black Book has been an invaluable resource for me. I highly recommend it!
The original publisher, Softdisk, sold many of its assets to Flat Rock Software in the 2000s. Flat Rock sold off bits and pieces over the years, and the Catacomb games were one of the last things it held [1]. In 2017, I wrote a toy implementation of Catacomb 3D in WebGL [2]. Since the project used art that was under copyright, I contacted Richard at Flat Rock to ask permission to use it. That conversation eventually led to me deciding to buy the overall game ownership from him [3].
The most important factor is likely the current owner wanting to make sure it's going into the hands of someone who will treat the rights with care.
I was looking into buying that IP a few years ago, and the asking price was affordable (I just didn't have the $ at the time so I went onto other projects). Glad to see someone doing something with it, I think it's perfect for a remake.
One omission from that book that would have been nice to include is an explanation of how the 3D scenes are drawn in those games. In one of the quoted passages, Carmack states that it is something other than the raycasting used in Wolf3D, with the added disadvantage of not being correct in every case.
From personal observation, the perspective looks strange in those games (no consistent single vanishing point). However, maybe this can be fixed by tweaking a few variables. Supposedly Catacomb and Hovertank have much higher frame rates than Wolf3D on old computers.
(another omission is that it doesn't completely explain the theory behind linear feedback shift registers, necessitating further research on the part of the reader to understand them).
Just to clarify: the question regarding LFSRs is not how to implement them (which I think was described adequately), but how to determine what their orbit is under repeated application, and how to choose one that generates a complete permutation when this is done. And furthermore, how was the particular one used in Wolf3D found? I was 100% psyched to hear how the screen could be efficiently cleared pixel-by-pixel, seemingly at random, but was ultimately left in the dark on this specific question.
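For anyone else left curious, the "complete permutation" property can at least be checked empirically. Below is a sketch in Python of the fizzle-fade scheme as described in Fabien Sanglard's "Fizzlefade" write-up: a 17-bit Galois LFSR with feedback mask 0x12000, whose low 8 bits become Y and next 9 bits become X. The mask corresponds to the primitive polynomial x^17 + x^14 + 1 (my derivation, not from the book), so the register cycles through all 2^17 - 1 nonzero states before repeating, hitting every pixel once; the one quirk is that state 0 is unreachable, so pixel (0, 0) is never produced:

```python
# Sketch of Wolf3D-style fizzle fade, after Fabien Sanglard's "Fizzlefade"
# article. A 17-bit maximal-length Galois LFSR (feedback mask 0x12000)
# visits every nonzero 17-bit state exactly once, so decoding each state
# as (x, y) touches each screen pixel exactly once, in scrambled order.

WIDTH, HEIGHT = 320, 200  # mode 13h screen

def fizzle_coords():
    """Yield each on-screen (x, y) at most once, in pseudo-random order."""
    rndval = 1  # any nonzero seed works; 1 matches the original
    while True:
        y = rndval & 0x000FF           # low 8 bits  -> Y (0..255)
        x = (rndval & 0x1FF00) >> 8    # next 9 bits -> X (0..511)
        lsb = rndval & 1               # output bit
        rndval >>= 1                   # shift the register
        if lsb:
            rndval ^= 0x00012000       # apply the feedback taps
        if x < WIDTH and y < HEIGHT:   # discard states off the 320x200 screen
            yield (x, y)
        if rndval == 1:                # back at the seed: full period done
            return

if __name__ == "__main__":
    coords = list(fizzle_coords())
    # 63999 pixels: all of 320*200 except (0,0), whose state would be 0
    print(len(coords), len(set(coords)))
```

As for *finding* such a mask: since 2^17 - 1 is prime, any irreducible degree-17 polynomial over GF(2) is primitive, and tables of primitive trinomials were widely published, so a brute-force or table lookup would have done it.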
Many people won't know what you're alluding to, so I'll clarify. The current owner of the Keen Dreams IP (Javier Chavez) is a weird internet racist who got himself permanently banned from selling games on Steam after a meltdown in which he changed his name to something that can't be repeated in polite company and started ranting about SJWs and Trump.
In addition to releasing Keen Dreams, he also did a reskin of Hovertank 3D (which I believe is open source) except the plot is replaced with weird racist revenge fantasies about killing refugees.
Then, after being forced off Steam, he seems to spend most of his time creating anonymous email addresses and harassing people.
A company called Night Dive Studios (the current licensee of System Shock) managed to get Keen Dreams back on Steam with reassurances that they had gotten it from Chavez and that he would receive no ongoing income from it, but the game was subsequently, and mysteriously, pulled.
Here are some Twitter threads that speak to the above. Warning: racial and homophobic slurs, open praise of Hitler, pages and pages of abuse, etc.:
Wow, how sad. We didn't have a video game console growing up, so I have very fond memories of Commander Keen. Does Chavez own the rights to Commander Keen 4 too?
What's more insane is the downvoting of actual factual information. But then again, I'm not surprised; it seems to be the norm. Too bad the moderators don't have the brains to add a "You must post a reason, which will be shown publicly, before you can downvote, and you cannot delete or edit this reason after submission" feature to the site.
I've noticed that videogames, where killing is normally the main point of them, often justify it by dehumanizing the enemies, but does that make any difference to anybody except worried parents?
For example, Wolfenstein 3D has you killing Nazis which we're already trained to see as valid targets for violence fantasies. Doom's human enemies are "former humans", Quake's human enemies have probes in their brains to make them evil. Sonic the Hedgehog only kills robots, not animals or even Dr. Robotnik who always escapes. But you wouldn't know these things unless you read the manuals or really paid attention. So would a teenage player really feel any different killing non-dehumanized humans instead? Maybe it's just a storm in a teacup when there's outrage at games like this immigrant killing one, Postal 2, or CoD's No Russian level. It all seems kind of perverse to train people that killing is very bad unless it's somebody belonging to a class which has been labeled as "bad" in any superficial way, or a humanoid monster, in which case any amount of gory violence is fine, such as Doom 2016.
Except that it's a game, and unless a person has significant problems anyway, they will understand that there is a difference between killing something in a game and doing it in real life.
That said, children have been playing games of "shoot em up" for many years. At one point it was cowboys and Indians, earlier it would have been whoever the local defender and perceived "bad guy" was.
I have a feeling that this is human nature, and I actually feel that we are far better having our children playing computer games where they are targeting mythical creatures, than playing shoot the Taliban or some such.
I have really fond memories of buying Catacomb Abyss by mail order in the early 90s when I was 11 years old. We used to play — my brother and sister were younger than me — with the lights turned off, terrifying!
It can’t be overstated how much impact Wolfenstein 3D and Doom had. Commander Keen was already quite the tech demo, with scrolling on a PC without dedicated hardware; the most advanced platforming you’d see at the time was Prince of Persia.
Wolf 3D felt like a next gen tech demo out of time. It was great fun, it was violent, it looked unlike anything else (besides Ultima Underworld). Of course, that monumental achievement looks insignificant when compared with Doom, which single-handedly added stairs, different floors, multiplayer (!) and modding.
Most of us here work in software. To see software so casually come in and introduce never-before-seen concepts is so impressive to me. I don’t think I know of anything having quite this impact in other areas of software.
> It can’t be overstated how much impact Wolfenstein 3D and Doom had. Commander Keen was already quite the tech demo, with scrolling on a PC without dedicated hardware
But ... the PC did have hardware-supported scrolling? CGA, EGA, VGA and the various SVGAs all had a register for the "address of beginning of screen". It was not well documented, and nontrivial to use, especially because of the banked memory setup, but it WAS there, even on the first-ever CGA, Hercules and Monochrome adapters, which were all based on the 6845, and it was also available on EGA, VGA etc. (the latter made things even less usable by practically requiring Mode X planar memory addressing).
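For reference, the trick is reprogramming the CRTC start-address pair each frame: register index 0x0C (high byte) and 0x0D (low byte), written through the index/data port pair at 0x3D4/0x3D5 on color adapters (0x3B4/0x3B5 on mono). Here's a small sketch of just the register math; the actual port I/O is DOS-era and omitted, and smooth per-pixel scrolling (as in Keen) additionally uses the attribute controller's pel-panning register, which isn't shown:

```python
# Sketch: computing the (register index, value) writes needed to move the
# visible screen start on a 6845/VGA CRTC. Port constants are the standard
# color-adapter addresses; they are listed for context only.

CRTC_INDEX = 0x3D4   # CRTC index port (color adapters)
CRTC_DATA  = 0x3D5   # CRTC data port
START_HI   = 0x0C    # Start Address High register
START_LO   = 0x0D    # Start Address Low register

def start_address_writes(offset):
    """Return (index, value) pairs to program a new display start offset.

    `offset` is in CRTC memory units (character cells in text mode,
    display-memory addresses in VGA graphics modes), truncated to 16 bits.
    """
    offset &= 0xFFFF
    return [(START_HI, offset >> 8), (START_LO, offset & 0xFF)]

# e.g. scrolling down one 80-column text row advances the offset by 80 cells
print(start_address_writes(80))   # [(12, 0), (13, 80)]
```

On real hardware, each pair would be written by outputting the index to CRTC_INDEX and the value to CRTC_DATA, ideally during vertical retrace to avoid tearing.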
Looking back, it is hard to believe so many famous games were released in such a short amount of time in the early 90s. 1991 was the year of Civilization, Lemmings, Street Fighter II and the first SNES Zelda.
The foundations of gaming's classics were laid during those years.
I've played much "better" games than Doom in many ways, but never one so revolutionary for its time, and I've never been so genuinely afraid when playing.
As a little tyke I used to actually duck to the left of my monitor, heart thumping, when an imp threw a fireball. Replaying as an adult I'm like wtf, that's just a lumpy mass of pixels; how could I have been so scared?
I remember watching my father play Descent, and that was pretty much the first real 3D game I'd seen (Germany, so there was a bit of trouble with Doom and Wolfenstein). And I was constantly trying to peek around corners by moving my head.
The original Alien vs. Predator was pretty scary, too. Especially the Marine missions. I once didn't move in the game for what felt like an hour or so and refused to go down a particularly badly lit corridor.
Impressive id went through development of Quake 3 with no version control.
When I was interning at Apple back in the day I had a side project of porting wolf3d to OS X, and it was the first occasion I had where I came to admire John Carmack's code directly. His game source code is something I recommend checking out to coders interested in gamedev.
Also, I have fond memories of poking around making DOS experiments in Borland C++ as a teen.
But speaking of version control — I guess that's just how things were done in the 90s. I read recently that not only was Final Fantasy 7 developed and shipped without it, but Squaresoft contemporaneously lost the final PlayStation source code and art assets. The studio contracted to port it to PC was supplied with a mishmash of non-final code and assets.
Many times in programming, someone has given a name to something long after a segment of the population had already discovered it for themselves. This can lead to a weird effect where people who in fact approve of a technique end up sounding dismissive, because of course everyone already does this, it's no big deal. Not everyone is doing this, and it can be a big deal.
To date myself a bit, the first time for me was Refactoring, and one of the most recent was the Mikado Method (a technique for top-down refactoring). The reason I've book-ended my examples this way, and the point of my response, is that Mikado requires you to be very comfortable both with version control and with throwing away a big chunk of code you just wrote.
I don't think I would have ever sussed out Mikado if I were trying to do Poor Man's version control. I think the cost-benefit analysis would fail.
The thing is, the Mikado method, at its best, makes you look like some sort of coding god, because you've accomplished some gigantic change to the system behavior with fifty lines of terribly reasonable code. Instead of five hundred lines of error-prone nightmare.
Version control software was a lot more primitive and less fun to use in the 90s. On Unix you had CVS; on Windows, just VSS. SVN, Mercurial and Git all came post-2000.
And it took SVN a couple extra years to be performant on Windows, which at the time was still the dominant development environment in many places.
Maybe my second SVN project, we had a monorepo, Windows, and a virus scanner (multiply pull time by five). I'd come in in the morning, log in, do an svn up, go get coffee and say hi to the people I was collaborating with, and be back to my desk all before it finished.
Many days I only synced to head twice because it was a pain in the ass, and the SVN maintainers were not at all sympathetic. A new contributor consolidated the config files and cut the number of file open operations by a couple orders of magnitude. I don't think we ever properly thanked that guy.
> A new contributor consolidated the config files and cut the number of file open operations by a couple orders of magnitude. I don't think we ever properly thanked that guy.
Unspoken benefit of “new person” — eventually someone not desensitized to the crap will throw up their hands and fix it.
Sometimes it's so tiny too, but requires you to look at code that nobody else has a reason to look at any more.
I once increased performance of a CMS by 30 percent my first day in the job because I happened to spot a handful of lines of unnecessary string copying while trying to figure out how the thing worked.
Everyone else could have, but none of them had any reason to look at that part of the code because it worked.
I remember using SVN in the early days and it was a huge pain. Some days it even seemed like it caused more problems than it solved. I don't think the situation really improved until Git came along and gave it a kick up the arse.
That said, modern SVN is nice to work with in my limited experience but I'm definitely firmly in the Git camp now.
Early 90s revision control flashbacks on the Windows front ... I recall living through migrations from MKS RCS (by Mortice Kern Systems) to SourceSafe (when it was a One Tree Software product before MSFT bought it) and eventually VSS.
Commandline automation of these for build or deployment scripts was painful but we were grateful for any source control on Windows at all because the alternative was still common and it sounded like this, shouted over a cubicle wall: "Hey, I'm going to edit UTIL.C on the WFW3.11 share ... everyone okay with that?" Cringe.
VMS had pretty nice version control, if you happened to be working on a VAX. It's hard to believe, but in the early 90s you still saw non unix (clone) OSes in the wild.
> Impressive id went through development of Quake 3 with no version control.
I'd say that taking a backup of the code at the end of the day and sticking it all in a directory on a server somewhere constitutes version control. Especially as Id documented everything they did so thoroughly (based on Carmack's .plan files etc).
When SVN and Git came along they basically just streamlined and improved practices that a lot of devs were already following - and added some useful features like merging with proper conflict resolution and whatnot.
"We didn’t have a version control system. Surprisingly, we went all the way to Quake 3 without one, then we started using Visual Source Safe."
That is both impressive and terrifying. I'm old enough to remember when I first started using VSS, and being amazed at how awesome source control was :D
These days, it's hard to look back at SourceSafe with anything other than horror....
There was a transition from the 80s to late 90s: until you had some kind of network, the important thing was to backup your machine. Daily if you can. Even if you had been running a local revision control on your local machine, you wouldn't be protected enough, so backups were more important.
Once networking and servers became more prevalent, it slowly made sense that backups turned into revision control. If your machine crashes, there is a remote repo you can connect to and get back to where you were.
From the book: Carmack stepped into the local bank and requested a cashier's check for $11,000. The money was for a NeXT computer, the latest machine from Steve Jobs, cocreator of Apple.
Interestingly that's the exact PC my dad bought us when we were kids. I loved how computers back then had a turbo button, but I could never figure out why anyone would turn off turbo! :P
I believe it was due to programs (mainly games) that relied on clock speed to work. If you ran some of them with turbo, they were so fast they were unplayable.
An example from Fabian Sanglard's game engine black book, wolf3d, p222
When the AdLib was released in 1986, developers were instructed to send data "as fast as possible". At 4.77MHz, a PC was unable to out-pace the AdLib. Yet as CPUs got faster, issues started to arise and the card was unable to keep up.
So if an old program ran on a new machine, it might mangle the sound or even cause a hardware crash. So the only way to play it was to slow down the new computer to the speed of the old one.
If I remember correctly, there were even some 286 motherboards with SRAM cache onboard, as well as on 386 boards. That would be L1 cache on those. For a 486 board, this would be L2 cache since the CPU has its own internal L1.
These SRAM chips were dual-inline pin packages (DIP), i.e. those black rectangular chips with fat pins down both long edges. Sometimes these were in friction-fit sockets and sometimes soldered down. On the 286, I think the DRAM was also socketed DIP.
Ummm.... I remember my father gifting me a 386DX 40MHz with 4 MiB of RAM, a 200 MiB IDE hard disk and one of the first 2X CD-ROM drives that supported all CD-ROM formats. All this in 1990 or 1991. Before that, I had a ZX Spectrum +3.
I really have good memories of these two computers.
My parents bought me a 386SX 20MHz, 40 MB HDD, 2MB RAM just with floppies in 1991.
It was around 1500 euros, converting the price directly into today's money, without taking currency evolution into account. Back then, working as a cashier would get you around 300 euros per month, before taxes.
I remember running wolf3d.exe on my 386SX/16MHz in the mid-1990s, and it wouldn't run worth a damn unless the game display size was reduced to the lowest possible setting, about the size of a business card on the ol' VGA CRT. I spent many hours squinting at pixel-Nazis.
Now, some 25 years later, I have been playing through Wolfenstein II: The New Colossus, and I had nostalgic good fun playing several levels of the original Wolfenstein via an in-game arcade machine.
That price seems a little high. I was selling these from a custom shop in 1994 and a pretty well equipped 486 and then Pentium was about $2,000 to $2,500. You could spend a bit more and get better audio and video cards, but the base models at the time ran Doom and Quake pretty well. Computers seem like they have stayed in that range all my life. Back then I had a $2k desktop and today I have a $2k MacBook. There were always cheaper options for budget conscious buyers and the sky has always been the limit if you want to go crazy.
I've tried using that book to explain to people how "real" software development works. How the inception of the real cool shit, the stuff that spawns a company, is almost always 2, 3 or a small handful of people working late, night after night, living on pizza and cola, churning out code. And how that apparent anarchy is a more productive environment than ten pointy headed bosses plugging away at their Gantt charts and a hundred 9-to-5 developers carrying out their orders.
Not that I want to be seen defending pointy headed bosses, but I think there's a fair amount of survivorship bias in your statement. How many groups of 2-3 people have worked late, lived on pizza and coke and produced absolute garbage? Quite a few, but you never hear about them because they aren't notable.
You don't have to get into pointy headed boss territory to be a successful programmer and leave the office at a respectable time every day.
Are many successful game programmers in corporate environments leaving the office at a respectable time every day? I definitely don't get that impression at all.
At least in a 2/3 man team surviving on pizza and Coke you're not lining someone else's pockets with millions of dollars only to be thrown out onto the street at the end of the process.
I’m still in shock they started out in a basement in Shreveport. Shreveport! The tech industry in Louisiana these days consists largely of ghoulish “insourcers” like Perficient, DXC, EA, IBM and CGI or zombies like GE Digital.
If him blasting music and destroying things with various weapons while devs were trying to code and running ads telling customers he was "about to make you his bitch" was a hagiographic gloss, then reality must have been truly dark.
I think the "bitch" part was later, when his own company was developing "Daikatana", which was a famous disaster. Maybe he was more reasonable earlier on.
That advertising campaign wasn't Romero's idea; it was the idea of someone else at Ion Storm. Romero was initially a bit hesitant on the slogan but caved to pressure.
At least that was how it was presented in (iirc) Masters of Doom, who knows what the actual truth is.
Ah, good to know. It's been a couple of years since I read the book and I may also have misremembered something about the ad. I definitely had the impression that the guy would have been a very difficult coworker, boss or roommate, and he was all three for parts of the team at various times!
A version of Keen's tile editor was used to create maps for games as recent as Rise of the Triad, which was itself built on a modified Wolfenstein engine.
John talks about id's software principles; one of the YouTube commenters (Sean Ramey) summarized them below:
1. No prototypes. Just make the game. Polish as you go. Don't depend on polish happening later. Always maintain constantly shippable code. (Large teams require more planning though.)
2. It's incredibly important that your game can always be run by your team. Bulletproof your engine by providing defaults (for input data) upon load failure.
3. Keep your code absolutely simple. Keep looking at your functions and figure out how you can simplify further.
4. Great tools help make great games. Spend as much time on tools as possible.
5. We are our own best testing team and should never allow anyone else to experience bugs or see the game crash. Don't waste others' time. Test thoroughly before checking in your code.
6. As soon as you see a bug, you fix it. Do not continue on. If you don't fix your bugs your new code will be built on a buggy codebase and ensure an unstable foundation.
7. Use a development system that is superior to your target.
8. Write your code for this game only - not for a future game. You're going to be writing new code later because you'll be smarter.
9. Encapsulate functionality to ensure design consistency. This minimizes mistakes and saves design time.
10. Try to code transparently. Tell your lead and peers exactly how you are going to solve your current task and get feedback and advice. Do not treat game programming like each coder is a black box. The project could go off the rails and cause delays.
11. Programming is a creative art form based in logic. Every programmer is different and will code differently. It's the output that matters.
Extra advice:
1. Only program for a few minutes at a time and test your code immediately. Try not to code for even as long as 30 minutes. This will help you avoid debugging, because you will catch bugs sooner and won't have as wide an area of code to search for the bug.
Interesting how they make sure to mention the use of 320x200 during graphics editing to maintain the proper aspect ratio, then display screenshots from the game in the improper aspect ratio.
You are not making an equal comparison. DOS at the time didn't have multiple windows or multitasking. People that don't use IDEs these days still use multiple terminals or windows.
Using the Borland C++ IDE in 1991 is way more similar to using Emacs these days than to Visual Studio Code.
FWIW, I've had an account here for 5 years (last month in fact) and am typically here for an hour or so every day and have never seen anyone make this claim.
Yep, I started with the Apple II and UCSD Pascal and have kept using IDEs ever since... I can dive into the terminal and use make if needed, but it feels unnatural to me.
Great article - cool to get an inside look at tools, IDE environments, art sketches etc.
Such an exciting time for PC games, with id way out at the forefront. Highly recommend the Masters of Doom book mentioned in the article too; it's a deep dive into the making and eventual impact of Doom, and a great snapshot of the id guys and of that time period.
Used to get severe motion sickness too until I figured out it went away at high FPS. At least for well made games with low input latency.
With a good 1000Hz wired mouse and games where I get at least 100fps, preferably >144fps, I don't feel anything at all when I play. Still can't watch others play, though.
My curiosity is not sufficient to test whether the same is true for me. Particularly not since if I get immersed in the game, by the time I realize it is happening it is too late. The potential upside just isn't there to make it worthwhile.
However, I've also found that the 3D systems everyone else oohs over, whose operators promise they don't cause motion sickness any more, always do. Really fast. So I'm apparently on the sensitive end of motion sickness from computer systems.
"Borland’s solution was an all-in-one package. The IDE, BC.EXE, despite some instabilities allowed crude multi-windows code editing with pleasant syntax highlights. The compiler and linker were also part of the package under BCC.EXE and TLINK.EXE."
My first Borland product was Turbo Basic, followed by Turbo Pascal 5.5 (I used all the following ones up to Delphi 3 or so), Turbo C 2.0 and then Turbo C++ 1.0 made me a C++ fanboy.
I thank Borland for my introduction to type-safe systems programming via Turbo Basic and Turbo Pascal, for the OOP lessons from Turbo Vision, OWL and VCL, and for teaching me, already in MS-DOS, that C++ was a much better alternative for people who care about safety than C would ever be.
Turbo Pascal was great. I had heard of this C language that people were using to make games, but in the heyday I could never find a copy of a compiler for it.
It was only much later that an uncle of mine got a promotional copy of Watcom C with DOS/4GW that he gave to me.
The Free Pascal text-mode IDE (FPIDE) also uses that look, even on Linux, but there is no C/C++ version that I know of. TUI-style editors (with GUI-like menus, widgets, overlapping frames, etc.) in general seem to be unpopular on Unix/Linux; the only ones I can find (besides the one mentioned above, which is part of Free Pascal) are essentially unmaintained.
Of course the quickest way to get a working TUI-mode editor that is stably maintained is probably to add support to Emacs, which already has text-mode working with things like a global menu, widgets, multiple "frames" etc. many of which were directly lifted from the "GUI" version.
They even released the C++ version into the public domain. Free Pascal uses its own Free Vision, which is a back-port of the TUI support into Graphics Vision[1] (which in turn is an LGPL reimplementation of the Turbo Vision API in graphical mode).
Others have said that. I remember playing with BC.EXE and the free DJGPP + Allegro library to make games at that time... oh man, those were the days to make games; it was romantic (and I was in secondary school). Too bad I was not in an environment where that was nurtured.
https://github.com/FlatRockSoft/Hovertank3D
https://github.com/CatacombGames/Catacomb3D