Great interview! I would definitely be interested in a history of Unreal in full book-length form. And if anyone hasn't already perused "Masters of Doom" by David Kushner, on the early days of id Software, it is such a good read.
Regarding the Haskell love. I believe Unreal contributed to the design of a research functional programming language prototype called Cayenne back in the day:

Cayenne—a language with dependent types

https://dl.acm.org/citation.cfm?id=289451
I think another historical note is how everybody developed on SGI machines in the late 1990s. Primarily due to Maya modeling. But the 3D graphics API wars were certainly a factor:

Direct 3D and OpenGL by Paul Hsieh

http://www.azillionmonkeys.com/windoze/OpenGLvsDirect3D.html

And, yeah, MSDOS ZZT from 1991 is still playable via Internet Archive ;)

https://archive.org/details/msdos_ZZT_1991
Back in the "engine war" between Quake 3 Arena and Unreal Tournament, the map community loved UnrealEd, it was definitely easier to use and was just-in-time, no seperate minute-long build-process to build the level and try it out, in UnrealEd you could just try it out. Visual Basic 4-6 were incredible RAD product (rapid application development). Before Github, before GoogleCode, before Sourceforge, there was PlanetSourceCode and it was the go to place for Visual Basic dev community. For everything you could find little programs and code. Windows 95/98 era and Visual Basic was the perfect symbiosis. Very sad when MS announced the vision of .net and stopped VB6 in 1999 (it took them until 2003 to show up with something, yet many devs faced a wall with no support, and were forced to moved on to Java/PHP/etc). Anyway when I first read about UnrealEd was coded in Visual Basic 5 or 6 I couldn't believe it, it had an outstanding performance and integrated the C++ renderer perfectly. Though it was a bit unstable (which was common also for Word 97 and Excel 97 back then), so one had to save a lot, and use "save as" to not correct files. I would say the next evolution of game editors was certainly Sandbox editor of Cryengine 1 from Far Cry 1 disk (2004). Creating tropical islands with its landscape editor was never easier. Of course other tools soon reached parity.
In those days I was still a bit biased towards Delphi and C++ Builder, which were much more productive than VB and VC++/MFC.
Sadly, Borland lost their way - and their key architects to Microsoft.
Even today, C++/CX with XAML is not as productive as C++ Builder, and it took all the way until Windows 8 for Microsoft to start taking .NET AOT compilation seriously.
Meaning VB 6 probably still produces better executables than VB.NET.
> Even today, C++/CX with XAML is not as productive as C++ Builder
I'm glad you think that - I work on C++Builder, and work hard to improve its strengths. Productivity is a key one compared to other C++ tools. These days it's not just Windows, but iOS and Android too, and you will hear more good news about platforms and C++ standards support sometime soon. Not all the team left for MS, by the way: we have some truly incredible people on staff.
From the article, re Turbo Pascal, the precursor to Delphi,
> You would be typing code, and then several seconds later it would be compiled, and you'd be running it.
That's still true of Delphi today! There's a video on YouTube of a million lines of code being compiled and linked in about six seconds. It's hard to overestimate the productivity that kind of speedy round-trip development brings.
Does modern C++ Builder still have DRM? I remember looking some years ago at how Delphi/C++ Builder were doing, and i saw that they had DRM (some people even complained in the forums that they had to use cracks to make the tools work on their laptop while on the go, which is a big screwup IMO).
I have an ancient version of Borland C++ Builder - actually the first one - which i got off ebay some years ago (i like collecting old development tools [1]). I like that it basically works everywhere and is stable, even on Linux through Wine [2]. I used it mainly to make a few things for fun, like some patches for old games [3] (because it makes very small executables - like 250K or so), a small 3D editor [4] and WinLIL [5] - a PyWin-inspired shell for my LIL [6] scripting language (mainly an excuse to make yet another thing that uses LIL :-P).
Normally i use Lazarus [7] (on which i wrote the more serious alternative to WinLIL, LIL Studio [8]), but there is value in being able to use C and C++ code directly from a very easy to use RAD GUI builder. For example, it took me like a couple of hours to make WinLIL, and 99% of that time was making the graphics routines - for LIL i just dropped the lil.c and lil.h files into the project and it worked out of the box. Also, when a new version of GLEW was released, i thought "well, this is C, right? Let's see if it works with BCB" and it did, although i had to remove the C99-isms (only a few changes here and there) [9]. For comparison, to actually access OpenGL 4.6 through Lazarus i had to write an entire custom generator (basically reimplement GLEW) [10].
Some people have actually suggested that i try Qt Creator, and i did... and it sucked. Ok, not exactly sucked, but it was way less "integrated" and more kludgy than what you'd get with classic VB, Delphi, C++ Builder or Lazarus. It is certainly a step above the dialog editors you get with other toolkits, which feel like glorified Win16-era resource editors (actually, scratch the "glorified" bit, because you can't even embed actual resources in them, only edit the dialogs). But even Delphi 1.0 on Windows 3.1 feels better.
So yeah, C++ Builder can be a great thing to have...
BUT!
As i wrote above, the last time i tried Embarcadero's stuff, they seemed to be heavily infested with DRM crapware. I really detest DRM - notice how i actually have Borland C++ Builder running from its original disk, basically 22 years after its original release, from a company that practically doesn't exist anymore? I am certain it'll work even 22 years into the future - even if Microsoft goes braindead and kills Win32, there'll still be Wine or even some emulation layer like DOSBox.
But can i expect the same from modern C++ Builder? Does it still have the DRM? There have been several changes since the day i wrote the article at [1] (i even tried C++ Builder a year or so later and found that it addressed some issues i had with the UI that i mentioned at the bottom - the bloat issues were still there, but those are minor issues that will solve themselves with time; after all, i have tried BCB1 on an old PC from the late 90s that i have, and it isn't exactly fast either :-P) and AFAIK the company has changed hands again, so things might have changed on that front too.
Personally i wouldn't like to pay $700 for a program whose maker goes away in 3-4 years and shuts down their DRM servers, making it impossible for me to use it again. And let's be honest, Embarcadero has changed hands (and names!) so many times over the years that their future prospects aren't exactly trustworthy.
EDIT: i just saw that C++ Builder now costs $2k... well, FWIW i wouldn't be able to buy it even without DRM :-P. When did the cost change? I remember it being much cheaper.
I can't comment on antipiracy tech, but I can state that I know of no reason whatsoever why anyone should ever need to crack the product to use it -- no bugs, nothing. Nor is there a reason existing installations would ever stop functioning (unless you have something where you'd expect it, like a timed license.)
In terms of price: Starter is completely free. Pro's cost is on par with VS - and if you think it's not, feel free to reply and say why; it's always worth revisiting.
The Unreal Editor was such a breath of fresh air compared to what Carmack at id had cooked up to create Quake maps. I worked in both editors as a young kid and the design paradigm with Quake, assembling a map in a void, left my maps with a host of problems usually related to planes not lining up correctly or having tiny gaps (leaks) which caused the map to not compile correctly or created an issue when actually playing the map. It drove me crazy.
UnrealEd, on the other hand, inverted that paradigm: instead of building your level in a void, you carved your map out of a solid mass. The geometry was much easier to work with and led me to really embrace Unreal's engine over Quake's. I still loved Quake as the better game, but Unreal's technology shone brighter for me as a young map maker.
> left my maps with a host of problems usually related to planes not lining up correctly
Note that this was most likely an issue with the editors people were creating at the time. AFAIK nobody has managed to build the original QuakeEd's source code, and that one didn't have those issues (mainly because it didn't allow much manipulation in the first place - most brushes are axis-aligned and neatly arranged on the grid).
> UnrealEd on the other hand inverted that paradigm, instead of building your level a void you carved your map out of a solid mass
Which is funny, because this is also what Doom did: like Unreal, Doom had you carve negative space out of a huge solid world. This approach was also used by other engines at the time, like the Sith engine by LucasArts, used in Jedi Knight (in that engine all world geometry is made of sectors creating negative space, pretty much the same way as in Doom).
Although personally as a programmer i tend to favor positive brushes for their simplicity. I have written a 3D world editor that has both negative and positive brushes and the code to handle the geometry generation for negative brushes is like 10 times more than the positive brushes (i don't use BSP) due to all the edge cases from floating point precision.
Jedi Knight was one of my favorite games, but the (community-built?) editor was a nightmare. I did a lot of UT level editing, which was a lot more comfortable.
I remember the next version (UT2003?) put more emphasis on imported models to populate a sparse UT-style carved-out map, which I imagine made things a lot easier for most people, but sadly was the end of my level editing days. I never got into 3D Studio Max (or Maya?) enough to be able to create maps as nice as those of others.
> the (community-built?) editor was a nightmare. I did a lot of UT level editing, which was a lot more comfortable.
Yeah, there are two editors, JED and JKEdit. AFAIK the latter was much easier, but it was shareware, whereas JED is (or became) open source - harder to use, but on the other hand it has more features. Also it is extensible, and i think it uses some sort of text-based API to communicate with external programs, since i've seen extensions made in different languages. As a result, the entire community focused on JED (although there are still a few people using JKEdit).
But both editors work with the negative sectors directly, while i think LucasArts' editor (Leia) had a higher-level interface that was more similar to UnrealEd's subtractive BSP. The engine uses per-vertex lighting, and i've seen some "artifacts" in the geometry that give me the impression that the geometry was generated through BSP subtraction instead of the more "hand-made" approach that JED has (i don't know about JKEdit).
(Also FWIW this was a similar case with the original DoomEd for Doom - unlike most editors made at the time and even later, in DoomEd the sectors were implicit and assigned through some sort of "floodfill" algorithm that found the connections between them, but in most Doom editors you had to manually create vertices, linedefs, sectors, etc)
> I remember the next version (UT2003?) put more emphasis on imported models to populate a sparse UT-style carved-out map, which I imagine made things a lot easier for most people, but sadly was the end of my level editing days. I never got into 3D Studio Max (or Maya?) enough to be able to create maps as nice as those of others.
Actually it made things harder for people, since it added the requirement to use an external tool to create the assets - this has always been (even today) seen as a major drawback when it comes to modding. It isn't a big problem for big game studios, because they have dedicated artists for making environment props and the level designers simply have a ton of premade props at their disposal, so they don't need to make them themselves, but it is harder for solo and small-team modders.
This is probably the reason why you see a lot of maps for games that come with in-editor tools (like Valve's games - Hammer has a clunky interface, but its brush tools are top notch and a ton of maps are made with it) or with a ton of premade modular assets (like Bethesda's tools, which also encourage people to create reusable assets for others as mods, as opposed to special single-purpose assets).
Thanks for the long response! Brings back memories. I now know for sure I was using JED precisely because I was a teenager who wouldn't consider paying for JKEdit.
I also remember fiddling with surfaces to get nicer lighting, and being happy that I didn't have to do that for UT editing.
> Actually it made things harder for people since it added the requirement to use an external tool to create the assets - this was always (even today) was seen as a major drawback when it comes to modding.
Yeah, it was exactly why I stopped modding. I was more interested in level design than fiddling in a 'proper' 3D editor and I never got around to understanding 'skins'. Was a bit saddened by that and didn't like the idea of relying on other people to provide models for my levels.
I remember the revelation of realising that Unreal used positive "null-space" instead of negative, and being blown away... no more damned lighting leaks, and no more placing giant boxes over everywhere I thought might have been the cause.
I think most of my love with editing as a kid was for the Build editor and Duke3D - you could make some impressive levels with a mixture of laying out walls/zones in 2D, and adjusting heights and textures in first-person 3D. Opening the existing levels was also possible and a great way to learn how they had been made. I swear I’ll remember parts of the sector ID tag list until the day I die.
It's kind of a pity that, with the increased complexity of engines and design, this is a much harder process to get started on. Though perhaps I'm underestimating the wonder, even now, of loading up a box room and the feeling of "I made this"...
Indeed. Even though the Build engine was very limited in comparison to true 3D, it was much more WYSIWYG. The mix of drawing rooms as shapes and then jumping into 3D mode to decorate within the game engine was fast and seamless. In UnrealEd, each playtest loaded the full game, which took forever on machines of that day.
Carmack did do the tools for Quake, though. He wrote QuakeEd on a NeXTSTEP computer. I don't think id ever released an official Windows tool until QERadiant (which was around the time of Quake III, and Robert Duffy ended up being the guy who did the work on it).
AFAIK DoomEd was made mainly by Romero, on NeXTSTEP too, but QuakeEd was mainly Carmack - although Romero most likely worked on it as well. But that was the GUI; the command line tools that the GUI used were mainly Carmack's, in both Doom and Quake.
BTW Carmack released the source code for QuakeEd back in the mid-90s [1], however it was written against the "traditional" NeXTSTEP APIs, which were incompatible with OpenStep or GNUstep, and AFAIK nobody ever ported it to anything, nor managed to compile it.
EDIT: also, id did release an editor before QERadiant - in fact this is how QERadiant came to exist :-P. They released QuakeEd 4 (i don't know what happened to the other 2 :-P), which they used for Quake 2 [2]. They also released the code for that, which is what Robert Duffy used to make QuakeEd 4 Radiant - the work that actually got him hired by id (you can read an interview with him from before he was hired here [3]). The original editor was written in C and used the straight Win32 API with a bunch of floating windows, while Radiant converted it to C++ and MFC. AFAIK even though it was converted to Gtk through GtkRadiant, the MFC branch survived even into Doom 3 (where the editor was embedded in the game and launched through the console).
Huh, funny - when I was starting to design a Quake map, if I remember correctly, I would start by making a giant solid, and then carved out rooms from it. Am I remembering that incorrectly?
Nope, that was possible, but the boolean operations could lead to numerical instability if you didn't align your brushes to the grid at a reasonable granularity, which could still lead to leaks. The flooding code was generally pretty well done but developers had to bandage issues like this in every Quake-engine game.
FYI Unreal still had leaks (which led to major performance issues if uncontrolled, due to the PVS being too inclusive), but the problem was not as pronounced, since the negative space still allowed the map to finish building the BSP (Quake just failed, sometimes without a clue as to why).
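For the curious, here is a toy grid-based analogue of that leak check (my own sketch; real compilers like qbsp flood between BSP leaves, not grid cells): flood outward from a point inside the playable area, and if the flood ever escapes to the void outside the hull, the map leaks.

    # Toy analogue of a Quake-style "leak" check. Real compilers flood
    # between BSP leaves; a grid keeps the idea visible. '#' is solid,
    # '.' is open space.
    from collections import deque

    def leaks(grid, start):
        """Flood-fill from a point inside the playable area; the map
        leaks if open space connects it to the outer border (the void)."""
        rows, cols = len(grid), len(grid[0])
        seen = set()
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            if (r, c) in seen or grid[r][c] == '#':
                continue
            seen.add((r, c))
            if r in (0, rows - 1) or c in (0, cols - 1):
                return True            # escaped to the void outside the hull
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                queue.append((r + dr, c + dc))
        return False                   # sealed: the flood never got out

    sealed = ["#####",
              "#...#",
              "#####"]
    leaky  = ["#####",
              "#...#",
              "###.#"]                 # one-cell gap in the bottom wall
    print(leaks(sealed, (1, 1)))       # False
    print(leaks(leaky,  (1, 1)))       # True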
> "A lot of the features in Unreal arose from fallacies of what I misperceived other people did."
In both the programming and business world, I've observed this trait in successful people. I know I've had ideas that were ahead of their time, but was held back by that damn voice saying it can't be done or I don't have the resources. I wish there was some way I could "fix" that wiring in my own brain.
On the one hand, my skepticism has saved me more times than I could count from being conned or from bad deals. But it has also held me back by focusing on the problems instead of just blazing forward.
The same story happened with StarCraft 1: Blizzard revamped their Warcraft 2 lookalike into a more isometric view to match something the competition had seemingly done.
"At some point I talked with Mark and Patrick about how Dominion Storm knocked us on our heels, and they let us in on Ion Storm’s dirty little secret: the entire demo was a pre-rendered movie, and the people who showed the “demo” were just pretending to play the game! It would be an understatement to say that we were gobsmacked; we had been duped into a rebooting StarCraft, which ultimately led it to be considered “the defining game of its genre. It is the standard by which all real-time strategy games are judged”
Bill Atkinson did the same thing with some of QuickDraw's overlapping region code on the original Macintosh. He perceived that the Xerox Alto was doing things that were more advanced than it actually was, and ended up shipping a more advanced and efficient overlapping window implementation because of it.
I remember reading a story years ago about one early VGA card company. They had somehow heard about a competitor using a flip-flop(?) to make their VGA card twice as fast. They puzzled over this idea for a long time, wondering how that competitor could have done that. It seemed impossible but because the competitor had apparently managed to do it, they kept experimenting, and ended up shipping a product inspired by this idea, but not quite achieving that factor two. Later they found out the competitor had actually done something much less impressive. (Using only a latch instead of a full flip-flop? I don't quite remember the exact details. I thought this story was in the Graphics Programming Black Book by Michael Abrash of Quake-fame, or maybe one of Jim Blinn's Corner columns, or something like that, but I can't find it.)
(So I indeed remembered a few things wrong: It was about a read- and write-FIFO, not about a flip-flop and latch. And the speed improvement factors were not mentioned.)
Most people suffer from the same problem; it is very hard not to see all the faults of an idea in its infancy. Whether these are perceived problems or real ones, and whether they are products of our own self-doubt or legitimate concerns, is difficult to discern.
I can't decide if it is a form of stupidity or intelligence that allows people to confidently throw themselves behind their own ideas.
This is why you write plans: you find the pain points that will occur and figure those out before committing yourself to an idea. Often I find any wild ideas I have are mostly killed off in the planning stage, but this generally leads to new, more solid ideas too.
A "bias for action", according to "In Search of Excellence". Which didn't let a lack of experimental evidence stop them reporting experimental evidence. And was very successful. Which, I guess, validates it.
The illusion of competent competitors without budget oversight - what drives an industry forward. i can already see the PowerPoint presentation and the book line.
"Also, a bunch of the former Future Crew demo-scene guys had formed a hardware company, and they had released some screenshots with incredibly realistic volumetric lighting in an indoor scene".
The European demo scene was very important in the development of computer graphics (e.g. early 3D shading techniques) and music (e.g. music trackers, which can be considered predecessors to modern DAWs). Every now and then I go and watch those demos, and I am amazed by what a bunch of teenagers were able to achieve on very limited hardware such as an Amiga 500 or an Atari ST. Many classic demos are now viewable on YouTube, so you don't have to go through the chore of installing emulators.
Holy shit. Mentioning the demo scene in an article about game engines made me remember the "Into the Shadows" demo by Triton from 1995 (one year before Quake was released): https://youtu.be/MViZocLJVcM?t=28s. That was really impressive back then. Thanks for the throwback!
I think tools and toolmakers don't always get the credit they deserve. Tools are the other way of increasing the leverage of a team of developers (the first being increasing the level of the team through knowledge and upskilling). It's sometimes hard to get businesses to see the benefit of building tooling, though, because it seems like a yak-shaving exercise (and can be if it's not scoped correctly).
I think if you have people who are naturally inclined to be toolmakers, and they've got a problem which can be solved in two ways - repetitively, or with tooling - the deciding factor on whether they build a tool or not will be how much autonomy of decision-making they have. Tightly focused "agile" sprints will result in small packets of work, no one of which will be big enough to justify any tool building, so there won't be any space for introspection and planning, and it probably won't happen.
Tooling tends not to evolve from refactoring. It has a similar genesis - seeing repetitive work - but the approach requires a couple more clicks up the abstraction stack to plan out.
His comments on tooling at the end of the article apply equally to all development projects, not just games. Tooling has an enormous impact on developer productivity, but far too few organizations spend enough time on it.
I never worked on the Quake one but the Unreal Editor is where I started getting creative with computers - both by creating maps and fiddling with Unreal Script to change behaviors and build mutators.
For the mapping aspect, as a 10-something-year-old it was super weird having to think in reverse at first - but then it just became natural. Think of being inside a huge block of clay and working from the inside.
As the Unreal Editor evolved it just got better.
The best part of the early Unreal editor was the "skybox" hack: you would create a small box placed somewhere in the map, with the inside painted as the sky (or whatever panorama you wanted players to see when they looked at the sky).
There was no better feeling to me as a kid than opening up a map like DM-Deck16 in the Unreal Editor, and investigating how the genius mappers had created this or that part of the map (especially the toxic zones).
That skybox thing exists in other engines as well. E.g. in CSGO you can find the skybox scene somewhere in the map with noclip, usually far outside the playing area.
> There was no better feeling to me as a kid than opening up a map like DM-Deck16 in the Unreal Editor, and investigating how the genius mappers had created this or that part of the map (especially the toxic zones).
That's the thing I really liked about the game back then. You could just open the included maps in the editor and look at how they'd been done. Usually maps are either no longer in a format the editor supports and have to be decompiled (id games), or stuffed away in some large nonstandard archive file (I remember trying to find the Starcraft campaign maps and couldn't). Back when internet access was only sporadic and short, I tended to give up when I couldn't figure things out further on my own, instead of searching for tools.
Blew my mind as a kid: I didn't really understand the skybox, but I think I left one of the 4 armed guys in that area of the map, and when I went into the level he was towering over it like a giant.
Anyone remember BSP holes? I never found a definitive answer on where they came from. Was the algorithm technically flawed? Was the implementation buggy? Were there shortcuts taken to speed things up, knowingly causing these glitches?
Also, there were so many pieces of advice out there on how to prevent them - or, if they occurred, how to get rid of them - that I was convinced at some point that half of it must have been cargo cult.
I remember working with QuArK to build Half-Life levels, which must have been somewhere in '98-'00. I loved building the levels and the triggers for opening doors and moving elevators. I remember being impressed at the time with the sculpting in UnrealEd when it was introduced, but I don't recall it being more intuitive. Anyone else who used QuArK have an opinion?
This brings back so many sweet memories. I started with QuArK, then I was introduced to UnrealEd. I remember finding UnrealEd more clunky than QuArK. It also took some time for me to get used to the 'inverse' editing in UnrealEd (you basically start with a solid block of infinite size, and then cut out the parts you want accessible to the player).
I then went on to make multiplayer maps for half-life (can't remember which editor I used), I still play those maps with friends.
After HL1 I went to college and couldn't afford a PC capable of running the newer games/engines, and that's where my map building hobby stopped, unfortunately. Nowadays the engines and tools are so complex, that it just takes too much time to build maps for a hobby IMO.
>Nowadays the engines and tools are so complex, that it just takes too much time to build maps for a hobby IMO.
What tools? I can't think of a single recent AAA game that actually has an editor besides Unreal and Unity and those are, IMO, just as easy to use now as they were back then. The processes are a little different but the basics are still pretty similar.
Now that's a name I haven't thought about for a long time. The gamer in me wishes Rebel Boat Rocker had worked out because their game had some promise. I only talked with Billy a couple of times and he seemed a good guy.
I met John Romero once in the Ion Storm tower, I would agree with the interview as he seemed a nice guy. Although there's a bit of a rep there. Shame Ion Storm was such a mess.
Met Jay Wilbur as well. Seems I've met quite a number of people in the industry during that time, wish I had capitalized on that a bit better.
Never did meet Carmack or Sweeney, that would have been nice to add to the collection.
Personally, I actually preferred Half-Life level development for a long while, simply because the output was so nice. It was all about the lighting for me. Once the level building tools evolved a bit from the first crude Quake 1 tools, dealing with them wasn't as bad as people seem to remember. It was those tools that instilled in me the idea that making things more efficient, and looking for more efficiency, was well worth the effort.
Eventually a common approach for Unreal (probably UT) level development was to start with one large empty space and then fill it as necessary. As I remember it, anyway. Especially with the use of meshes and instances, it was simply more efficient.
My favorite editor was always DromEd, for the Dark Engine used in Thief. Designing levels with real-world units, lighting that had serious gameplay implications, and having to consider sound throughout the level made for a nice change of pace.
I recall Unreal Tournament (1999) used octrees and not BSPs. Was this changed between Unreal and UT? I recall comparisons between the Quake 3 (also 1999) and UT engines noting that the spatial model impacted the performance of certain map designs: indoor vs outdoor.
While I'm going all nostalgic: anyone remember playing UT on Linux? There were so few good Linux games back then. I think I played it until the hard drive platter was threadbare.
> While I'm going all nostalgic, anyone remember playing UT on Linux?
Not the original UT, but 2004 had a Linux version. Damn, it's been a long time since I last tried that game. Gotta get myself a real licensed copy though; back when I was younger we shared a serial key, and well, someone got the serial banned on most servers somehow...
What amazed young me was that UT2k4 was able to run, without any compilation whatsoever, on any Linux system no matter the kernel version, distribution, driver version, xorg version... Older me knows it's due to static compilation, but it's still absolutely awesome, given that the only other binary that manages the same is busybox in static compiled mode; everything else will break when crossing distribution boundaries.
It definitely still had BSPs. UT2003 had terrain rendering that seemed custom (not likely octrees, though it could have used quadtrees for all I know), and much more of the environmental detail was made up of static meshes (meshes were culled based on the BSP but didn't contribute to it), presumably because GPUs could churn through them faster. The basic BSP editing was still there though; it's kind of amazing how long they kept that unchanged.
No, UT was based on the Unreal engine, and level and data files were compatible: you could copy the Unreal files over to UT and play them with all the little enhancements the engine had gotten. No octrees in UT, but good old BSP holes.
As I sit here fighting to get 4.18 to compile, I just want to say Epic has really left the GNU/Linux community twisting in the wind, and it makes me sad.
'make CXX=g++ -k' for any other Linux UE4 users
I'm considering switching to pure Blender/Godot. Even though it's not even close to feature parity with UE4, at least the license is open source and Linux is a first-class platform.
When you enter the building of an AAA game studio - the majority of Epic's paying customers - it is filled with Windows computers, plus some macOS ones in the 3D rendering and iOS departments.
So they are paying attention to who gives them the bucks.
I tried to use UE4 on Linux and gave up. My guess: someone on the UE team supports Linux as a pet project, and they're productive enough on other tasks that nobody tells them to stop. It's pretty clear to me that they support Linux as a runtime environment but they will never put much effort into supporting development on Linux. Most game programmers are used to Visual Studio anyway.
This was the biggest takeaway from the article, for me:
> I thought to myself "Holy shit, Carmack wrote a real-time BSP editor!" What I didn't realize was that it wasn't actually real-time; there was this re-build process and all this other offline stuff. I didn't know that, and so I thought I had to create a completely real-time thing, and so I did.
It basically goes to show that nothing's impossible until someone tells you it is. Had Tim thought, "There's no way you can write a real-time BSP editor on current hardware," he would never have done it.
>So I figured out that I needed to compute the line integral from the eye to every point on the screen. I learned some calculus in college, so I said to myself "I should be able to do this." So I figured out the formula for it with some crazy complicated trigonometry. I implemented it, but it was 100 times too slow. Then I realized, "Oh wait, I can do this in the lightmap space”, because the lightmap is a discretization of geometry into bite-sized chunks. I did that with lightmaps, so it was real-time.
It was too slow to calculate lighting in real time, so a process executed at design time in Unreal (at build time in id's engines) calculated how the lights lit the level geometry and then baked those results onto the textures applied to that geometry.
It's like painting the glow of a torch onto the wall once, before playtime, rather than actually computing it every frame. Lighting doesn't typically change every frame, so it works.
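A toy sketch of that bake step (my own illustration, not Unreal's actual code): visit every lightmap texel on a surface once at build time, evaluate the light there with an ordinary N.L term and distance falloff, and store the result. At runtime the renderer only does a lookup.

    # Minimal flavour of lightmap baking: evaluate static lights once
    # per lightmap texel at build time, then just sample the stored
    # result per frame. A toy sketch, not any engine's actual pipeline.
    import math

    def bake_wall_lightmap(width, height, texel_size, light_pos, light_radius):
        """Bake one point light onto a wall lying in the z=0 plane,
        facing +z. Returns a height x width grid of brightness [0, 1]."""
        lx, ly, lz = light_pos
        lightmap = [[0.0] * width for _ in range(height)]
        for ty in range(height):
            for tx in range(width):
                # World-space position at the centre of this texel.
                wx = (tx + 0.5) * texel_size
                wy = (ty + 0.5) * texel_size
                dist = math.sqrt((wx - lx) ** 2 + (wy - ly) ** 2 + lz ** 2)
                # N.L term for a +z wall normal, plus simple linear falloff.
                n_dot_l = lz / dist if dist > 0 else 1.0
                falloff = max(1.0 - dist / light_radius, 0.0)
                lightmap[ty][tx] = max(n_dot_l, 0.0) * falloff
        return lightmap

    # The expensive loop runs once, offline. Per frame the renderer just
    # multiplies the wall texture by the stored texel - a table lookup.
    lm = bake_wall_lightmap(8, 4, texel_size=16.0,
                            light_pos=(64.0, 32.0, 48.0), light_radius=200.0)
    for row in lm:
        print(" ".join(f"{v:.2f}" for v in row))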