
Is it just me, or does the work logged by Carmack for a single day seem like a plan for a two-week sprint in Scrum, or even a full (3-month) Program Increment in the Scaled Agile Framework (SAFe)?



- No hour a day (conservatively…) lost to meetings and other activities that exist to generate data for managers to make pretty graphs out of.

- No hour a day lost on PR reviews (reviewing and being reviewed).

- No hour a day pairing on someone else’s task.

- No two hours lost because some part of your hellishly-complex stack and set of alpha-quality vendor tools decided to shit itself today for no reason.

- No hour babysitting yesterday’s code through CI, testing, etc.

- No hour a day lost to context-switching from the above and a dozen other distractions.

- Oh look there’s only an hour left in which to actually produce code in one eight-hour day.


I’m an architect at my company but we’re in a bind with a particular project that I had started a couple years ago, so they temporarily assigned me to basically one-man it over the finish line.

Aside from the stress of the deadline and knowing that I’m going to get stuck being the expert on this project forever, it’s been pretty nice (and productive) just working in peace and quiet with no bullshit scrum ceremonies and navel-gazing as distractions. It actually reminded me of why I like programming, and that what I hate about my career is that organizations spend so much time keeping you from actually doing it, for some reason.


> No hour a day (conservatively…) lost to meetings and other activities that exist to generate data for managers to make pretty graphs out of.

IIRC even in the Doom 3 and probably also Rage days, Carmack mentioned that they didn't really do any meetings; if they needed to, they just discussed things in the kitchen (or something like that).

It did help that the team was small though (IIRC Doom 3 was made by ~25 people total).


>- Oh look there’s only an hour left in which to actually produce code in one eight-hour day.

Usually around 7pm, once everybody's left for the day.


Well, it is Carmack we're talking about here. He's well known as a prolific programmer prodigy ;)

But also, some other things to note:

* A lot of "agile" development in corporate environments is anything but agile because of overhead in horizontally scaling human gray meat (until we get neural interfacing between one another or something)

* It was a simpler time back then. Carmack was coding against a much simpler architecture, with significantly fewer variants.

* It was a simpler time back then. Carmack could focus on blitting pixels to the screen as fast as possible, rather than spending 6 months trying to wrap his head around Vulkan.

* It was a simpler time back then. Carmack didn't have to worry about building for Windows, macOS and Linux, and iOS. And Android. And ...

* It was a simpler time back then. Carmack didn't need to worry about accessibility requirements. Web service integrations. Digital distribution complexities. etc...

Even in the modern day there's still people who get prodigious amounts of work done when they can focus on doing something they like doing, and the stars align. A good recent example off the top of my head in game development is The Witness. Jonathan Blow + 2-3 other programmers IIRC.


He was building for at least Windows and Linux, and didn't have OpenGL, so he was doing all the 3D manually. Plus assembly code, hardware-specific versions like the Verite, he had to handle all the raw networking code, wrote tools to work with assets and process levels, also wrote QuakeC, encryption to unlock the full game on the shareware CD...

Sure, Cash, Abrash and Romero were helping out.


Everyone serious about gamedev back then would be well versed in assembly language and the like. It's unthinkable nowadays but it was normal for graphics programmers in the '80s and early-to-mid '90s.

You can do it too if you follow his path. I mean by programming the Apple ][ and then the IBM PC in assembly.


Quake 3, I do believe, had its own shader language so the designers could have more control over visual effects.

https://icculus.org/gtkradiant/documentation/Q3AShader_Manua...

The dude was definitely doing stuff no one else had ever done. Then again I have a 90's game dev book that covers ASM as well as manufacturing then programming your own sound card using the parallel port so... you're not wrong.


Yeah, I agree. He was definitely doing new stuff no one else was doing. I don't really think it's realistic for any programmer to set such a high objective.


Abrash did most of the graphics related assembly IIRC


> He was building for at least Windows and Linux

Are you sure about that? I remember in Doom 3 era Carmack was rocking a fancy NeXTStep computer and doing cross platform development. But before that I'm pretty sure everything was a strictly Windows affair...

Wikipedia seems to suggest in the first line that it was originally released for Linux. But in the "Ports" section it mentions that:

> The first port to be completed was the Linux port Quake 0.91 by id Software employee Dave D. Taylor using X11 on July 5, 1996

And on that topic:

> In late 1996, id Software released VQuake, a source port of the Quake engine to support hardware accelerated rendering on graphics cards using the Rendition Vérité chipset.

> and didn't have OpenGL so he was doing all the 3D manually

So in other words he could focus on the fundamentals of doing vector math and rasterizing pixels, rather than worrying about 600 different incompatible OpenGL extensions, pipeline stalls, and memory management bugs?
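To be concrete about what "doing it manually" amounts to at its core, here's a minimal Python sketch of the fundamentals: perspective-project a camera-space point and plot it into a flat framebuffer. Every name and constant is invented for illustration; none of this is id's code.

```python
# Illustrative sketch only: the heart of a software renderer is projecting
# points and writing pixels. Resolution matches VGA mode 13h; the focal
# length is an arbitrary choice for this example.
WIDTH, HEIGHT = 320, 200
FOCAL = 160.0

def project(x, y, z):
    """Perspective-project a camera-space point (z > 0) to screen coords."""
    sx = int(WIDTH / 2 + FOCAL * x / z)
    sy = int(HEIGHT / 2 - FOCAL * y / z)   # screen y grows downward
    return sx, sy

framebuffer = bytearray(WIDTH * HEIGHT)    # one palette index per pixel

def put_pixel(sx, sy, color):
    """Clip to the screen and write a single palette index."""
    if 0 <= sx < WIDTH and 0 <= sy < HEIGHT:
        framebuffer[sy * WIDTH + sx] = color

put_pixel(*project(1.0, 0.5, 4.0), 15)     # lands at (200, 80)
```

No driver, no extension queries, no pipeline state: divide by z, write a byte.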

> Plus assembly code...

Assembly isn't that hard. Pokemon Red/Blue was written entirely in assembly.

> had to handle all the raw networking code

Again, not that hard when all you're doing is blasting out UDP packets on a very simple network. You didn't have to worry about double NAT, and people didn't expect your game to work for diverse peers connected from Germany to Belgium.
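The "blasting out UDP packets" model really is that small. A hedged loopback sketch in Python; the 4-byte-tick datagram format, port choice and names are all invented for illustration, not Quake's actual protocol:

```python
# Sketch of fire-and-forget UDP: no handshake, no NAT traversal, just
# datagrams sent to a known address. The packet format here is made up.
import socket

def make_snapshot(tick, payload):
    """Pack a trivial game-state datagram: 4-byte tick + raw payload."""
    return tick.to_bytes(4, "big") + payload

# Loopback demo: one socket receives what another sends.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))                 # OS picks a free port
recv_sock.settimeout(2.0)

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(make_snapshot(42, b"player_pos"), recv_sock.getsockname())

data, _ = recv_sock.recvfrom(1024)
tick, payload = int.from_bytes(data[:4], "big"), data[4:]
send_sock.close(); recv_sock.close()
```

If a packet drops, you just send the next snapshot; that's the whole error-handling story in this model.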

> wrote tools to work with assets and process levels

I think Romero wrote a lot of the tooling. Side note - have you read Masters of Doom? It goes into a lot of this stuff and is a great read.

> encryption to unlock the full game on the shareware

I couldn't find anything online, but I doubt this was any more than some xor + rot13; it's not like Carmack was also Daniel J. Bernstein in disguise :) Back then the state of computer security was, well, rather nascent.
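To illustrate how trivial that style of scheme is; this is a guess at the genre, explicitly not the real Quake unlock code, and the key and plaintext are invented:

```python
# Toy repeating-key XOR: the kind of lightweight scheme the comment is
# guessing at, NOT the actual Quake unlock algorithm.
def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR each byte against a repeating key; applying it twice decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret = b"registered=true"
blob = xor_crypt(secret, b"idkey")               # "encrypt"
assert xor_crypt(blob, b"idkey") == secret       # same call decrypts
```

A dozen lines, symmetric, and crackable in an afternoon — which matches how fast the real scheme fell, per the sibling comment.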

I hope this comment doesn't come across as contrarian. I also hope it doesn't sound like I'm trying to diminish the achievements of Carmack. He's a role model for me personally as a programmer. And he and his mates spawned an entire game genre that I have enjoyed for ... an amount of time I'd rather not disclose or think too much about!

The original point I was trying to make is thus: computers, and computing, have steadily become more and more powerful, which results in more and more complexity, and progressively more unwieldy abstractions to deal with all that complexity. Back when Carmack was cutting his teeth, computers were still fairly early on that complexity curve.


> Are you sure about that? I remember in Doom 3 era Carmack was rocking a fancy NeXTStep computer and doing cross platform development. But before that I'm pretty sure everything was a strictly Windows affair...

By Doom 3 NeXTSteps were way obsolete and the company had already been bought by Apple long before. He did show the Doom 3 tech first on a Mac at a WWDC though (and later did the same for Rage).

NeXTStep was used for Doom 1 and Quake 1 but by Quake 2 they had switched to Windows NT based computers (with a second Win95 computer for testing - and some artists still used DOS-based software like Deluxe Paint).

> I think Romero wrote a lot of the tooling.

AFAIK Romero wrote the editor for Doom and previous games, but Carmack wrote most of the editor for Quake (the whole "brush" idea was Carmack's). Romero has mentioned a few times that he wasn't happy with the editor's usability. AFAIK by that time Romero spent more time working on the levels (which were more time-consuming to make than the Doom ones) and the QuakeC scripts.

> have you read Masters of Doom? It goes into a lot of this stuff and is a great read.

Indeed, I have read MoD; it is a neat book. One of these days I want to read Romero's "Life in First Person" too.

> I couldn't find anything online but I doubt this was anymore more than some xor + rot13

It was a little more involved than that, there is a blogpost series[0] about it from someone who tried to reverse engineer and reimplement the original qcrack (which calculated a decryption key). The original qcrack was made almost instantly though.

IIRC from Masters of Doom, Carmack didn't really like the idea (most likely he expected people to crack it quickly).

[0] https://faehnri.ch/finished-with-qcrack/


To be fair we don't have 99% of those problems and still manage to deliver fuck all.


I like the cut of your jib


Peter : Well, I generally come in at least fifteen minutes late, ah, I use the side door - that way Lumbergh can't see me, heh heh - and, uh, after that I just sorta space out for about an hour. (...) I just stare at my desk; but it looks like I'm working. I do that for probably another hour after lunch, too. I'd say in a given week I probably only do about fifteen minutes of real, actual, work.

(...)

Bob : What if - and believe me this is a hypothetical - but what if you were offered some kind of a stock option equity sharing program. Would that do anything for you?

Peter : I don't know, I guess. Listen, I'm gonna go. It's been really nice talking to both of you guys.

Bob : Absolutely, the pleasure's all on this side of the table, trust me.

Peter : Good luck with your layoffs, all right? I hope your firings go really well.


I am not sure I would consider much of what he did very simple (or easy) compared to what most of us are doing today. The last few chapters of Michael Abrash's Black Book are about his work on Quake (he wrote some of the graphics code together with Carmack), and it is pretty hardcore, low-level, advanced stuff they were doing. Remember, they were software-rendering everything in the first version.

https://github.com/neonkingfr/AbrashBlackBook

And also they pretty soon supported MS-DOS, Windows 95, and Linux (and possibly some more platforms?). In addition to supporting software rendering, 3Dfx, OpenGL, and possibly some more 3D APIs.


Didn't he code on Solaris or something weird in the workstation family and port Quake over to DOS? I only remember him having a giant CRT monitor where he'd sit and code, from photos back in the 90s.


It was NeXTSTEP, and that was for Doom. I think they switched to Windows NT for Quake (or was that Quake 2?), which is what he was working on in that pic with the giant CRT.


The switch to Windows NT happened with Quake 2, the original Quake was still NeXTSTEP.


I think it’s more so that when you are an expert in a particular system and you have very little bureaucracy/overhead in any particular task you’re able to get a lot more done.

I was watching a video from one of the creators of Fallout and he was talking about how things that used to take a day now take 2 weeks for similar sort of reasons: https://youtu.be/LMVQ30c7TcA?si=eJm_u-i1xwttfcRL

Our industry in general has added a lot of overhead and bureaucracy and does everything overly cautiously in comparison to back then. Things take exponentially longer to do in software development today. You can also observe this in how long it takes a startup to build a feature vs a FANG company.


I'm not sure it's fair to say that the industry has added overhead versus the 90s. Unless you're talking about gamedev, which I can't comment about. But it's hard to say that tech in general is slower moving as an industry than thirty years ago; I think there's more nuance to it than that. What exactly that nuance is I'm not completely sure, but here are some counterexamples:

Intel shifted the entire direction of its company away from memory and into microprocessors as early as the 1980s, in response to smaller, allegedly faster moving Japanese competitors.

IBM has been a slow-moving behemoth for decades; the story of how DOS began is a classic David vs Goliath tale of the industry.

Apple debatably rose to prominence on the back of tech lifted from Xerox, which for one reason or another (maybe they were too confident in their fax machines) didn't ship their own tech.

Google was at one point a fast-moving startup, up against slow-moving competitors like AltaVista, arguably Yahoo, MSN.

Amazon's original web stack was written in C, iirc. I personally would have a hard time arguing that things done today in web development could be done faster in C.


It’s not tech in general that’s slower moving, but rather that software development is slower moving today (in my opinion).

To launch a minor feature at a modern tech company you need to:

- pitch your idea to your manager and get alignment

- get approval from your skip manager and buy-in from your product managers

- align your idea with your partner team’s roadmap

- go through a design review with your team

- go through a design review with your principal engineers

- write a task breakdown

- actually write the code

  - write tests for the code

  - write integration tests for the code

  - fix build issues

  - push multiple code reviews and respond to comments

  - whoops, you need to rebase, and that shared component you used just changed its interface yesterday, breaking your tests

  - your integration tests failed in preprod, blocking your coworker’s launch; better fix that

- a couple rounds of AppSec, UX and PM review

- time to coordinate the launch of your feature toggle with 6 partner teams

- oops there was a minor bug you need to roll back your feature and resolve it overnight

How much of this overhead existed in the 90s?

If a single lone genius like Ken Thompson or John Carmack wants to write a new feature or product and get it out to customers they have 10x+ the work today (at a modern tech company).


What about a "modern tech company" with only one (or maybe two) workers?


There are some sweet spots to be found in software development productivity.

Finding them is an art/craft, and very sensitive to the project/company context, the people you have available, how the work will be distributed, etc. It includes making process lightweight by default, and having a good sense of when to use something non-lightweight.

In the right circumstances, a few top "programmers" with great team software engineering sense, and someone representing product sense, unencumbered by BS, can usually wipe the floor with most contemporary much-larger teams.

But starting off a startup by saying "We'll do Leetcode rituals so we know we'll hire aspiring techbros who spent their spare time at Stanford memorizing those, and we'll (pretend to) use this fashionable branded lifecycle process designed for companies that have no idea what they're doing but have a lot of warm bodies to fumble blindly through it, and ignore unfashionable process tools that we were confidently told don't work by students with zero experience based on what a professor with zero experience told them, and this huge list of popular tech we heard of and will never understand will be our stack, and then we can show promise to get to the next funding round and hire even more warm bodies..." is one way to make the road to finding effective expertise much longer. :)


The point of SAFe is so that remote workers can be as productive as Carmack on their own side gigs during the ceremonial duties and meetings. I rather like it!


It depends on your programmers. At $JOB, we had one programmer who was absolutely slaying. We would plan a sprint, and two days later more than half of the board was done and in review, mostly done by him. When I joined, I did a similar thing, until I realized that it's no fun to just chew through all the tasks for the team and then only do reviews for the rest of the sprint. So now I mix reviews and writing code, which managers like to see anyway, and progress is an order of magnitude slower. Setting up a test env, context-switching to a problem, code review, manual testing, writing comments, etc. usually takes a lot more time than solving the same issue would.

I think it definitely depends on the programmers, though. I've seen people take a week for what, for maybe another dev, would probably take a day.

One of the ideas of scrum and sprints is that everyone does a bit of everything, right? And that is long-term good, short term a bit of a hindrance.

The guy who could do that task in one day has a lot of knowledge about that part of the code, and if he always does that, and then leaves, there's a knowledge gap. If instead we let someone else spend a week on the task, because they have to figure out how everything works and is structured, then we have two people who have this knowledge at the end.

In my experience, there are also big skill gaps between people, and some are just much, much faster than others. Carmack is likely on the higher end


I reacted to how his way of working sounded actually agile (vs some method sold as agile but mostly consisting of a lot of processes and automatic tools to keep everyone from having any hope of being agile), like:

> When I accomplish something, I write a * line that day.

> Whenever a bug / missing feature is mentioned during the day and I don't fix it, I make a note of it. Some things get noted many times before they get fixed.

> Occasionally I go back through the old notes and mark with a + the things I have since fixed.
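That convention is simple enough to tally mechanically. A rough sketch of reading such notes, following the markers the quote describes (the sample entries are invented; this is not Carmack's actual tooling):

```python
# Tally a Carmack-style .plan file per the quoted convention:
# "*" marks work done that day, "+" marks an old note since fixed,
# unprefixed lines are still-open notes. Sample entries are invented.
def tally_plan(lines):
    counts = {"done": 0, "fixed": 0, "open": 0}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith("*"):
            counts["done"] += 1
        elif line.startswith("+"):
            counts["fixed"] += 1
        else:
            counts["open"] += 1
    return counts

sample = [
    "* pcx loading",
    "+ fixed sound channel clipping",
    "weapon switch feels laggy",
]
print(tally_plan(sample))   # {'done': 1, 'fixed': 1, 'open': 1}
```

The whole "process" is a text file and three sigils — no board, no tickets, no ceremonies.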


Move fast, gib stuff.


I can complete maybe half the tickets of a sprint in one day, if I get everything I need and have a full day without disturbances.

Of course it's not going to be as difficult as some of his tasks.


Is the code a mess? From the notes it looks like the code is a mess, but don't want to assume.



Indeed, and of course for a full team of developers, not a single person.

Carmack is a god.



