I don't think that's exactly true. It's not the same thing to write a piece of code as part of a whole system, vs rewriting that piece of code as an isolated part.
> The 100x slower renderer was also written by one person (or at most few).
The difference between one person and a few is massive in my experience.
> It really is the difference between programmers.
> It's also a good example that 10x programmers do exist, at least in certain situations.
> The Microsoft programmers are not dumb. They certainly are not cheap.
> And yet here we have one guy who can write code that is 10-100x faster, in just few days.
I think it's also a difference in incentives. Microsoft programmers aren't there to optimize for performance but to ship features, performance being just one of them in some cases. For example, search in File Explorer is still painfully slow, probably because the team has other things to do. I'd guess many people could make it faster in a few days, but those people aren't inside Microsoft being pressured to do something else.
Software made to demonstrate how fast something can be is written in a very different environment, with very different incentives, compared to almost all commercial software. For pretty much anything out there, I think someone could take it, isolate it and make it run 10-100x faster. The thing is, in most organizations you don't have that kind of time.
It's funny because Casey made this video partly to push back on people like you who come up with limitless excuses for why slow code is "actually the right way to do it".
I guess it didn't work.
Are you really arguing that the code is 10-100x slower because it was written by 2 people and not one?
And if the reason we write slow code is to save programmer time, as your other argument goes, why is it ok to put 2 people to write code that can be written by 1 guy?
That excuse doesn't make it better, it just makes Microsoft look doubly bad. Not only can they not write fast code, they spend twice as much effort doing so!
> It's funny because Casey made this video partly to push back on people like you who come up with limitless excuses for why slow code is "actually the right way to do it".
I'm not saying it's "actually the right way to do it", I'm saying that it's how it's done in real life. That kind of argument reminds me a lot of Robert Martin, who is always saying that the fix for software is for everyone to be more disciplined. It's true, but it's also absolutely useless. It's like saying "people should be less mean to each other". Yes, they should; we've been saying that for thousands of years. Does the advice come with anything? A new tool to show how fast your code can be? It's nice to show people how fast code can go, especially since the students didn't seem to think it was possible. But then what?
> And if the reason we write slow code is to save programmer time, as your other argument goes, why is it ok to put 2 people to write code that can be written by 1 guy?
> It's not the same thing to write a piece of code as part of a whole system, vs rewriting that piece of code as an isolated part.
But, AFAIK, the new Terminal was new. Sure, it's 3 years old already, but Microsoft didn't yet have the hypothetical "this is part of an old system" constraint when they started writing it.
> Microsoft programmers are not here to optimize for performance, but to push features, performance being one of them in some cases
Microsoft's marketing copy for the new Terminal cites "fast, efficient" in the first paragraph. It also cites GPU optimisation. Clearly performance was a goal.
Also, as has been discussed ad nauseam in other HN threads, Casey added some extra features that didn't exist in MS's new Terminal. The "busy doing features" excuse doesn't apply either.
> The guy in the video called what you're doing now "the excuse parade".
> Instead of finding solutions, your time and energy goes into making excuses.
Excuses for what, exactly? I don't remember writing that code. I'm explaining the incentives that lead to slow code. I usually try to write the best code I can in the time I have, but if one day the problem is that my code is too slow, I won't be making excuses: I chose at the time not to spend much time on performance in order to ship faster, and that was a conscious choice on my part.
The "excuse parade" mentioned by the grandparent poster is another name for what psychology calls "rationalisation".
Microsoft fucked up on performance, period. The reason is not that the Terminal provided too many features, or that they wanted the code to be "readable". The manager didn't even claim the team lacked the time to do it; he flat out said it would be impossible without a few years of research.
Time and time again we see this kind of thing happening in software, and people jump to made-up rationalisations because they're in denial about the root cause of the issues.
I think there's a difference between most software being slow because it's not a priority, and some specific parts like that terminal being slow because people have absolutely no clue and are in denial.
But the fact that higher performance is not a priority is something that is held together by those excuses and rationalisations.
People complain about performance all the time, both users and people inside their teams, but the thinking you're espousing is so widespread that developers clamouring for optimisations are just shut down.
End-users not having to wait 5 or 10 seconds unnecessarily can be a boon in productivity in some industries that use Enterprise Software. But we can't rely on having Casey Muratori going to a supermarket whose POS is slow and rewriting the software on a weekend and publicly shaming the supermarket chain on Twitter. Change has to come from within. Even accepting reports that the software is slow would be a good start.
The problem is not that the inefficiency exists, or even that fixes can take too long. The problem is that no team ever really stops to ask "is this inefficient? How long would it take to optimise? What would we gain from that?".
Instead of asking and researching, people just do as you do: they rationalise by saying "I can't prove that it can't be faster and I can't prove that it's not costing us or our users money, but it is my belief and no proof will make me change it".
This is an industry-wide problem. It's anti-scientific posturing that leads to widespread software slowness, planned obsolescence and excessive spending on hardware.
> Instead of asking and researching, people just do as you do: they rationalise by saying "I can't prove that it can't be faster and I can't prove that it's not costing us or our users money, but it is my belief and no proof will make me change it".
How exactly do you get that from what I said? That's a strawman of my position. I've been really clear about it multiple times: our users ask for new features more than they ask for better performance. That's it. That's the beginning and the end of the issue at my job. Sometimes a specific thing is too slow for users; they ask us to make it faster, and we make it faster. Most of the time, they ask for new features. We add them, while trying to write code that's reasonably fast, maintainable, understandable, localised, and that does what the users want.
> End-users not having to wait 5 or 10 seconds unnecessarily can be a boon in productivity in some industries that use Enterprise Software. But we can't rely on having Casey Muratori going to a supermarket whose POS is slow and rewriting the software on a weekend and publicly shaming the supermarket chain on Twitter. Change has to come from within. Even accepting reports that the software is slow would be a good start.
You seem to think that I'm working at Microsoft and am one of the people that told Casey that he was wrong. I'm not. My point is that most people are like me, try to do things well but have to balance many incentives. And a few people, like the ones Casey interacted with, are just plain wrong. But these people aren't the majority and aren't the sole reason software is slow.
I've said it before and I'll say it again: having to wait 5-10 seconds is tolerated because exporting the data and doing the thing in Excel yourself would be slower. Imagine a task that takes 5 minutes manually, 10 seconds unoptimized and 1 microsecond optimized. It's a shame that in most of these cases the software will take 10 seconds, but that's still a huge improvement over the 5 minutes of doing it manually. Even if the software isn't the best, it still has huge value. My point is that most of the time, at least in my industry, users want more tasks going from 5 minutes to 10 seconds than tasks going from 10 seconds to 1 microsecond.
Now, if one day our users want a task to go from 10 seconds to 1 microsecond, or even 1 second, and we tell them it's impossible and would require a PhD, we are wrong, period. That would be an expression of our limitations as programmers; I completely agree with you on that point. However, if we tell them "we would love to do this, but you're a minority in wanting it, so it wouldn't make business sense", I think we are being honest and doing our jobs. There's a chance we're wrong, and that focusing on performance would bring far more business value than we think. If that turns out to be the case, we should revise upward the business value we assign to similar cases. But outside of that, I think we operate rationally and in good faith.
> But the fact that higher performance is not a priority is something that is held together by those excuses and rationalisations.
> People complain about performance all the time, both users and people inside their teams, but the thinking you're espousing is so widespread that developers clamouring for optimisations are just shut down.
They do, but they ask for new features even more. Honestly, I would love my job more if most of my work were optimisation instead of new features. I'm not in love with the software we develop, and I find performance work more interesting. But our software saves our users a lot of time, and continues to do so through new features, so we develop new features. I try to be sensitive to performance: avoiding O(n²) for an array intersection, avoiding running the same loop three times in a row, that kind of thing. But I don't have all the time I'd want to dedicate to it. I also don't control everything; some parts are owned by other teams that are a bit territorial, and since their code isn't well tested, it's hard to go in and make changes.
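To make the array-intersection example concrete, here is a minimal sketch (in Python, with hypothetical function names) of the kind of near-free win I mean: the two versions return the same result, but one does a linear scan per element while the other uses a set:

```python
# Naive version: O(n*m). For each element of `a`, `x in b` scans
# the whole list `b`.
def intersect_naive(a, b):
    return [x for x in a if x in b]

# Set-based version: O(n+m) on average. Building a set once makes
# each membership check roughly constant time.
def intersect_fast(a, b):
    b_set = set(b)
    return [x for x in a if x in b_set]
```

The change is a one-liner and preserves element order from `a`, which is why I count it as sensitivity to performance rather than an optimisation project.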
All of that to say: you seem to see bad faith everywhere by focusing on a few bad examples, while most people are actually trying to build software that isn't terrible, but don't really have the time to do better.
Nobody here is saying you're personally responsible for slow software, or that you're personally pushing for software to be slow. But I am saying that rationalisations about slow software are partially to blame. Preconceived assumptions keep being proven incorrect, yet developers keep repeating them.
I also never assumed anywhere you work at Microsoft or that you blamed Casey for anything, I'm just using his case as an example. The gist of that sentence is that software can't be made faster if the only way to get that to happen is via public shaming, like Casey did.
About having to wait for users to ask for speed increases: I feel this is a bit of a strawman in itself, because even when users complain, it's rare that product managers follow through.
But even discounting that: users not asking doesn't mean the software can't be made faster, that speed wouldn't be a competitive advantage, or that users even know faster is possible.
Neither features nor optimisations should be driven by user votes or a manager's hunch. Teams should look at the data, analyse usage and the market, predict outcomes and estimate difficulty. And implementation should be iterative. Software made by continuously slapping spaghetti against the wall is the #1 cause of teams being too busy.
Also, about the cost of optimisation: it's (more often than not) nowhere near as big as people assume. Casey demonstrated that, and other cases have too. And no, it's not something only super-programmers can do.
> But I'm saying that rationalisations about slow software are partially to blame. Preconceived assumptions keep being proven as incorrect, but developers keep repeating them.
As I said, I think that it's important to show people how fast software can be. Once it's done, either you agree that your software could be faster, or you're acting in bad faith. I think we both agree on that point.
> About having to wait for users asking for speed increases: I feel like this is a bit of a strawman in itself, because even when users complain, it's rare that product managers follow trough.
That's how it works at my company. I can't really speak for how everyone else works, but it would make sense to act this way.
> But even discounting that: users not asking doesn't mean that software can't be better made faster, that it won't be a competitive advantage, or that users even know it's possible.
> Both features and optimisations should not be driven by user votes or managers hunch. Teams should see the data, analyse usage, the market, predict outcomes, predict how difficult. And implementation should be iterative. Software made by continuously slapping spaghetti against the wall is the #1 cause of teams too busy.
I mean, sure, but most people aren't in a position to decide everything they work on. Most developers do what managers/product owners tell them to do. Asking developers to say no to their managers and work on something else is easy to say, but it will have zero consequences in the real world, because that's not something most people can actually do.
> Also, about the cost of optimisation: it's (more often than not) nowhere near as big as people assume. It was demonstrated by Casey and in other cases.
It was demonstrated by Casey in one specific case. I assume most software is more complex than that. At work we have lots of moving parts and no single hot path, which makes it hard to do a big optimisation like the one he did. It's not as if we have one process taking 80% of the CPU. Again, that doesn't mean it's impossible; it would just take time. Time that we can't really take.
> And no, it's also not something only super programmers can do.
I never said that. I think everybody can optimize code. You need some basic knowledge like "use a profiler instead of relying only on intuition" and "use good metrics", plus ideas like "I could use a different hash here", "SIMD would make sense there", or "my problem seems to be a union-find-type problem; where can I find an optimal algorithm for it?". But then the final limiter seems to be time.
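The union-find case illustrates the point nicely: once you recognise the problem shape, the near-optimal structure is short enough to write from memory. A minimal sketch in Python (path compression plus union by rank, the textbook combination):

```python
# Minimal disjoint-set (union-find) with path halving and union by rank.
# With both optimisations, a sequence of operations runs in near-linear
# time -- no PhD required.
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.rank = [0] * n           # upper bound on tree height

    def find(self, x):
        # Path halving: point every other node at its grandparent.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        # Attach the shallower tree under the deeper one.
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
```

Knowing that this structure exists, and roughly what it costs, is exactly the kind of basic knowledge I mean; the rest is finding the time to apply it.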
> The gist of that sentence is that software can't be made faster if the only way to get that to happen is via public shaming, like Casey did.
Maybe public shaming isn't the right word, but "making noise" is important. If our clients don't ask for speed, we'll continue delivering the features other clients do ask for (as long as that's reasonable). I think we should push back against people like the ones who told Casey you need a PhD for this. But we should also push back against bad incentives. Maybe that person said a PhD was needed because there's no room for failure, ignorance or error on their team, and claiming you need a PhD is the only "escape", if that makes sense. I don't think I'm a bad programmer, but in almost every domain there are people who know a lot more than me. An important part of programming is having the humility to accept that some people will do far better than you. But people at big companies often seem to react as if they're gambling with their jobs if they admit someone else did better. Being able to accept a better solution is also an important skill for a developer, I think. So we should ask developers to do better, and shame the companies that don't let their developers accept better solutions from outside.
> "Asking the developers to say no to the managers"
I never said we should. "Teams" includes everyone, and engineers do have the clout to push for things.
> "I never said that"
I never said or implied you did, it was a general statement.
> "It would just take time. Time that we can't really take."
> "If our clients don't ask for speed"
Here's what I've been saying: not taking the time to estimate how long it would really take (or even whether it's needed) is nowhere near as bad as saying "a PhD is needed for that", but it is still bad. Acting only reactively is bad too. Maybe your software is already good enough, but not knowing is an issue in itself.