In my experience, whether this is top of mind has a lot more to do with what people work on and with what tools than with level of understanding. For instance, in your example:
> For example a report that took over half an hour to generate, I made a one-line change and cut the time to a few minutes
In essentially all the work I've done in my career, this would be the result of expertise in SQL and the relational model, not in data structures and algorithms. I don't recall ever working on reporting code that isn't a dumb pipe between a SQL query and a mature library for writing CSV (or parquet or whatever). Sure, there are tons of data structures and algorithms on both the database server and client side, but that's not what I'm working on.
And I think this is pretty typical for people who mostly build "applications", that expertise in tools is more of a value-add than expertise in data structures and algorithms.
But having said that, I do agree with you that everyone benefits from these kinds of "fundamentals". Not just this, but also other fundamentals like computer hardware and systems, networking, etc. I think fundamentals are very useful, while also thinking that many people are good at their jobs without them.
In my case the processing was happening in our backend. I can't remember exactly why it couldn't be SQL; actually, it's possible it could have been SQL. But changing it to SQL would have been a bigger change, and this wasn't really the task I was working on. I just happened across it while doing something else.
I have also seen and fixed similar travesties where someone iterates through a huge list making one query per element, when it was fairly trivial to rewrite it as a single SQL join.
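For illustration, a minimal sketch of that N+1 pattern and its fix (using Python's sqlite3 API; the table, column, and function names are all invented):

```python
import sqlite3

# Before: N+1 — loop over orders in the application, firing one
# query per element to fetch the matching customer.
def report_slow(conn: sqlite3.Connection):
    orders = conn.execute("SELECT id, customer_id, amount FROM orders").fetchall()
    rows = []
    for order_id, customer_id, amount in orders:  # one round trip per order
        name = conn.execute(
            "SELECT name FROM customers WHERE id = ?", (customer_id,)
        ).fetchone()[0]
        rows.append((order_id, name, amount))
    return rows

# After: a single JOIN lets the database match the rows in one round trip.
def report_fast(conn: sqlite3.Connection):
    return conn.execute(
        "SELECT o.id, c.name, o.amount"
        " FROM orders o JOIN customers c ON c.id = o.customer_id"
    ).fetchall()
```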
Point is just that understanding what you're doing is valuable, and in my mind DSA is a fundamental part of understanding what you're doing. Anyway, I think we agree :)
Unsurprisingly we've now reached the perennial "is premature optimization actually premature" of it all :)
Would it have been better for the person who originally wrote that just-iterate-the-list implementation to have been thinking about data structures and algorithms that would perform better? Opinions on this vary, but I tend to come down on the side of: Optimize for human productivity (for both the writer and the many future readers) first, then profile, then optimize any bottlenecks.
My assumption, when I come across a performance bottleneck that is easy to fix with a better data structure or algorithm, is that the person who wrote it was consciously doing a simple implementation to start, in lieu of profiling to see where the actual bottlenecks are.
But I also understand the perspective of "just do simple performance enhancements up front and you won't have to spend so much time profiling to find bottlenecks down the line". I think both philosophies are valid. (But from time to time I do come across unnecessarily complicated implementations of things in code paths that have absolutely no performance implications, and wish people wouldn't have done that.)
> Optimize for human productivity (for both the writer and the many future readers) first, then profile, then optimize any bottlenecks.
I don't agree. The problem with this approach is that there are some optimisations which require changes to how data flows through your system. These sorts of refactorings are much more difficult to do after the fact, because they change what is happening at the abstraction / system boundaries.
Personally, my approach is something like this: Optimise first for velocity, usually writing as little code as possible to get something usable on the screen. Let the code be ugly. Then show people and iterate, as you feel out what a better version of the thing you made might look like - both internally (in code) and externally (what you show to humans or to other software systems). Then rewrite it piece by piece in a way that's actually maintainable and fast (based on your requirements).
I've moved both ways along the continuum between these perspectives at different times. I don't think there is a single correct answer. I'm at a different place than you on it currently, but who knows where I'll be in a year.
Totally fair :) I have the same relationship with static typing. Right now I couldn't imagine doing serious work in a dynamically typed language, but who knows what I'll think in a year, too. From where I'm standing now, it could be ghastly.
Actually I think you're misunderstanding me. I'm not saying you should profile and optimize all the code you write, I'm saying that a basic understanding of algorithms, data structures and complexity analysis allows you to write better code without any extra tools.
I didn't profile this report to find out why it took 30+ minutes to run. I just happened across some code, read it, and saw that it was essentially two nested loops iterating through two huge lists (50-80k elements each), matching items by name. I changed it to use a dictionary instead of the inner loop, and that was that.
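In code, the change was essentially this (a minimal Python sketch; the real field and variable names differ):

```python
# Before: two nested loops matching items by name — O(n*m),
# billions of comparisons at 50-80k elements per list.
matches = []
for a in list_a:
    for b in list_b:
        if a["name"] == b["name"]:
            matches.append((a, b))

# After: index one list by name once, then do O(1) lookups — O(n + m).
# (Assumes names are unique within list_b; duplicates keep the last one.)
by_name = {b["name"]: b for b in list_b}
matches = [(a, by_name[a["name"]]) for a in list_a if a["name"] in by_name]
```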
It's a trivial change, it wouldn't have taken any longer to write this the first time around. There is no excuse for doing this, it's just a dev who doesn't understand what they're doing.
That's my point. Understanding these fundamentals allows you to avoid these types of pitfalls and understand when it's okay to write something inefficient and when it isn't.