I'll admit I'm a little surprised to see a topic like this get much attention. Have programming courses & curriculums changed so much in the past ~20 years that this isn't simply part of every introduction to the subject?
Yes, the topic of code performance as a whole is more complex than just Big O, but as its own concept it was, in my time (get off my lawn!), pretty effectively covered everywhere, and certainly came up the moment "algorithms" were discussed in any learning material.
Maybe it's just that it goes back to the common topic here on HN that there's a lot more inefficient code nowadays because faster processors & more memory help to paper over the cracks. But if something like Big O isn't being taught as one of the most primitive bits of knowledge in programming, then I can't completely blame that trend either.
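To make the "as its own concept" point concrete, here's a minimal sketch in Python (the function names and the duplicate-finding example are mine, purely for illustration): two ways of answering the same question, where Big O is exactly the vocabulary for describing why one scales and the other doesn't.

    # Two ways to answer "does this list contain any duplicates?"

    def has_duplicates_quadratic(items):
        # O(n^2): compares every pair of elements.
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def has_duplicates_linear(items):
        # O(n): one pass, using a set for constant-time membership checks.
        seen = set()
        for item in items:
            if item in seen:
                return True
            seen.add(item)
        return False

    # Same answer either way; the difference only shows up once the
    # input stops being tiny.
    data = list(range(10_000)) + [42]
    print(has_duplicates_quadratic(data), has_duplicates_linear(data))

That's the whole pitch: Big O is just shorthand for "the first one does roughly n^2 comparisons, the second roughly n."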
I think this might be for the self-taught crowd of developers who never formally took comp sci, yet who hold important positions in software development that pay as well as, if not better than, the jobs of the folks who did take comp sci.
You'd be astounded how big this self-taught cohort is and how much power it wields.
They do their jobs pretty well, and yet the basics of computer science are something they never learned.
I met a senior guy the other day who had never heard of a freaking truth table for fuck's sake.
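(For anyone else who hasn't run into the term: a truth table just lists every combination of boolean inputs next to the value of an expression. A throwaway sketch in Python, nothing more than an illustration:)

    from itertools import product

    # Enumerate every combination of two boolean inputs and show a few
    # basic expressions built from them.
    print("p      q      p AND q  p OR q   p -> q")
    for p, q in product([False, True], repeat=2):
        implies = (not p) or q  # material implication
        print(f"{str(p):<7}{str(q):<7}{str(p and q):<9}{str(p or q):<9}{implies}")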
I was dropped into the tech lead position at my job last year; I started in game design at art college, so having to run and gun has been fascinating. I'd be lying if I said I wasn't scared reading about much of the stuff in this comment section that really should be bread and butter.
I'm talking to my boss to see if there's some kind of training program I can pick up on the side to help me cover what should be the basics I've missed out on, although we're so overloaded that finding the time and money is challenging. I'm lucky it's mostly CRUD, but I can't help but worry that every architecture decision I'm making is going to cost us massively down the road.
A few people have made incorrect mathematical statements in this discussion, so having a math teacher might be useful when learning this stuff. My unsolicited advice: take your time and learn some mathematics you might enjoy. Trust simple mathematical definitions over long-winded "explained as easily as possible" essays.
Mathematicians congratulate each other for simple, elegant definitions (sometimes developed over decades) which make deriving results easy. If you don't understand a definition which requires only a few words, learn some of the background instead of doing 10000 Google searches for the "easiest" explanation.
Here's an example. In physics, a vector is something with a "magnitude" and "direction" and we associate feelings and intuition with this. In mathematics, a vector (in 3-dimensional space) is simply "an ordered triple of real numbers". Many people might find this definition unsatisfying, but it is simple, precise, and lots of USEFUL mathematics is created from it.
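To make that concrete in programmer terms (the code below is my own sketch, not anything standard): the physicist's "magnitude" and "direction" can be derived straight from the ordered triple.

    from math import sqrt

    # A vector in 3-dimensional space: simply an ordered triple of real numbers.
    Vector3 = tuple[float, float, float]

    def magnitude(v: Vector3) -> float:
        # The physicist's "magnitude" is computed directly from the triple.
        x, y, z = v
        return sqrt(x * x + y * y + z * z)

    def direction(v: Vector3) -> Vector3:
        # The "direction" is the same triple scaled to unit length.
        m = magnitude(v)
        x, y, z = v
        return (x / m, y / m, z / m)

    v: Vector3 = (3.0, 4.0, 0.0)
    print(magnitude(v))   # 5.0
    print(direction(v))   # (0.6, 0.8, 0.0)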