I think the reason you don't see this kind of course offered is that it primarily concerns training, not education.
Imagine if your training course 25 years ago focused on the Turbo C IDE and that's all the University offered. You would be amazingly proficient at Turbo C and know all its keyboard shortcuts but that wouldn't be too relevant in today's market.
Keeping such training material up to date with the latest trends is exhausting work and rarely worth maintaining, especially given how difficult it is to predict what the next tool du jour will be. Contrast this with more timeless educational topics and it becomes clearer why this sort of thing is explicitly not taught.
This is one of the issues that I have with how CS and software engineering is taught in universities.
Yes, you could look at this as "vocational training" and not "education". But you could also look at this as education with a hands-on component using commonly available tools.
Sure, you could have students memorize a list of git commands. That would be terrible. But you could also teach students to understand how a distributed VCS works, the graph operations involved in commands like "checkout", "merge", and "push", and the underlying way of thinking about data and how it changes. That both provides students with an extremely useful practical skill and sets them up to understand much more complicated topics like CRDTs and distributed data synchronization. And when the tool that was used gets replaced, so what? If the concepts used by the tool were taught correctly, then picking up a new one should be trivial. I've been through countless IDEs and debuggers and version control systems in my career, but the reason I can do that is because at some point I learned the first one of each and was then able to transition that knowledge along the way.
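To make the "graph operations" point concrete, here's a rough sketch of the idea you'd want students to walk away with. This is not git's actual implementation or API; the Commit type and merge_base function are purely illustrative, and real git is far more careful about picking a merge base. The point is that "merge" is, at its core, an ancestor computation on a DAG of commits:

    # Illustrative only, assuming nothing about git internals: commits form a
    # DAG, and finding the merge base for a three-way merge is just a
    # graph-ancestor computation.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Commit:
        name: str
        parents: tuple = ()  # zero, one, or two parent commits

    def ancestors(c):
        """All commits reachable from c, including c itself."""
        seen, stack = set(), [c]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(node.parents)
        return seen

    def merge_base(a, b):
        """A common ancestor of a and b: the starting point of a three-way merge."""
        common = ancestors(a) & ancestors(b)
        # Pick a "deepest" common ancestor; real git handles ties and criss-cross
        # merges much more carefully.
        return max(common, key=lambda c: len(ancestors(c)), default=None)

    # main:  root -- m1 -- m2
    # topic:          \--- t1
    root = Commit("root")
    m1 = Commit("m1", (root,))
    m2 = Commit("m2", (m1,))
    t1 = Commit("t1", (m1,))
    print(merge_base(m2, t1).name)  # -> m1

A student who understands that picture can pick up Mercurial, Fossil, or whatever replaces git without relearning anything fundamental.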
"Training" gets looked down in a way that seems actively harmful to me. Just because a tool might not be relevant in the future doesn't mean there's not a huge amount of value that can be gained from learning it today.
I absolutely agree. If we in the industry are expected to constantly remain up to date on new technologies, it seems reasonable to expect our CS educators to do the same. Realistically, these tools change infrequently enough that courses which teach them will likely remain relevant for at least the first few years after students enter industry. I've had many courses in which my peers didn't understand git even as seniors, because it's not properly taught at my school. If you enter industry without basic git knowledge you're going to have a bad time, and one of the main points of education is to prepare you to enter the workforce.
As someone whose intro to programming class WAS built on Turbo C almost thirty years ago, let me tell you how it was:
One day's worth of handouts told you about the IDE.
Another handout was tips for debugging, including common compiler errors and what they meant.
That was it! The rest you learned in the computer cluster, with the help of your dormmate, TA, or computer cluster tutor. The course itself, from then on out, focused (quite rightly IMO) on programming concepts, structures, abstractions, and basic algorithms.
The main computer clusters on campus also had a whole panoply of brochures on how to use Unix, as well as various editors/tools. They were for use by all students, not just the CS students.
In short: it should not take a massive investment of class prep and time to teach students the practicalia they will need to know. Save it for the labs. Don't make a semester course out of it. Include the whole student body as your audience by publishing self-help materials for all.
There was a special set of libraries the class provided that wrapped the standard libraries and made sure that stuff like that was well-defined. (Well, as well-defined as it gets in C anyway.)
FWIW, Carnegie Mellon runs a "tools of the trade" course targeted at CS first-years, GPI [0], that is mostly taught by student TAs under the supervision of a professor. Asking students to create and teach the material seems to be a great way of solving this problem: TAs get teaching experience, students learn about the tools, and professors are not unduly overburdened.
Presumably though there's change in a lot of different fields (not just CS), and it would just be up to whoever is maintaining a course to keep the tools up to date with what's used in industry.
But also, how long has it been since vim/emacs were first used? Or a VCS? I think a lot of this stuff would remain pretty much the same year on year.