I have to admit, I've never been satisfied with any of the arguments for teaching this, other than intellectual stimulation. Now, I took a PL course before my working career began, so I have no frame of reference for doing without one, but does it really make a difference?
That is, does a Programming Languages class make it faster to learn new languages, give us the ability to choose the "right tool for the problem," or provide any other benefit? This paper (and my professors) all said so. But I can't help but wonder if this is confirmation bias: most of the time, language choices seem to be made based on what other people are using on the target platform, and there's no scientific way to track how quickly any given individual learns a new language.
From my POV as a corporate Java-trained accidental software engineer with an Econ degree, yes it makes a difference. Besides everything the paper mentions, all of which I agree with, there are a few other things it does for you.
I'm not a computer scientist, which is who the article seems to target, but a practicing software engineer, and I often feel more like an amateur hack. The reason is that with almost every problem I'm tasked to solve or app I'm asked to build, I'm never quite sure whether I'm solving or building it in the most efficient, robust way (and sometimes I know I'm not, but am limited by my tools).
That lingering doubt makes me feel like a charlatan, or a used car salesman, hawking my wares while suspecting their quality.
To fix that, I've recently taken a sabbatical just to cram through SICP, PCL, Norvig's PAIP, K&R and some other C and kernel-hacking books, plus Haskell, OCaml, Erlang and/or Google Go, and probably some of the other highly regarded books on the craft (Code Complete, etc.). It was just too slow-going trying to do that while working full time. If there were a degree program with that curriculum I'd go do it, but there's none that I know of.
Consider it your due diligence. Doing it as thoroughly as possible gives you confidence and a valuable psychological edge, be your opponent a tough problem or a ten-cent-per-hour rent-a-coder overseas.
The other tangible benefit is to your personal brand: there are few clearer signals that you are curious, eager to learn, open-minded, and passionate about and dedicated to your craft than learning multiple languages, and those qualities are essentially requirements if you ever hope to work for or with the best programmers and engineers in the business. The only stronger signals are grad school and having built and made money off a cool startup or app. If you want to escape the slavery of the 'enterprise java douchebag' shops, to quote Zed, this is one good way to do it.
Most people judge a language by a) what the syntax looks like and b) what other people say about the language. If you have a formal understanding of programming languages, you can make an actual engineering decision about which language to use instead of being herded from trend to trend with your mouth agape at each marvelous reinvention of old CS concepts.
I think it definitely does. If you understand programming-language fundamentals, a new feature in any language tends to be easy to understand, because you can usually figure out how it really works underneath. You know "why" more than "what," which can be really helpful if you're changing things up all the time.
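For instance (my own toy illustration, in Python, not something from the paper): if you already know higher-order functions from a PL course, a "new" feature like Python's decorators stops looking like magic, because you can see it's just a function that takes and returns functions.

    import functools

    # A decorator is not a new concept, just new syntax for an old one:
    # a higher-order function that takes a function and returns a function.
    def logged(fn):
        @functools.wraps(fn)          # preserve fn's name and docstring
        def wrapper(*args, **kwargs):
            print(f"calling {fn.__name__}{args}")
            return fn(*args, **kwargs)
        return wrapper

    @logged                           # sugar for: add = logged(add)
    def add(x, y):
        return x + y

    add(2, 3)                         # prints "calling add(2, 3)", returns 5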
I agree. As an example, when I learned C++ I already knew the principles of object-oriented programming (from Smalltalk) so learning the OO parts of C++ was relatively easy for me.
Learning a new language, provided you already know the concepts the language uses, is mostly a matter of learning new syntax.
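A concrete (hypothetical) example of what I mean, using closures: once you know the concept, a Smalltalk block, a C++ lambda, and the Python inner function below are all the same thing in different spelling.

    # A closure: a function that captures a variable from its enclosing
    # scope. The concept is identical in Smalltalk blocks, C++ lambdas,
    # or Lisp; only the syntax differs.
    def make_counter():
        count = 0
        def increment():
            nonlocal count            # Python's spelling of "capture and mutate"
            count += 1
            return count
        return increment

    counter = make_counter()
    print(counter(), counter(), counter())   # 1 2 3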
"The key to building a complex system in any domain is identifying and implementing
suitable abstractions for that area’s core concepts."
Without a broad overview of the key abstractions, coming up with powerful new features is going to be that bit more difficult. They cite the example of Lisp influencing Google's MapReduce framework.
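To make the Lisp connection concrete, here's a toy sketch (mine, not Google's actual framework) of word counting, the canonical MapReduce example, reduced to nothing more than a map phase and a reduce phase:

    from functools import reduce

    docs = ["the cat sat", "the cat ran"]

    # Map phase: turn each document into (word, 1) pairs.
    mapped = [(word, 1) for doc in docs for word in doc.split()]

    # Reduce phase: combine the pairs key by key.
    def combine(counts, pair):
        word, n = pair
        counts[word] = counts.get(word, 0) + n
        return counts

    print(reduce(combine, mapped, {}))   # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}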
As a long-time hacker (and holder of BS/MS degrees in CS), I think there are tremendous academic/"real world" disconnects between understanding the paradigms of programming languages, choosing appropriate implementation languages for projects, and the fact that the programming language is often chosen for you (because of platform reasons or "popularity" in the SW engineering labor pool, e.g. Java, C#, or others, because of management perceptions of skillset availability in the programmer market). Heck, I struggle to get my group to allow us to write new software instead of using unsatisfactory existing frameworks and solutions. Forget about pushing the envelope by saying something crazy, like maybe Ruby/Sinatra would be a good choice for this CRUD app . . .
I suspect, but cannot prove, that a proper programming languages class would improve a hacker's ability to know and choose the right tool to attack a problem. My own experience with, of all things, SQL has made me much more productive and effective, measured in working software per line of code written. But frankly, I think it's going to be hard for a one-semester undergrad course to give you the kind of deep exposure to a different programming paradigm you'd need for it to make much difference here: it was only after spending months in the deeper pool of db programming that I wrapped my head around actually thinking differently about problems.
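As a toy illustration of that mindset shift (my own example, in Python rather than SQL): once the db work teaches you to think in sets rather than row-at-a-time loops, even your host-language code changes shape.

    customers = {1: "alice", 2: "bob"}
    orders = [(101, 1), (102, 2), (103, 1)]    # (order_id, customer_id)

    # Row-at-a-time thinking: walk every order, look up each customer by hand.
    report = []
    for order_id, cust_id in orders:
        if cust_id in customers:
            report.append((order_id, customers[cust_id]))

    # Set-at-a-time thinking: one expression, shaped like the SQL join it mimics.
    report2 = [(oid, customers[cid]) for oid, cid in orders if cid in customers]

    assert report == report2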
The simpler truth is that there seem to be a number of bright programmers who can write the vast majority of standard corporate software with a less than optimal language/platform choice. Sure, maybe it would have been faster or more expressive to have used J to solve a particular problem, but if somebody hacked the solution out in C#, the end-users of the software don't care at all.
I think you are correct that most of the academic world vastly overestimates how much choice is available at most software shops to choose the "right tool for the problem."
I do very much like the CTM book by Peter Van Roy mentioned in the article. It's not a book I used as an undergrad or grad student, but I've been skimming it, and I like how it's laid out and how it implements the different programming paradigms in a single language/environment.
What I wish I were personally better at is having a small set of languages, easily available and justifiable to non-technical decision makers, that I knew would make my team more productive. Sort of a taught "problem patterns" knowledge, where you'd automatically think "aha, language X is often used for this type of problem space." But you'd need an instructor for a programming languages course with deeper exposure to industry computing to pull it off . . .
But I think the key benefit might simply be breaking the hacker's mind out of thinking only in OO and procedural terms . . . if people don't learn functional and declarative programming, how can they begin to fathom the situations where it's exactly the right way to conceptualize a solution?
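For one toy example (mine, not the article's) of a problem where the declarative, functional formulation is exactly right: evaluating a little arithmetic-expression tree. Written as recursion over the structure, using Python 3.10+ pattern matching, the code reads like the definition of the problem; an OO/procedural framing would only add ceremony.

    # Evaluate nested-tuple expressions like ("*", ("+", 1, 2), 4).
    def evaluate(expr):
        match expr:                    # structural pattern matching (3.10+)
            case int(n):
                return n
            case ("+", left, right):
                return evaluate(left) + evaluate(right)
            case ("*", left, right):
                return evaluate(left) * evaluate(right)
        raise ValueError(f"bad expression: {expr!r}")

    print(evaluate(("*", ("+", 1, 2), 4)))   # (1 + 2) * 4 = 12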