> Teaching C, for instance, is nearly useless if you're trying to convey FP concepts or object-orientation.
C is superb for teaching how to manually implement polymorphic behavior in the absence of object-oriented features. Using it to demonstrate how (class-based) OO languages work should be standard in any course that has CS in its title. (Using it to demonstrate how to implement prototype-based inheritance would be an advanced course.)
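To make that concrete, here's a minimal sketch of the kind of exercise I mean. The names (Shape, Circle, Square) are made up, but the function-pointer-in-a-struct trick is the core of what a class-based language's vtable automates:

```c
#include <stdio.h>

typedef struct Shape Shape;
struct Shape {
    double (*area)(const Shape *self);  /* the "virtual method" slot */
};

typedef struct {
    Shape base;     /* base struct first, so &circle.base acts as the Shape view */
    double radius;
} Circle;

static double circle_area(const Shape *self) {
    const Circle *c = (const Circle *)self;
    return 3.14159265358979 * c->radius * c->radius;
}

typedef struct {
    Shape base;
    double side;
} Square;

static double square_area(const Shape *self) {
    const Square *s = (const Square *)self;
    return s->side * s->side;
}

int main(void) {
    Circle c = { { circle_area }, 2.0 };
    Square s = { { square_area }, 3.0 };
    Shape *shapes[] = { &c.base, &s.base };

    for (int i = 0; i < 2; i++)
        printf("area = %.2f\n", shapes[i]->area(shapes[i]));  /* dispatch at run time */
    return 0;
}
```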
> C is superb for teaching how to manually implement polymorphic behavior in the absence of object-oriented features.
Building a complex language from scratch is fine, but doing it by extending C isn't the best way.
I happen to like C. I'm good at C. I understand the C mindset, to the extent it has one, and it isn't especially conducive to OO. (It's absolutely horrible for FP, because the fact that you have to manage memory manually does terrible things to function composition.) Which would be fine, if C were actually a low-level language. It isn't.
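A quick sketch of what I mean about composition. The string helpers here are hypothetical (and error handling is elided), but the pattern is forced on you once every step returns a fresh allocation:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <ctype.h>

/* Hypothetical helpers; each returns a freshly malloc'd result. */
static char *copy_str(const char *s) {
    char *out = malloc(strlen(s) + 1);
    if (out) strcpy(out, s);
    return out;
}

static char *upper(const char *s) {
    char *out = copy_str(s);
    if (out)
        for (char *p = out; *p; p++) *p = (char)toupper((unsigned char)*p);
    return out;
}

int main(void) {
    /* The composed form leaks: nothing ever frees copy_str's buffer.
       char *bad = upper(copy_str("hello"));                          */

    /* So you name and free every intermediate value by hand. */
    char *step1 = copy_str("hello");
    char *step2 = upper(step1);
    free(step1);
    puts(step2);
    free(step2);
    return 0;
}
```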
C gives you a simplified, high-level machine model. It isn't that close to the machine; if it were, you'd have access to things like SIMD hardware and everything else C can't give you because it would break portability.
(OK, you could possibly have access to them if you used C to implement an interpreter for a virtual machine. But that kind of defeats the purpose.)
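For what it's worth, "access to SIMD" from C in practice means vendor extensions, which is exactly the portability break in question. A sketch using x86 SSE intrinsics (not ISO C):

```c
#include <immintrin.h>   /* x86-only; a compiler/vendor extension, not standard C */

/* Add four floats in one instruction. Portable C gives you no way to ask for
   this directly; you either trust the auto-vectorizer or write code like this. */
void add4(float *dst, const float *a, const float *b) {
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(dst, _mm_add_ps(va, vb));
}
```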
So sure, implementing a non-trivial programming language is a fine goal; I'd just do it by generating assembly, because targeting C forces the programmer to adopt C's limitations for no real benefit in that context: every function returns a single value because there's no real access to the stack, there's no access to processor flags, no access to parallelism of any kind, and so on. (School projects don't have to be portable.)
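On the single-return-value point, the standard library's own div()/ldiv() already show the workaround. This is a hand-rolled version of the same idea: one logical operation, two results, but C makes you pack them into a struct (or thread out-pointers through the call), even where the hardware's divide produces both at once:

```c
#include <stdio.h>

typedef struct { long quot; long rem; } divmod_t;

static divmod_t divmod(long a, long b) {
    divmod_t r = { a / b, a % b };
    return r;
}

int main(void) {
    divmod_t r = divmod(17, 5);
    printf("17 = 5 * %ld + %ld\n", r.quot, r.rem);
    return 0;
}
```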
Edited to add: And if you're writing a compiler, C isn't especially interesting. Haskell and parser combinators are interesting; Common Lisp and macros are interesting; Python is, again, not especially interesting, but at least with Python you're debugging errors related to the problem domain, as opposed to errors related to the intricacies of manual memory management.
> Using it to demonstrate how (class-based) OO languages work should be standard in any course that has CS in its title.
I'm not sure if you meant to put it that way, but that is a horrible standard for "any course that has CS in its title". I would not like to walk into a course on, say, intro to AI, and have a session demonstrating how class-based OO languages work using C.
For the general thrust of your point, cbd1984 pretty much covered everything I would have said in response.