I'm no fan of universities focusing just on Java programming either, but the contentions Dewar makes are ludicrous. Either he's deliberately distorting things to make his point or he's completely out of touch with real-world engineering projects. Java is pretty much the main language for business programmers these days, and plenty of things like financial transaction systems are written in it, so saying it's only used in simple web applications is laughable. A large percentage of people writing simple web apps have moved on to Ruby and Python anyway, leaving Java to be used primarily for back-end enterprise systems.
Similarly, quizzing candidates on whether they can diagnose and track down compiler or processor bugs is useless; there are far more critical skills to test for in an interview. If I were interviewing someone for a position at a hardware company writing software for prototype chips, then I'd probably care; for the other 98% of programming jobs in the world, hardware knowledge is about 152nd on the list of things I'd care about.
It's hard to imagine how someone could take this guy as an authority on how universities should educate people and what job skills graduates will need when he's so obviously out of touch with the real world.
#!/usr/bin/env python3
####################################
# Curmudgeon_Article_Generator.py #
####################################
import random

# Things curmudgeons either didn't have or had to learn the hard way with.
i = ["C", "Ada", "Assembler", "Fortran", "Lisp", "binary programming with patch cords", "time-sharing mainframes",
     "dropping off punch cards and waiting two days for the output", "butterflies", "a magnetized needle and hard disk platters",
     "C++", "C#", "Python", "Ruby", "Django", "Ruby on Rails", "Cobol on Cogs", "Scheme", "Smalltalk", "Haskell", "Erlang",
     "distributed systems", "parallel algorithms", "network programming", "Win32 APIs", ".NET", "Perl", "Unix", "Windows 95",
     "Windows Vista", "GUIs", "the VAX command line", "vacuum tubes and a soldering iron", "an abacus", "clay tablets",
     "wire, duct tape and spit", "transistors", "slide rules", "microcomputers with 5 1/4 inch floppies", "machine language", "a PDP-10"]

for x in range(20):
    print("\nKids today don't know squat about programming.")
    print("Why, when I was that age, we didn't have %s." % random.choice(i))
    print("We learned programming the hard way, using %s!" % random.choice(i))
    print("Boy, am I ever grateful.")
I am biased, but I completely disagree with you. I had him as my professor three times while I was at NYU, and Dewar is very much in touch with real-world engineering projects. Also, hardware and OS-level knowledge is very important during performance optimization, which very often gets left out of "real-world" programming jobs due to lack of time in the project schedule and lack of knowledge on the part of the very same programmers who came out of a vocational education. This is something to lament in the industry, not a reason to say "oh well, since we don't do it and seem to get away with it, let's maintain the status quo".
That's a different argument from the one made in the article; the article's argument, that Java isn't used for complicated projects and that engineers need to be able to debug compiler or processor bugs, is an incredibly poor one.
I've done a lot of performance optimization of Java apps in the last 6 years, and I've never once had to look at bytecode or think about processor instructions or OS issues; I've had to think about memory allocation and consumption, synchronization, algorithmic complexity, caching, query plans, database denormalization, etc. Obviously optimizing a C application is a different world, but given the rise of managed/VMed languages like Java, Ruby, Python, and C#, it's definitely not the case that even a majority of software engineers ever need to dive that close to the hardware these days. It's good to hire someone who can do that, or who can learn to if your application might need it, but in general it's just not as important as a lot of other skills, and it's incorrect to say that engineers who can't do it will have no job prospects in the US.
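To make the "algorithmic complexity" point concrete, here's a contrived sketch (the function names and data are made up, and it's in Python rather than Java, but the idea is language-agnostic): the kind of optimization that dominates real app tuning, replacing an O(n^2) scan with an O(n) hash-based lookup, requires zero knowledge of bytecode, processor instructions, or the OS.

```python
import random

def find_duplicates_slow(items):
    """O(n^2): 'item in seen' is a linear scan over a list."""
    seen, dupes = [], []
    for item in items:
        if item in seen:
            dupes.append(item)
        else:
            seen.append(item)
    return dupes

def find_duplicates_fast(items):
    """O(n): 'item in seen' is a constant-time hash lookup on a set."""
    seen, dupes = set(), []
    for item in items:
        if item in seen:
            dupes.append(item)
        else:
            seen.add(item)
    return dupes

# Both produce the same answer; only the growth rate differs.
data = [random.randrange(500) for _ in range(2000)]
assert sorted(find_duplicates_slow(data)) == sorted(find_duplicates_fast(data))
```

Spotting that kind of problem (and the analogous ones in SQL query plans or cache usage) is where optimization time actually goes in most high-level codebases.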
The point is not that he thinks "Google is a simple web app". His point is that the vast majority of engineers who learn Java as their first programming language, and never touch the systems-level stuff below it, are unlikely to ever create a system as complex as Google. That seems plausible, since we know how much low-level hardware work Google does to keep its systems running, from running a custom Linux kernel to writing its own in-house programming languages.
Not that I don't agree with you, but are universities meant to be that vocational? Are they there to teach specific job skills, or for general knowledge rather than specific skill sets?
Universities should not be vocational and neither should high schools, IMO, but for some reason in the US the idea of purely vocational education is taboo. So the two get mashed together and you get this tension between the lofty notions of a liberal education and the need to make a living.
Separating people who want to build car engines from people who want a liberal education at the secondary level is seen as a sort of elitism.
I totally agree that they shouldn't be vocational; I was just pointing out that the guy's arguments about what's useful for the job market were totally specious. I personally think schools should teach Computer Science instead of just Programming (or, if they want to be vocational and give people certificates in Java Programming, go for it, just don't call it a BSCS) and that they should expose you to a large variety of languages, techniques, and ways of thinking. There are plenty of things to complain about with CS coursework at various places, but saying Java is useless in the real world, or that debugging compiler or processor issues is a critical skill for every engineer, is ludicrous.