Coding Horror: Skill Disparities in Programming (codinghorror.com)
17 points by raju on Feb 5, 2008 | 9 comments



Interesting, because it directly contradicts a lot of research on expertise, which strongly suggests that expertise takes time (and, surprisingly, roughly 10 years regardless of the field) - summarized in this book http://www.amazon.com/Cambridge-Handbook-Expertise-Expert-Pe...

Of course we've all read Peter Norvig's "Teach Yourself Programming in Ten Years" http://norvig.com/21-days.html

Caveat: Just doing the same thing for 10 years does not an expert make. To quote Norvig's article:

"the maximal level of performance for individuals in a given domain is not attained automatically as a function of extended experience, but the level of performance can be increased even by highly experienced individuals as a result of deliberate efforts to improve."


You need ten years to know how to program, yeah. But once you've done that, you can probably get solid in Java/Ruby/Blub++ in six months.


The more I learn about development - and the more I work on real projects - the more I realize that we're perpetually at the "apprentice" level. By that, I mean that we're always writing new code with no standardized solution. With few exceptions, old problems eventually get packaged up in a preexisting library or app. So unless you're fanatical about reinventing wheels (a good student exercise), every programming problem eventually involves a stage of research - maybe several stages, since once you get far along one track, you realize that the structure of the code, which probably looked fine at one point, is now completely wrong, and you get the urge to rewrite. The old "write one to throw away" adage.

And indeed, once you've rewritten something three or more times, you've probably figured out a nearly optimal structure for that particular problem - but most production code never gets there. The perceived cost of a rewrite is very high, and it's hard to know, in a strict business sense, whether it's the right thing to do.

That's why I think experience can only help so much.


As management becomes more adept at spotting true programming talent, average or subpar programmers will probably, for essentially the first time, be forced to improve themselves. Many likely haven't improved because, until recently, they haven't really needed to.


As a consultant, I met a lot of people in organizations with very useful (but decayed) applications in maintenance mode who were just there 9 to 5 for a paycheck. They had no interest in architecture or patterns, or in learning more about their craft at all. To them, it was just drudgery.

More shocking: years earlier, in graduate school, I met a fellow student who told me, "To heck with understanding this. Just give me the answer so I can get my diploma!" She wanted to go into management.


If the labor market works, it seems like people should settle into positions where they are about average - and the standard deviation should stay small. If you take a random group of coders and give them brain-teaser-style problems, of course large differences in performance will show up. If you give people tasks completely outside their abilities, of course they will score zero. That says nothing about how they do their actual jobs.


Back in 1997, I met a programming idiot. He was one of those C programmers who regarded programming as a kind of "magic." He didn't understand anything about virtual machine semantics or the separation of processes. Well, he moved to Atlanta for an $80k job, which he got through connections at his church - a significant raise over his IT job at a public TV station.

No, the labor market is not perfect!


What does "an average of 7 years' experience" mean?

My years of programming have been very different from those of the guy who has been programming at a 9-to-5 job for the same amount of time. And, yes, also different from those of the guy who has sacrificed his entire life on the altar of programming for the same amount of time. It would be hard for a study to capture these differences, but I bet they are correlated with programming ability.


Typically, "years of experience" refers to using the given technology or product as if you were working with it full-time (a minimum of about 40 hours per week). I'll average that down to 20 hours, since there's plenty of overhead in a 9-5 job. Assuming 50 weeks per year, that's about 1,000 hours per year, so 7 years translates into roughly 7,000 hours of experience. (This reminds me of the way pilots count experience in flight hours.)
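
A quick back-of-the-envelope sketch of that conversion in Python (the 20 effective hours per week and 50 weeks per year are just the assumptions above, not anything standard):

  # Convert "years of experience" into effective hours, using the
  # assumptions above: 20 productive hours/week, 50 working weeks/year.
  EFFECTIVE_HOURS_PER_WEEK = 20
  WEEKS_PER_YEAR = 50

  def experience_hours(years):
      """Rough 'flight hours' equivalent of a given number of years."""
      return years * WEEKS_PER_YEAR * EFFECTIVE_HOURS_PER_WEEK

  print(experience_hours(7))  # -> 7000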



