The world of software development moves faster than almost any other field at the moment, particularly with regard to the tools we use.
All other things being equal, the person with experience in the specific tools the organisation uses will get the job.
It's just an unfortunate consequence of the industry we're in.
You are of course free not to chase bleeding-edge tools in your spare time, and you'll probably stay employed just fine. But there's a risk you'll be left behind or miss opportunities.
The people who were hacking on early iOS, Rails, Scala etc. are today's experts enjoying niche markets. The people who didn't are working on VB or PHP for a sausage factory somewhere.
Side note: The people who chose to hack on the less-sexy Android OS are also extremely employable right now. I've had to turn down enough work this year that I could have employed three other Android programmers full-time, if I'd had them on call.
I learned Android on my own time, because I liked the idea of a Linux-based phone OS. Most of the jobs I've gotten have been EXACTLY because I've been playing with the right technology on my own time. OP is right on, and VexXtreme is making noises like a mediocre developer who has a chip on his shoulder.
Agreed. It can be very frustrating to see technology change so, so rapidly. What we worked hard to learn is worth a bit less each year.
But the fundamental data structures and algorithms don't change, and the problem-solving skills don't change. For example, get an Oculus Rift dev kit and fool around with it for a few months; when the consumer version comes out, you'll probably have more work than you can imagine. It's the same pattern.
Yeah, that is rather annoying, isn't it? It's like, do we really need yet another way to build a web application? Most of these frameworks don't seem to offer anything novel; they just re-implement the same MVC pattern over and over again. But I'm not a web programmer, so I wouldn't know.
On the other hand, Linux kernel programming, which is what the author mentions, hasn't changed that much over the years. What the author seems to bemoan is that the developers are letting those core skills slide because they become dependent on the tools they use in their day-to-day jobs. But actually, I'm not convinced that the mediocrity of the candidates was due to decaying skills. They may not have been particularly skilled to begin with. Experience doesn't necessarily correlate to competence, after all.
I don't think that's true at all. According to this paper: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2909426/ the number of new medical journal publications per year significantly outpaces the number of new computer science journal articles.
Doctors are expected to keep up to date with the most recent research as part of their jobs, and they are paid to do so by going to conferences etc. I don't think the same can be said for software development.
Academic CS research has practically nothing to do with professional programming. Professional programmers constantly learn new tools and technologies; for the most part these aren't academically novel (much to the field's detriment).
Does your employer not send you to conferences and training? Any hack days, "20% time" (or whatever percentage), or encouragement to set up lunch-and-learn style meetings?