Hire an individual. Don't hire an old person or a young person or a guy or a girl or a... just.. don't.
An individual at age 40 is generally more qualified than he or she was at age 25. However, absent other context, the fact that one individual is 25 and another is 40 has no bearing on which of them is a better fit for a technical role. Go for the individual who can show adaptability and/or a history of compounding success in the areas that you need.
My anecdote:
I once had an argument with my former non-technical manager. She was arguing in favor of hiring a woman in her late 30s with two masters degrees to serve as a lead software architect. The degrees were in two subfields of physics, one theoretical, one applied. This would be the candidate's first position outside of academia. I participated in the candidate's entire interview process, and I recommended against hiring her for any position. She didn't have enough experience and she was awful at all of the technical portions of the interview process, especially her code. At the same time I was arguing that I should've been the lead, as this was a position the rest of the team had thrust me into and I was effective in the role. The manager's argument was that the candidate was more qualified. After some discussion, "more qualified" turned out to mean older (we both agreed the degrees weren't a strong indicator).
In the end, the candidate was hired for nearly twice my salary (apparently upper management likes letters after a candidate's name). She never fit the lead role, so the team never accepted her for it. Further, she was very ineffective as a developer (we were working in Java, her experience was in Matlab and R), so we either had to refactor a lot of her work or let it go to review and decline to accept it. Everything that went her way took months longer than necessary. In the end I think she wound up working in some type of higher-end sales/administrative capacity.
Being at least an average coder is a prerequisite to leading a team of coders.
A team lead that can't code is like an Army officer that can't pass his physical tests or clean his rifle. He will not have the respect of the rank and file. A team lead (as opposed to a project manager) that can't code is doomed.
(Notice I don't mention language; a good programmer can be coding in something new within a week or maybe two, and after a couple of months should be completely fluent.)
> a good programmer can be coding in something new within a week or maybe two and after a couple of months should be completely fluent.
It's a risky choice.
The project we were working on had very, very heavy emphasis on idiomatic Java. In this situation a good manager would need to recognize the deficit in the prospective lead and only ever accept the candidate if they have a rock-solid track record of success, if the team is proficient enough with the language that they can fill in the architect's gaps, and most importantly, if the team is on board.
Unless you have a very mature team that has bought into the leadership of this individual, deficit in language skills might be perceived as a deficit in overall skill, and the team will have difficulty accepting the new leadership.
[Edit: My very, very strong preference is to never hire leads. Hire team members, pay them what they're worth with frequent raises for good performance, and let the team decide who does what.]
Agreed. Learning a language is one thing. Learning its libraries is another; a stack a third; its idioms a fourth; its broad patterns a fifth. The J2EE part of my resume is an alphabet soup of supporting technologies. Rails is almost as bad, just with double entendres instead of acronyms. If you're learning the basics as you go, how are you going to make strategic, architect-level decisions? Did this potential hire ever encounter MVC in her work with Matlab/R?
I've never understood why the common line is that any good hacker can pick up a new language and, in a short time, be just as effective as someone who has used it for years.
I've been thrown into overdue projects in languages I hadn't used before, and have been able to start fixing bugs almost immediately... but "tweaking existing code" is a far cry from implementing new functionality, which in turn is far below making architectural decisions.
Languages exist in ecosystems that develop over years; knowing those ecosystems in depth takes a lot of time.
Absolutely--however, there are issues with rockstar/lone-wolf devs.
My old boss was an amazing coder--dropped KLOCs like you wouldn't believe, very smart guy, and could debug damned near anything through sheer stubbornness. I respected his technical skill and learned a lot from him. :)
That said, leading a team is more than just being able to catch the ball when your subordinates drop it--it's structuring things so that the balls are small enough and obvious enough that even a greenhorn can make useful progress, and it's making a supportive environment and architecture that can be extended without your supervision.
Depends on what the "lead" means. Certainly, he won't be able to be a technical lead, but non-technical people can lead technical people in other senses as long as the leader is humble, recognizes his shortcomings, and is willing to listen to and apply advice.
It is extremely arrogant for developers to believe the only thing of value a leader can bring to the table is coding chops. Compared to what it really takes, coding chops are immaterial.
While that is fair enough, I've never seen a company put a non-technical person in a lead technical role. Conversely, I've seen no end of "a non-engineer can't manage engineers", which is what this feels like.
Of course, all other qualities being equal, having the addition of technical skills on top of managerial skills is good, but I've had more non-technical bosses I'd choose to work for again than technical bosses.
The prevalence of discrimination toward extreme youth in the software industry is probably a historical artifact of the fact that it's a new industry, so for the most part young people were the ones who "got it."
That being said, I think it cuts both ways. Both younger and older developers have advantages.
Older developers tend to see pitfalls and avoid them better. They tend to have a more diverse skill set, better organizational skills, and more knowledge of related domains outside programming (such as marketing, business, finance, etc.). Older developers are also less vulnerable to faddish thinking, choosing tools that they can ship with over the latest faddish language or technique. (And recognizing today's fads as yesterday's fads repackaged, which they often are.) Finally, older developers sometimes remember things that didn't work long ago but that might work today. Younger folks might not have a clue that certain things don't have to be the way they are, while an older person might be able to explain why a certain decision was made and then help them question whether it's still necessary today. A lot of innovation can be realized by searching through the discarded piles of yesteryear's ideas and asking, for each one, "are the reasons this was discarded still valid?"
Young developers have the advantage of fresh eyes. They don't carry the accumulated baggage of prior environments and old ways of doing things. They can more easily dump legacy cruft, embrace genuinely novel and interesting ways of doing things, etc. They also sometimes have more energy, having simpler lives and lacking things like kids and mortgages and personal accountants. I think it's easier for a young developer to focus laser-like on one thing for a long time.
I think the ideal team has both, and listens to both.
It's also possible for anyone, regardless of age, to take on either of these sets of characteristics. It's possible for younger programmers to take the time to scratch the surface of fads, or listen to the older ones and learn from them. It's also possible for older programmers to intentionally "forget what they have learned" and open themselves to the possibility that such-and-such isn't needed anymore, etc.
Edit: I also think it's important for teams to avoid the dark side of both. For older developers, it's the curmudgeon, or the developer so set in their ways they are not open to any new idea. For younger developers it's the cowboy who thinks they know more than they really know and will waste a lot of time reinventing wheels, chasing things already known not to work for provable reasons, etc.
I wonder how much the preference for youth has to do with the fact that younger workers are (1) generally cheaper and (2) more open to working longer hours since they are less likely to have families and more likely to be looking to make an impression.
Not to mention younger workers are often more risk tolerant / seeking. If you're 43 years old, with a family to support and a mortgage, it might make perfect sense to take a safer job.
If you're 20 with no family, no mortgage, etc. you sign up with Zuckerberg to 'change the world,' with little concern over whether it crashes and burns.
I think that equation artificially pumps up the pop-culture association between youth and doing great things at new startups (never mind the rarely told stories of failure and misery that can go with crashing and burning), leading to a false conclusion that youth is superior at things like programming / engineering / innovation.
> If you're 20 with no family, no mortgage, etc. you sign up with Zuckerberg to 'change the world,' with little concern over whether you crash and burn.
I don't think that's really fair. Even if you worked on a failed startup and invested all your money in it, you still have more programming experience than you started with. And programming experience is worth plenty of money in our field.
What does that have to do with the quality of the coder though? I've worked alongside developers young and old (as I suspect most here have) and I see no relationship whatsoever between age and effectiveness. If anything, the older developers are probably consistently better. Here I am defining older as 35+.
It doesn't actually have anything to do with the quality of the coder, it has everything to do with the perception.
The hackers that built Facebook would be perceived (by the public / media / whatever) to be better because of the halo that comes with their involvement in creating FB.
I might be wrong, but you seem to think that "young software developers" equates to "people working on startups". Programmers working on startups represent a very very small fraction of working developers.
Less experienced workers are typically cheaper because they're less productive. You may luck out hiring a super-productive, recent graduate on the cheap, but they'll quickly learn their worth.
Your second point sounds purely anecdotal. My experience has shown just the opposite; older developers with families are much more committed because...well...they have families to support.
I can't speak for anyone else.. I'm an older developer, and a bit of a "rock star" so to speak. I know I'm at the pay ceiling, and peaked out into upper mgt a couple years ago, and never again. I'll accept where the pay is, and simply work where the best fit is. I've never been one to show up at a certain time every day (usually 9-ish).. but after lunch, I tend to get a lot more done than my contemporaries.
I try to at least be aware of what's going on around me. It's not always easy as there's a lot of new stuff every day, and when you finally get around to it, things change.
Most recently, I've been dabbling in NodeJS, as it's a good fit for one-off import scripts and backend systems. That allowed me to have a Grunt script set up for the client parts of a new .Net project, where the backend is ASP.Net MVC ... I've seen the bundles that come out of the box in MVC, and feel they're excessively painful.
By the same token, I can usually make a good judgement call as to when to plug my nose and just make the patch work. Experience counts for a lot. And at the higher end the pay doesn't generally match your productivity... I'm a cog, but good enough at what I do that people tend to overlook my quirks regarding daily schedules.
For every prodigy I've seen (about 3 others in my career), I've seen several dozen who were competent, not great, but get the job done; several more dozen idiots who really should have a different career; and enough people who could be good but can't break out of the same patterns they've used for over a decade to be more effective that I don't like thinking about it.
There are plenty of stereotypes, and plenty of exceptions. It's funny that SO is mentioned as the source in the article, as most of my best answers happen to come from me revisiting and updating older answers/questions with newer material. Beyond that it's pretty narrow as a focus; I just tend to gear it towards my interests.
That's a matter of environment. A startup with equally young and inexperienced management is going to fall for the fallacy of longer hours. An environment with older, more experienced management should know better. However, I think it's common for that experienced management to end up in corporations as opposed to startups. So it's merely a matter of perception.
I believe research (sorry no link) is showing that older people tend to actually have much better focus than younger people, and better memory of what they were focusing on (in the absence of actual neurological problems, e.g., Alzheimers).
"And the more complaints about memory they had, the more we saw this extra activation." When all was said and done, those women did just as well on the test as the noncomplainers. "My guess is there could be something happening in the brain, and people are trying to compensate for it with these extra areas," Ms. Dumas says. "And they are doing it successfully."
Really like that coda, very familiar with folks who've crusted over and become curmudgeons and have also experienced young people so smart that their new imagining of the wheel will be much better than the status quo. I encourage folks to stay curious, ask questions, listen to the answers.
Except that prejudices aren't always developed so much as taught/learned. The standard trope that "the young folk get technology" is accepted as axiomatic, and passed down, regardless of actual experience.
The slightly different statement "young folk use technology the best" is closer to the truth. Just the other day I saw my cousin's toddler pick up her father's iPhone, unlock it, and pick the game she wanted to play. Even at 4 years of age, she's a decent user of technology.
But they don't necessarily understand the technology they're using. For example, teenagers these days are addicted to their cell phones, and texting is such an integral part of their social lives that taking away their cell is tantamount to locking them in a room for all the social isolation it causes.
They are expert users of technology, yet they don't necessarily understand the technology they use. As an example, if you give a teenager a URL for a page with information they need, instead of typing/copy-pasting it into the address bar, most will type 'google' into the address bar, put the URL into the search box, and click the first link.
Just being a highly proficient user doesn't make you an expert. Young folk are highly proficient users of technology, but they're not experts unless they've deliberately learned how the technology works.
I was lucky to be a kid around the time the NES was the new thing and people were starting to get PCs at home, so I've played video games from 5 years old. Still, when I first started programming games, I had to start from square one. I don't think there's any correlation between being a "natural" user of the web and understanding the technology behind it. As an engineer, though, you'll immediately start dismantling that tech in your head.
I was luckier, because I got an actual computer instead of a console, and in the era when kids were just as likely to get an Usborne book of BASIC programs to type into their computer if they wanted to play games.
Kids getting consoles and similar hermetically-sealed boxes depresses me.
One day a young programmer flies into the office talking about the new MVC frameworks and how it would revolutionize how we write programs.
He was mystified when I explained how MVC works to him without even looking at what framework he was ranting about.
He asked how I knew? I told him it was invented at PARC in the 70's and that I'd used it starting in 1995 with MFC's document/view architecture in C++. This was more of a reintroduction of an older methodology, suited to the web.
Older developers also get to see the linear progression of current technologies and how they arrived at current day implementation. What worked and what didn't work.
There is very little new under the sun, at least if you're talking about what's going on in the mainstream rather than what's happening in the theoretical space (with GADTs, dependent types, etc). It's been a very long time since I saw software development technology that wasn't just the application of something developed in the 1960's through 1980's. "Oh, Java has lambdas now? Here are the design patterns that take advantage of them. They were well-understood in 1975." "Oh, Ruby has generational garbage collection now? Here is the paper describing the performance characteristics of that--from 1984."
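To make that concrete, here's a tiny sketch of the classic Strategy pattern written twice: once the old interface-and-class way, once with a plain lambda. (It's in TypeScript just to keep the examples on this page in one language, and the names are invented for illustration; the pattern itself long predates either syntax.)

    // Strategy the pre-lambda way: an explicit interface and a class per strategy.
    interface Discount {
      apply(price: number): number;
    }

    class HolidayDiscount implements Discount {
      apply(price: number): number {
        return price * 0.8;
      }
    }

    function checkoutWithObject(price: number, discount: Discount): number {
      return discount.apply(price);
    }

    // The same pattern once the language has lambdas: the strategy is just a function.
    type DiscountFn = (price: number) => number;

    function checkoutWithLambda(price: number, discount: DiscountFn): number {
      return discount(price);
    }

    console.log(checkoutWithObject(100, new HolidayDiscount())); // 80
    console.log(checkoutWithLambda(100, (p) => p * 0.8));        // 80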
I think TypeScript is actually one of the more genuinely interesting developments recently--it's state of the art circa 1996 (Abadi and Cardelli's "A Theory of Objects.")
This is not really a dig at software development. There is, for example, very little in aerospace that has changed more than incrementally since the 1970's. But people have this perception of software development as this rapidly changing field where knowledge becomes obsolete in the course of a couple of years, when in reality the basic principles are really quite well-developed and mature. Keeping up with software development technology is mostly a matter of figuring out how people have gussied up old ideas with new syntax.
All of the interesting stuff now is domain specific. Algorithms for self-driving cars? Yeah, that's something that we didn't have 20 years ago.
Yes. Twenty and even thirty years ago we did have essentially the same ML algorithms used in self-driving cars today, just not enough computing power to use them effectively (I think SVMs are only about 20 years old, but neural nets have been around in some form since the late '50s). There is much to learn from those who have gone before.
That's interesting. What is it about Typescript specifically that you think aligns it with "A Theory of Objects"? (not expecting a thesis in your answer)
Abadi and Cardelli lay out an object oriented calculus with object literals and structural object types. Typescript's object literals and structural object types seem very similar.
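For the curious, a minimal TypeScript sketch of what "object literals and structural object types" look like in practice (the names Point and report are invented for illustration):

    // A structural type: anything with these members counts as a Point; no class needed.
    type Point = { x: number; y: number; distanceFromOrigin(): number };

    // An object literal satisfies the type purely by its shape.
    const p: Point = {
      x: 3,
      y: 4,
      distanceFromOrigin() {
        return Math.sqrt(this.x * this.x + this.y * this.y);
      },
    };

    // Structural subtyping: an object with extra members is still accepted
    // wherever a Point is expected.
    function report(pt: Point): string {
      return "distance = " + pt.distanceFromOrigin();
    }

    const labeled = { x: 6, y: 8, label: "B", distanceFromOrigin() { return 10; } };
    console.log(report(p));       // distance = 5
    console.log(report(labeled)); // distance = 10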
Yup, it has happened to me too. I have around 20 years in software development.
Around 9 years ago...
Me: We can abstract most of that data access with a simple table-oriented ORM. Not all of it, but a good 80%.
Younger Developer: Nope, that is what record sets are for, it is right here in the Microsoft docs (pointing to the ADO API)
...
5 years later, he had moved to a different company, and I received a phone call from him: Just called to say that now I understand what you wanted and how it is simpler.
Me: Then I failed to communicate properly. What could I have said that would have made you give it a chance?
Him: Nothing, I needed to learn it the hard way.
Experience teaches.
A few months ago, working with a very young dev team.
Tech lead (late 20s): We need to do everything using TDD; if it is not TDD we don't touch it.
Me: Agreed, but TDD is one of several tools for DDD; the developer needs to understand what they're building, not just that all the checks are green.
Tech Lead: The devs only need the proper test.
Now they've dropped the core calculations from the project because they couldn't understand them by just looking at the tests. I got tasked with that, and a salary increase. And somehow the tech lead thinks it is unfair because he "gets" Git and ReSharper much better than me.
Your first younger developer sounds awesome. I've worked with people (some older, some younger) that refused to learn from their mistakes.
All developers make mistakes -- "I built this because I was worried about X but I should have been worried about Y." The problem is making the same mistakes over and over.
Respectfully, I have to disagree with your assessment that age is the contributing factor here. I'd argue that pragmatism is what you're seeing.
Pragmatism might correlate with experience, and experience is only gained with time (thus age), but I'd be careful about jumping to that conclusion. There's nothing beyond assumption (this article included) that says younger people are always, or even usually, less pragmatic.
[Edit: That said your tech lead is an idiot if he thinks he should be paid more for a better understanding of technologies tangential to his core goal while he has a weaker understanding of the goal itself.]
Respectfully I submit that we might not be in a big disagreement here.
Pragmatism is a very big contributing factor. My personal experience has not shown me IF pragmatism is better. Instead, experience has shown me WHAT is worth being pragmatic about.
My observation is that experience leads to emotional maturity, which leads to better pragmatic choices. It is the fear of making mistakes, hence the industry coined "fail fast". It is the ego boost of "I just learned this cool technology", as opposed to "If this is so cool, why hasn't it been invented before? What did it look like?" (Another post showed an example with MVC.) It is the "I own the world" feeling that we all have as young adults. And there are some "bad" apples in the VC culture that can exploit that.
[And I don't think the tech lead is an idiot, he is where I was 18 years ago]
> Respectfully I submit that we might not be in a big disagreement here.
I like it when that happens. :-)
> And I don't think the tech lead is an idiot, he is where I was 18 years ago
I shouldn't have phrased it that way. The idea that you're more valuable than someone who can solve a core business problem when you can't is idiotic. He himself is probably not an idiot - we've all let our emotions get the better of us.
There is indeed very little new under the sun. A lot of stuff is just a repackaging of old ideas. I remember reading the Self language paper and thinking it was cool as heck: objects all the way down, no classes. Then later Javascript came along with its prototypes and people thought it was the shit. I was like, whatever, it has been done before.
If I remember correctly Brendan Eich was originally promised that he could build Self for Netscape, but later due to various reasons had to change the syntax and all that.
Also, the popularity of javascript in the geekdom is, in my recollection, a fairly recent phenomenon. It became interesting only after Ajax; before that it was a leper, like PHP.
And MVC was talked about a lot as a design pattern for Web apps before Struts was started - I remember we rolled our own rather nifty one in 2001 or so and failed to get our execs to open source it so it died when the team withered away after our acquisition.
But it is quite different from web MVC that was popular with web frameworks like Struts.
I haven't looked at the more recent javascript frameworks, but my intuition tells me that they are probably going back to the original MVC pattern (observer pattern to notify views when the model changes).
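For reference, a bare-bones sketch of that original pattern (TypeScript, all names invented for illustration): the model knows nothing about concrete views; it just notifies whoever subscribed when it changes.

    // The model knows nothing about concrete views; it just notifies observers on change.
    type Observer = () => void;

    class CounterModel {
      private observers: Observer[] = [];
      private count = 0;

      subscribe(observer: Observer): void {
        this.observers.push(observer);
      }

      increment(): void {
        this.count += 1;
        this.observers.forEach((notify) => notify()); // tell the views the model changed
      }

      get value(): number {
        return this.count;
      }
    }

    // A "view": re-renders whenever the model changes.
    const model = new CounterModel();
    model.subscribe(() => console.log("count is now " + model.value));

    // A "controller" action: input mutates the model, the views update themselves.
    model.increment(); // count is now 1
    model.increment(); // count is now 2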
Said history was referred to in the top level comment that started this discussion.
I've had as a low priority idea for a few years implementing a web MVC framework that is somewhat closer to the original inspiration - not because it makes sense for generating HTML (though it actually works quite well for that) - but because it would reduce the impedance mismatch with a client-side library using the original MVC pattern.
I have not actually done this for a number of reasons. The top one being that when you scratch an itch that you think someone else should have, it is much less fun and likely to work right than when you scratch your own itch. But it still bugs me that nobody has written one that works like I think it should...
What I meant was that Struts wasn't the first mention of MVC in relation to web apps - lots of people were rolling their own frameworks in 2000/2001 and MVC was a common buzzword.
As most developers in those days hadn't started with web apps it wasn't surprising that the term MVC was used - e.g. we were writing MVC C++ wrappers for simulations in 93/94 and Java MVC simulations in applets from '95.
I was writing RDBMS applications in dBase, compiled with Clipper, on MS-DOS, in the early 1990's, when I was about 30 years of age. (I was the "old guy" 20 years ago.)
Wow! The decades just keep passing by with increasing relative speed.
It is a well-documented fact that time passes as a percentage of life lived. There is far less relative distance between decades for me now than there was between my fourth and fifth birthdays. I have kids who have kids, and they weren't around for that last episode of MASH (left out the asterisks for formatting reasons) that I'm pretty sure (in what's left of my mind) was first aired a couple of years ago.
A friend of mine told me this longer ago than I care to be reminded of. "The older you are, the faster time goes. The telescoping effect can be quite startling."
I've found a wealth of talent by swimming against the Silicon Valley tide of favoring youth.
There is an annoying pathology at work though. During the 90's most experienced developers were incentivised to move into management or sales roles. Many of the management/sales roles these developers moved into have disappeared.
There is a population of older software developers who kept coding through the late 90's and early 00's who are almost invariably awesome.
There is also a population of older developers who left day-to-day coding work and have seen their employment prospects diminish who are trying to get back into development. This is a more difficult population to work with.
As a 40-something who is 20 years into his career, I don't think I'm a particularly better PROGRAMMER than I was about five years in. But there is SO much more to being a useful tech than programming! And I'm much more sophisticated about big-picture thinking, business, and planning these days. Back when I was barely past "junior", I got to work on some really cool software at a dotcom-era startup. Technically, it was a blast. As a business decision, it was absolutely idiotic, and it was a massive management failure to have us wasting our time on it at all. (Basically, the idea was that six man-months of code at the time would save the company millions in the future. Which would have been true, if it HAD a future. Never, ever waste precious development effort on something that can be bought easily off-the-shelf and isn't core competence for your product! That's not even a good use of time in an established company, much less a ticking-clock startup!)
As I'm prone to saying, I cost as much as two junior programmers, but I can do things that two junior programmers can't do. And actually, cost is one of the problems older programmers have... senior == expensive. On the other hand, if you're in the industry for 20+ years and haven't found a way to make yourself and your work more valuable than what some drone straight out of college can do, but you expect to get paid more than said drone, you're asking to get sidelined.
The problem isn't the myth that older programmers are less flexible than younger ones. The problem is that so many older programmers haven't attended to their careers in a way to make themselves substantially more valuable than some pliant kid. It's not enough to do the kid's job better... you have to do things the kid CANNOT do.
"I cost as much as two junior programmers, but I can do things that two junior programmers can't do"
Don't undercut yourself; you provide much greater value than that. People outside of engineering don't have a really clear picture about the roles and positions, and the cost structure of running a development team.
I remember a conversation with one of my clients. He was trying to interview candidates for a project manager position and asked me for a job description. I explained to him what the typical structure of a development team is and what roles different people play: project manager, development manager, architect, team lead, front-end developer, back-end dev, DBA, UI designer, QA, etc. Then I casually mentioned that I was doing all of those for him on this project, and that he was actually saving a lot of money despite my high billing rate. He appreciated that.
This reminds me of a conversation with an old friend... a pic of the two of us when I was 20 or so appeared on Facebook, and he asked that if 20 year old me met today's me, what would surprise him most? I replied that it would be how today's me kept calling 20 year old me "dumbass".
There are a number of such domains... security, DSP, real-time systems all have "unnamed" skills to them.
Games happen to be pretty good as well because they let you wrap several different kinds of technologies together - the tricky part with games is really that they need great design much more than they need good engineering, and yet you can ignore this as a coder because you can always make your renderers and simulations more generalized and complex.
I find the study to be highly suspect. Looking at Stack Overflow means that you're only looking at people who are looking to learn & grow. It could be perfectly possible that programmers "weed out" of career advancement as they age and stay stuck with the same skills but those same programmers never think to create a SO profile.
I have reservations about the study, but for different reasons. I suspect there is a correlation between age and the perceived value of participating in online discussions. I also suspect that there is a correlation between age and the perceived value of that person's contributions to an online discussion.
I suspect that in Stack Overflow discussions which discourage lols and memes, older programmers are more likely to find an intrinsic value from participation and are likely to be better able to communicate their responses clearly (just by having more experience writing).
Joel Spolsky created a site he (as an over forty programmer) deemed constructive. Those sharing his general experience are more likely to share his views about what is constructive.
The same might be approximately true for HN (PG is about a year older than Joel Spolsky). But the less ruthless editing of HN may play a role.
The discrimination towards extreme youth is all about money, because they can pay them 1/4 what a senior developer would ask for. They also will probably do a bunch of free overtime and not complain, until they've been there a few years and realize the shaft they are getting.
I was surprised how little the young Sr Android developer at Google makes. They obviously hired him to pay him as little as possible compared to an experienced lifelong developer. Google, I guess, doesn't care about bugs or experience; like most tech companies, they just want to keep pushing new releases to get you to buy and upgrade to them.
The Sr Android developer is constantly complaining about a lack of free time on his google+ page and has been taking a lot of leave. He's yet another young developer working obscene amounts of hours so Google doesn't have to pay for extra employees.
It is fundamentally hard to find good developers, but that is not because there are so few good developers, it is because there are so many bad developers.
I meet great programmers all the time, but I never see them in job interviews. The reason is that good developers do not apply directly to jobs. They find out about positions through their friends and move through their network, not through recruiters.
And it's the correct way to do things. You're never going to be able to learn who a person is, the whole person, from a resume and a battery of interviews. You'll learn who they have groomed themselves to look like, which won't be a real person.
The answer to all the problems with hiring--not enough female candidates, not enough skilled candidates, etc.--comes down to the hirer not socializing enough with the types of people he or she wants for the position. If you don't know the people you want, then you won't find them.
In short, treat people like people, not equipment. Remember, the noun in "human resources" is "resources", not "human".
>> The reason is because good developers do not apply directly to jobs. They find out about positions through their friends and move through their network, not through recruiters.
So if I were to go by that then, I'm a bad programmer?
There appears to be no attempt to correct for survivorship bias in this study, making it at best highly suspect and at worst, worthless.
For all we know, people who are good at getting karma on Stack Overflow stay on the site simply because they're good at getting karma on Stack Overflow.
Obvious response: the programmers whose skills declined over time stopped being programmers. Selection bias, in other words. Apparently the study didn't take this into account?
Edit: I'd love for this study to be meaningful, as I'm in my 30s. But it doesn't seem to be.
Some say, "Use it or lose it," but I think it's more like riding a bike, you never really forget how, but you might need some practice with the new bike before you get up-to-speed.
As a developer, it scares me to think that I may have a harder time finding a job after 40. For me, 40 is not far off. I love coding and want to do it professionally until I retire. I hope this trend of discrimination ends soon. It is sad to think that people would be discouraged from doing what they love just because of discrimination.
Wow - is this what constitutes 'research' these days? Look at some data on stackoverflow.com as if it's a representative sample, and then make some general conclusions?
> They found that an individual’s reputation increases with age, at least into a user’s 40s.
I really hope that's some bad paraphrasing, because I take that to mean that for a given user X, rep(X, T1) <= rep(X, T2) whenever T1 <= T2. Unless you're very actively malicious (for which you'll get banned), your reputation should never go down by a meaningful amount over time.
Honestly I think it's more poor interpretation than bad paraphrasing. They were sampling reputation and correlating with age, not tracking reputation scores of single users with time. So a more precise, if not clearer statement would be "They found that median reputation increases with poster age at least into a user's 40's". Better?
Whether this constitutes good science or not is not something I'll speak to, having only skimmed it. But certainly stackoverflow.com looks like good data for this kind of research to me. I'd hope they'd have done things like control for different fields, different activity times for the accounts, etc... But I don't know.
Dr. Murphy-Hill uses absolute StackOverflow reputation as the only metric. He demonstrates that older developers spend more time sharing knowledge than their younger counterparts, but I'm not sure about the jump to "skills improve over time."
I suspect high reputation on SO reflects (at best) either a developer's perceived confidence in her skills or her desire to share whatever knowledge she has! Perhaps the title should have been "older developers like teaching." Not as sexy, though.
I was older when I started: a 36-year-old, freshly minted Computer Engineer. Dumb lucky enough to take the first offer I got from the folks I had been interning with. Siemens Communications in Boca, 1996.
I was working with about a hundred engineers on a huge code base in an arcane language (CHILL).
I was able to learn Intel assembly to patch live systems, pretty good proprietary configuration management, and quite a bit about the Dilbert principle.
At that time, I worked with old guys who were great, and old guys who were less proficient than me. One of my friends, who was 22, came in and ran a project.
The bottom line is, if you were adept when you started, you most likely get better with age, if you just apply yourself a bit.
As long as the developer stays curious and keeps learning new software technologies rather than getting comfortable with the tools he trusts to get things done, which, I think, is a stage the majority of developers will reach one day. I am not sure it's wise to continue coding without the fun of learning new tricks (meaning once new tricks no longer interest you).
I think when that happens a real hacker will stop hacking code and start hacking something else. That being said, I am sure there are hackers who never get tired of learning new tricks.
I'm curious as to how the situation will look 20 years down the line as the current (young'ish) workforce grows up. My hunch is that as the industry matures, the current trend of age discrimination will gradually fall by the wayside.
There will always be some predisposition, as within most industries, to tend towards age discrimination in certain situations. However, my guess is that software will begin to look more like established engineering professions, demographic wise, as time goes on.
The problem with using Stack Overflow as your dataset is that most of the programmers on the site are interested in expanding their technical knowledge. So if 50% of "young" programmers are on Stack Overflow but only 20% of "old" programmers are, you get results that don't reflect the actual population. The only thing we can actually gather from the data is that among programmers on Stack Overflow, older is wiser.
I'm not old, per se, I'm probably at my peak marketability-- old enough that, if I were stupid, I would have washed out by now, young enough to be stereotypically perceived as vigorous. I can grow a nice lush beard and there's no gray in it yet.
The only real doomsday issue I ever ran into with older programmers was an increasing unwillingness (in some) to add onto their stack. The good news is that after the third or fourth new language you pile on, the next one gets a lot easier. If you know C, a lisp variant (although I prefer an ML variant), and a decent scripting language (I learned perl first, but I sort of hate it), you're not likely to get surprised by much. The programming hivemind really can't process new ideas all that quickly, so you rapidly gain exposure to the core ideologies that form the foundation of the fancy 'new' trends. And then you're just learning some syntax and getting comfy with a new API. Unless you get lazy, shit really does get a lot easier. But with tech, the next wave is inevitably coming, and you're either on it or under it.
"If you know C, a lisp variant (although I prefer an ML variant), and a decent scripting language (I learned perl first, but I sort of hate it), you're not likely to get surprised by much"
I think a person with that knowledge set might be surprised by this:
New programming paradigms emerge over time, and force us to rethink our approaches to solving problems. I think the key is not learning many languages, but learning many paradigms, and becoming skilled in the use of one paradigm before moving on to the next. Learning several languages with the same approach to programming is not as useful as learning several languages with completely different approaches.
Those 40 somethings? Yeah, they built the modern Web and Internet.
To use an easy example: who do I want on my team, John Carmack (42), or some 19 year old kid that is working on his first engine (and is likely to stumble over a million landmines due to lack of experience)? Easy choice, I'll take the ridiculously vast experience and refined skillset.
Sure, and the programming work he was doing back then required very little prior knowledge. He invented a lot of it as he went, and has been able to scale from day one with the radical increase in complexity.
Try building something like the latest id Tech engine or the Unreal Engine from scratch. Carmack could do it, a 19 year old kid writing his first engine could not.
Isn't it clear? An experienced developer can give you a huge list of what to do and why, and what not to do and, again, exactly why. I've seen millions of ways of writing extremely bad and unreliable code. I can tell you exactly why not to write such code. Nothing kills productivity more than totally unreliable code.
Use transactions, locking, handle exceptions, give clear error indication, log possible issues. Use auto-recovery if possible. It's not that hard; it should be really obvious to anyone. Don't write stuff that totally kills performance: use batching, sane SQL queries, indexes, etc.
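As a rough sketch of what that list looks like in code (TypeScript here, and the Db interface is a made-up stand-in rather than any real library's API):

    // Hypothetical database client interface, defined only for this sketch.
    interface Db {
      begin(): Promise<void>;
      commit(): Promise<void>;
      rollback(): Promise<void>;
      insertMany(table: string, rows: object[]): Promise<void>;
    }

    // Batch the rows, wrap the work in a transaction, and surface a clear error.
    async function importRows(db: Db, rows: object[], batchSize = 500): Promise<void> {
      await db.begin();
      try {
        for (let i = 0; i < rows.length; i += batchSize) {
          await db.insertMany("orders", rows.slice(i, i + batchSize));
        }
        await db.commit();
      } catch (err) {
        await db.rollback();                                    // leave the data consistent
        console.error("import of " + rows.length + " rows failed:", err); // log the issue
        throw err;                                              // clear error indication, no silent swallow
      }
    }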
You guys don't get it. It's not about how old you are, it's about what kind of persona you project. If you are confident, tall, good looking, have all your hair, young enough _looking_ for your age, well rounded, intimidatingly brilliant, experienced, and professionally successful, then you are good to go - age does not matter. If, on the other hand, you are all washed up then yes age is a negative factor. This doesn't just apply to software programming, the same patterns occur with, e.g. investment bankers.
I am unaware of any technique that translates knowledge metrics into skill metrics. This leads me to question why no one is discussing the fact that this research is based solely on Stack Overflow.
Knowledge != skill
To me, all this study proves is that people who have spent more time on the earth have accumulated more information, on average. Which should be completely unsurprising.
When interviewing programmers, I've not noticed any stark differences between the old and the young, but I have noticed a difference between years of experience, as expected.
Yes, hire an individual who is excited and willing to learn new things and keep up with this crazy industry the way others keep up with sports, politics, and whatnot ;)
I must admit ... I am in awe of the programming prowess of some of the older devs I work with - we're talking about people aged 55+! One guy I work with seems to actually know what the bazillion options in Eclipse do. You can only do this with experience in my book.
Typical availability heuristic. We meet a couple of Jurassic programmers who are stuck in their old ways and we correlate those unrelated properties.
You'll be surprised of how many "young" developers are reluctant to learn a new language, tech or OS.
This idea deserves more than this short post, but, in my experience, developers who learned to program in the 80's are consistently better than those who grew up with GUIs.
Either that, or I just communicate better with people my age ;-)
I'm sorry, isn't that what's called "experience", which is about the only factor that differentiates an "old" person from a "young" person when they are considered to have identical theoretical knowledge?
There are a lot of badass older programmers in Real Technology, but VC-istan does seem to be a young man's game.
It's not that these people think older programmers are incompetent. A few of them do, but mostly, it's just brutal age-grading of peoples' careers. If you're a 45-year-old non-manager (never mind that you might not want to be a manager) they're afraid to have you around their 22-year-old, Red Bull-drinking brogrammers-- especially if you've done a couple startups and you're still not rich. Then the VC-istani founders definitely don't want you around because you might indicate to them that billion-dollar exits don't happen just because you're a nice guy.
Before we backslap too much, note that the study isn't about programming skills, it is about sitting on SO all day answering questions instead of working.
That would be a more compelling argument if the startup community hadn't spent the last several years claiming that participation in online forums like StackOverflow is a good metric for evaluating the quality of potential hires.