Why Ageism Never Gets Old (newyorker.com)
212 points by eludwig on Nov 16, 2017 | 262 comments



> In the nineteen-twenties, an engineer’s “half life of knowledge”—the time it took for half of his expertise to become obsolete—was thirty-five years. In the nineteen-sixties, it was a decade. Now it’s five years at most, and, for a software engineer, less than three.

This sort of mythology is NOT helping. An experienced Software Engineer has valuable skills that do not age at all. Yes, there are new languages and frameworks we have to learn, but if we apply any effort (and have stayed current over the years) that happens really quickly, because we're just integrating all the knowledge we already have around a new set of features.


This mythology leads directly to poorly built software that will ensure either many terrible security breaches in the future or the continued proliferation of "magic" throughout a product.

One of the issues is using the word "engineer" for any of the brogrammer types. An engineer in any other proper engineering discipline does not become worthless with age; their experience actually counts for something.

Somehow we have confused people who stitch together software from pieces of "magic" that they do not understand with engineers.

The work I did in my 20s cannot even begin to compare with what I am doing today in my 30s. Those years of experience really make a massive difference.

It is time people were honest about ageism and simply stated the real reason they prefer fresh grads or 20-something brogrammers. It normally boils down to those people being willing to work longer hours for less money, at least in most of the cases of ageism that I have seen.


> One of the issues is using the word "engineer" for any of the brogrammer types. An engineer in any other proper engineering discipline does not become worthless with age; their experience actually counts for something.

My fiancée, who is a commercial bank loan underwriter, made this observation: there are some people who have been in her industry for decades, but effectively have little experience, because they don't examine their decisions and figure out why things went right or wrong. Then there are other people who have far less "experience" in terms of the number of years they have been working, but are effectively more experienced, because they can analyze new opportunities using the data they've encountered before.

Experience as a software developer does count for something, so long as you're aware, and you're constantly developing models for evaluating cost/benefit and risk. If you can add in the ability to deal with and manage people, then you will have a very valuable and uncommon toolbox of skills!


It goes with the adage: "Some people have 20 years of experience; some have one year of experience repeated 20 times."


First encountered this concept in Reid Hoffman's book The Start-up of You.


I don't think it's ageism if one person is selected based on being willing to work more for less money. That's just a labor market.

There's an argument to be made that 20-somethings shouldn't work more than 40 hours a week or should require higher pay. Or that employers should place more value on the work of more experienced developers. Or both.

But I don't think experienced developers should be exempt from keeping what they bring competitive with less experienced ones. That's a market--which is reality.


> I don't think it's ageism if one person is selected based on being willing to work more for less money. That's just a labor market.

Agreed, however very few ever come out and say that so directly. In a lot of cases the reasons given are not having the right "cultural fit", or being "over qualified", or some other excuse, because the person hiring cannot stomach simply saying "we want cheap resources that we can work to the bone".

No one should feel like they need to perform the equivalent of slave labour in order to get a job. 20-somethings should not have to work 80 hour weeks. Allowing them the free time to pursue their interests will make them better employees.


Agreed on all points, but we're going down a dark road when any measurable disparity is assumed, without evidence, to be an unfair "-ism" that must be corrected because clearly the decision-making gatekeepers are "-ists."

It's possible the problem (at a macro level) is caused by other correlating factors, such as differing salary expectations and how up-to-date skillsets are.


I'd like to share my story related to ageism:

https://news.ycombinator.com/item?id=15642428

If it's that hard for me, it seems fair to say it's much harder for someone much older.

I hear you about the -ism fanaticism. But there seem to be serious problems on the horizon. I can't make you believe them. All I can do is try to relate it to something in your life.


Thanks for sharing this--and sorry in advance if I came off abrasive.

I just think the key difference in our industry is that it's far more punishing than others if you're not keeping up to date with relevant skills.

I'm 29 and my last search was two years ago. My last job had me primarily doing ColdFusion-based web development and occasionally iOS development. A big part of the reason I left was specifically to avoid finding myself trapped with few relevant skills a few years later.

I got into and built hobby projects around an upcoming "hot" new paradigm (serverless) and learned Python along the way to do it. Being able to talk passionately about that (and cloud architecture in general) was key to finding my next position even though most of my cloud experience on the job was with the plain stuff (like normal VMs).

But despite being 27, I wasn't immune from having to branch out, learn new things, and become passionate about something to find a new position. Respectfully: our industry doesn't allow for one to stick to the same skillset for long. Your post reinforces what I had feared, which is that people are blaming ageism while sitting on a skillset that is no longer competitive.

I think this is a problem that needs to be handled, and maybe that's through making "20% time" for trying new things more of an industry norm, a stronger case for experience applying well to new concepts, or something else. But I don't think the problem is ageism.


> But despite being 27, I wasn't immune from having to branch out, learn new things, and become passionate about something to find a new position. Respectfully: our industry doesn't allow for one to stick to the same skillset for long. Your post reinforces what I had feared, which is that people are blaming ageism while sitting on a skillset that is no longer competitive.

That's shortsighted and your employer's process sounds shortsighted.

I don't give a crap if you've already used React, Redux, Express, whatever new flavors of the month are coming. If you're 23 and actively using them, odds are you're still going to be worth much less to my team than someone 40 who has to pick them up their first couple months but can immediately contribute valuable, hard-earned, broadly applicable, timeless lessons about how to build and operate software.

My job as employer, and my team's job as onboarders, is to train you in the specific skills. Yours is to bring a proven ability to learn, and the general knowledge, wisdom, and thought processes to know how to build things right.

I don't care if you're passionate, I want to know that you'll be professional enough to take pride in what you do and hold yourself to a standard worth being paid for.

If you're a hiring manager and can't figure out better questions than "can you prove you know some things about X technology already," well, please don't get too much better, cause I enjoy being able to hire the skilled people you overlook without as much competition. ;)


> I don't give a crap if you've already used React, Redux, Express, whatever new flavors of the month are coming. If you're 23 and actively using them, odds are you're still going to be worth much less to my team than someone 40 who has to pick them up their first couple months but can immediately contribute valuable, hard-earned, broadly applicable, timeless lessons about how to build and operate software.

Odds are fine, but they aren't a shortcut for using your brain. If a team's culture makes its members instantly side with either the 40 year old or the 23 year old when he/she makes a claim like "we don't need to write tests", "there are no problems in my code", or "we should convert our entire existing repo to <different style/library/language>", then that culture is stifling discussion and actively harming the quality of the code the team writes. Same applies to hiring decisions.

It's frustrating to see ageism swinging back and forth like a pendulum. There's value in having both the crystallised intelligence of older workers and the fluid intelligence of younger workers on one team, provided they firstly get hired and secondly manage to work together.


> I don't give a crap if you've already used React, Redux, Express, whatever new flavors of the month are coming. If you're 23 and actively using them, odds are you're still going to be worth much less to my team than someone 40 who has to pick them up their first couple months but can immediately contribute valuable, hard-earned, broadly applicable, timeless lessons about how to build and operate software.

Just as the majority of 23-year-olds are what you are imagining (inexperienced, know-it-alls, chasing the latest fad, etc.), I would say the majority of 40-year-old programmers are just as useless in the other direction.

Maybe not on HN, where everyone is above average, but in the flyover states it's absolutely the case. There are exceedingly few 40+ year old programmers around these parts whom I'd describe as holding a relevant modern skillset or any interest whatsoever in updating their knowledge or trying to stay competitive. It's almost universal derision over the "flavor of the month" until 10-15 years down the road, when they realize the flavor of the month turned into Linux and all they have left is bitterness over being left behind.

Just as you have the hiring skills to weed through the huge cruft of useless candidates of your preferred type, some hiring managers hold the skills to wade through the cruft of young hipster bros and find the gems as well. The 80/20 rule typically applies to all things in life.

So put another way - in the end, I'd say it all roughly evens out in my experience. I think age is an extremely poor metric for productivity, and it really depends on the specific individual.

I say all this as someone much closer to 40 than 20.


>>I would say the majority of 40 year old programmers are just as useless in the other direction.

This has nothing to do with age. In most people's cases, swim lanes are set early in life. If you weren't good enough at 23, unless some major incident in your life changed you, chances are you will be the same at 40.

This is largely what people associate with the popular phenomenon called "Nerd culture": some people have it all figured out way early in life, others take longer, and many never do. That fact alone accounts for different starting points in life, overall seriousness, and the net time available to make progress. Also, if you started out as a nerd earlier, you are more curious and more eager to learn, change, and adapt. People who start out later, who just wanted a job to provide for their family and are in it purely for a 9-to-5, will routinely find themselves stuck in the wrong job, as they have to spend the bulk of their time learning things while they might rather spend that time on recreation.

This whole thing about learning and exploring things just for the heck of it doesn't come naturally to most people.

From my own childhood I remember a bunch of bullies from my lane, routinely taunting and beating up kids. One of them reached his mid-20s and woke up to the financial realities of life. He then went to college and got his engineering degree. But that was 8 years after I had finished my degree, and I had a neat 9-10 years of experience on him. I once met him on a bus and he came across as a totally changed person, but he still struggles. Largely because the impulse to do things is different: he just wants a job, and beyond that, the bulk of his early life, through childhood, his teens, and early adulthood, was spent doing unproductive stuff. So there is just no desire or curiosity, or even the mental tools, required to sustain a lifelong marathon campaign of continuous learning.


I largely agree; I think it's almost always individualistic. I know many 40+ guys whose combination of experience and passion will nearly always trump a similar guy in his 20s. Then it's a "simple" RoI calculation of whether you need experience in that position or not.

I was mostly trying to provide a counterbalance to the "older is always better" opinion I've seen as well. I was lucky enough early in my career to have a myriad of stellar mentors - both old and young, and learned to simply listen to what folks have to say.

As I get older I do understand how hard it is to keep up with the latest thing in tech, while also handling other responsibilities in life. I almost always tend towards the grumpy guy ranting about the latest fad, so this comment is largely out of character!

I thought I'd tossed this into my earlier comment, but there's also selection bias at play here. The guys in their 40s looking for work are likely of a different type than those in their 20s. The stellar candidates who have decades of career behind them typically have a strong social network that places them before they even look at a job board. This is an area where I feel ageism is very real, and it kind of terrifies me a little should I find myself in that position.


>>I was mostly trying to provide a counterbalance to the "older is always better" opinion I've seen as well.

Older isn't always better, simply because older isn't a separate species or class of humans. Put simply, young people eventually become old. If you had major issues with human enterprise while you were young, chances are those will remain when you get old.

This extends to other things too.

Why do you think retirement is so hard for people who didn't save well enough in their 20s and 30s? Nobody can wake up one day and run an ultramarathon. The processes for making something like this happen start way early in life.


I'm not saying it's definitively the most cost-effective way to hire. It very well could be short-sighted.

I think your view is 100% valid and could very well be correct. The key point is I don't think people who disagree should be presumed "-ist."

And if you are correct: Good news! As a hiring manager, you apparently have a wider selection of better candidates because your competitors aren't properly assessing value in the right places.

I would argue, though, that being able to readily assess someone's experience in a modern framework you're actually using might be a less risky hire than assessing someone's abilities in decades-old technologies you don't use, hoping their lack of horizon-broadening over the last ten years hasn't been due to complacency, and paying them more just to find out.

But I could be wrong! And that's OK. We should be able to disagree without the "-ist" so often.


Your comment highlights more problems than just ageism.

First, what's the deal with passion? Can't land a job if you're not passionate about whatever the prospective employer wants you to work on? Replace that word with "enthusiasm" and it'll be more acceptable. Sorry for singling you out; it's the zeitgeist I'm after.

Second, the "relevant skills" bullshit. Sillysaurus3 had relevant skills for a wide range of jobs. The only problem was the lack of trendy buzzwords in the résumé. Employers just don't want to spend a few weeks (or even a few days) training someone to a new stack.


In the context of a job hiring candidate, I'd consider "passion" and "enthusiasm" to be synonymous. The idea would be to imply that you're interested in building cool things and solving difficult problems more so than just receiving a paycheck. And that's a beneficial thing to convey as a candidate regardless of your age.

Regarding training someone for a new stack: You quite possibly are right. I just think it's possible for employers to disagree with your view and not be presumed "-ist."


> I just think it's possible for employers to disagree with your view and not be presumed "-ist."

So do I. There are more ways to be wrong than being an "-ist".


I used to be passionate about computers, but it's hard to sustain passion after 35 years — these days I'm merely affectionate.


What are "relevant skills?" It's it a programming language? An environment? A framework?

When I started, I was expected to pick anything up as I was working. I got my first job partially as a result of having seen Smalltalk in a programming language class. The important "relevant skills" were enough background to pick up new things (since everywhere was different) and a demonstrated ability to get work done.


"Relevant skills" are the things listed as "required" or "preferred" in the job ad. These typically are all of those things.

I agree with your second paragraph--but isn't that a strong case for investing in a cheaper developer with less experience?


I've read your post, and I try to empathise, but at the same time I can't help but feel you were naive in thinking you could take on a web development job without having any of the skills required, even a very basic level of them.

If a 20 year old applied for a job that involved writing SQL queries, and couldn't, I'd expect him to be rejected. There's a world of difference between low level C/C++ style programming, and working in Rails day to day, and if you think there isn't, then you should be able to prove it.


My first real job out of college was writing graphics device drivers in C and assembly. I had no idea how to do it before joining the company and in fact bought a book on writing Windows device drivers a few weeks before I started so I could at least start out knowing basic terminologies.


> If a 20 year old applied for a job that involved writing SQL queries, and couldn't, I'd expect him to be rejected.

I've been a webdev now for some time, and I've written zero SQL queries. Firebase is nice. So is Mongoose. And yes, I can structure it as a traditional SQL relational database with proper indices if needed. I'm not choosing technologies that cause technical debt or problems down the road.

If you think that you can't take a programmer with a decade of experience and a background in security (where I performed SQL injection attacks) and train them to write SQL queries, then that says a lot about the company. I don't regret failing that interview.


Respectfully, if you're going to play the "I'm a back-end systems guy but I can learn this front-end JS stuff" card, you can't choke on some SQL queries.

We're all here stroking our gray beards about how kids these days don't even understand the tools they're using, and then you turn around and say "i dunno, the ORM handled it for me"?

SQL hasn't appreciably changed in 40 years. I'll take your word that you really know your C/Unix stuff, and I get that gamedev is its own niche, but I'd expect most senior devs to have developed a rock-solid grasp of SQL just by osmosis over their years.


>>I can structure it as a traditional SQL relational database with proper indices if needed.

These sorts of things are the reason I have to deal these days with programmers who show me their several thousand lines of code and feel proud about it, until I show them it can be done with a few SQL queries. Or the Java/Python programmer who proudly flaunts the few thousand lines of code in their weekend project, which is basically an awk command that can be written in under 3 minutes.

Only a few months back I was asked to review a backend application. The developer explained passionately how his massive structure of Java code plays with the store, which happens to be a key-value store, while the application is largely transactional in nature. Despite my explaining why this isn't a great idea even from an operations perspective, let alone conceptually or design-wise, they went ahead. Last time I checked they often have to deal with dirty read/write issues in cases of temporary failure, and disk crashes apparently are common.

These sorts of issues can be avoided if devs sit down and get basic exposure to things like SQL, which by the way can be done in under a few weeks.
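To make that concrete, here is a rough sketch of the kind of rollup I mean; the schema and column names below are made up purely for illustration:

    -- Hypothetical schema: orders(id, customer_id, total, created_at)
    -- Revenue per customer over the last 30 days, largest first --
    -- the sort of report that often hides behind thousands of lines
    -- of hand-rolled loop-and-accumulate code.
    SELECT customer_id, SUM(total) AS revenue
    FROM orders
    WHERE created_at >= NOW() - INTERVAL '30 days'
    GROUP BY customer_id
    ORDER BY revenue DESC;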


Honestly, I think your attitude towards this is incredibly dismissive. Hiring people with experience is good, but not if that experience is at the expense of some of the fundamentals. Imagine hiring a graphics programmer because they had 10 years of experience in front end web development, only to find out they can’t write any shaders, don’t understand any of the theory, and you turned down someone with 5 years of graphics programming experience to hire this person instead? Or perhaps more similarly, taking a backend web developer with a decade’s experience and putting them in an engine developer role. Then on their first day they ask you why you only use ‘new’ sometimes.


Once you have a strong understanding of how to architect code and how the "big picture" technologies fundamentally work, learning the different flavors/frameworks/etc takes time, but we're talking weeks of learning vs the years it takes to understand the fundamentals. You're not going to be an expert in the idiosyncrasies of X tech unless you are working with it full time for a year or so but you'll know enough to pass a job interview.

After reading your story, for example, I imagine you could spend 3 months part-time doing an online bootcamp like treehouse (focusing on big picture web architecture and syntax) and you would be more than employable as a web dev.


> but you'll know enough to pass a job interview.

Unfortunately not. I believed it myself, until it became obvious it was a myth:

One interview went like this: they sat me down, opened up postgres, and said "Accomplish these goals." The goals were simple things like writing SQL statements to query for certain types of users, or joining data together.

I never had any reason to learn any of that. I knew that I could learn it if I needed to. I understood conceptually what joins are, why to use them, and what to avoid and why. The "why" is the crucial ingredient, and I naively thought that would protect me.

No... It's not fun when you have to sheepishly admit in front of three engineers that you have no idea how to write those SQL queries.

It was the same deal with Rails. Ditto for JS. Eventually I went through 5 interviews and struck out on all 5, in a row.
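For reference, what they were after was roughly a join like the one below; the table and column names here are my guesses rather than what they actually used, and it's just the level of query I had never sat down and written:

    -- Guessed-at schema: users(id, name, plan), orders(id, user_id, total)
    -- "Certain types of users" and how much each has spent.
    SELECT u.name, SUM(o.total) AS total_spend
    FROM users u
    JOIN orders o ON o.user_id = u.id
    WHERE u.plan = 'enterprise'
    GROUP BY u.name;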


That's my point though. You say you don't know the job skills you need to hit the ground running (which is what pretty much every employer expects these days, unfortunately), and what I'm saying to you is that you could develop a good understanding of these skills rapidly because of your background, vs the years it takes if you have zero tech background.

You could spend nights and weekends learning postgresql/SQL, web MVC architecture, JS syntax (for node.js and frontend dev work), and http concepts, in a few months. That's enough to open a lot of interview doors.


Yes, I could, and that was my goal throughout that time. I was doing exactly that, but hadn't focused on postgres yet.

The point is that it was impossible to know beforehand which of those skills I'd be quizzed on. It's both difficult and senseless to learn every technology just in case I'm quizzed on it in a 30 minute interview (which really doesn't give you any sense of my skills or learning capabilities anyway).

The sole way to deal with this situation is to plow through a dozen interviews until one sticks.

> You say you don't know the job skills you need to hit the ground running

This isn't an accurate description: https://news.ycombinator.com/item?id=15714338

I'm about to hit HN's comment limit, so I think we'll have to leave this off here. Feel free to email me if you'd like to know more or chat about this. It's an interesting topic and I'm simply trying to call attention to the other side of it.

It's so easy to dismiss it as fake or that it doesn't exist. The real world is quite a bit more complicated and messed up in certain ways; it's not just sour grapes talking.


Yeah, tech interviews are notoriously awful about testing for trivia-style tech knowledge, and you either know it or you don't.

>The sole way to deal with this situation is to plow through a dozen interviews until one sticks.

Yep, and that's pretty much the standard. On the plus side the more you interview the more you see the same questions popping up so you can prep for them. It's an awful numbers game but so long as you keep optimizing/learning that's all that matters.

For what it's worth there was a post on HN a few months back from a high end recruiting company where candidates were highly pre-qualified to interview for certain positions and even then I think their success rate was like 20%. It's the nature of the beast for tech. If your interview offer percentage is too high that probably means you just aren't reaching high enough.

>It's so easy to dismiss it as fake or that it doesn't exist. The real world is quite a bit more complicated and messed up in certain ways; it's not just sour grapes talking.

I'm not dismissing you at all; I've bombed tech interviews before. It's the nature of the beast.


Can we please be honest here? Is anyone who's claiming it's all about "staying hip with the skillset" actually over 40? Because if you are, and if your skillset totally matches what the job asks for, and especially if you are nimble and quick to boot, then I promise you have heard "overqualified" and "culture fit" many sad times, most often after you nail the first sixteen (I kid, but only a little) interviews in record time and find yourself at the sunny end of a kick-butt onsite at which your facial lines can be seen. Then what was once "the team loves your passion and personality" becomes "we just don't think it's a culture fit". I'm going to have to start sending in a body double. It's that bad/sad/stupid.


I definitely agree with you and this particular dark road is best left untraveled.

It would be really great to get more data behind what is mostly anecdotal evidence. Also getting data from across the US and other countries would be very useful in understanding this better. Without the data it is too easy to go down the dark path.


We need a #metoo movement for ageist experiences in the interview process and beyond! This antiquated EEOC complaint business needs a "disruption" by a more "agile" and effective approach.


> I don't think it's ageism if one person is selected based on being willing to work more for less money. That's just a labor market.

All sorts of discrimination exists still because "perfectly rational" arguments like yours ignore underlying reasons.

For example: it's the same justification racists used for intelligence tests for voting not too long ago.


Yes, and all sorts of underlying reasons are hidden by assuming "-isms" and victimhood at the first sign of disparate outcomes.

But trying to address perceived ageism doesn't solve the problem. Being aware of contributing factors and correcting for them on an individual level does.


How do you individually address an issue where the hiring manager is making a generic decision based on age unrelated to your individual performance (especially in hiring where they have no idea how many hours you personally work)?

Why should dying your hair and wearing skinny jeans affect your hire rate?

"-isms" don't work because they are rational individually, although they might work because they are stereotypically true. Or are you saying that the hiring manager should work on this individually... which would be actually just following the law.


> How do you individually address an issue where the hiring manager is making a generic decision based on age unrelated to your individual performance (especially in hiring where they have no idea how many hours you personally work)?

There isn't evidence of this, but even if you believed that were the case, you combat it the same way you would any other weak point of your value proposition. You address it head on in a tactful way that resolves those concerns without having to be asked explicitly.

That could include interest and experience in an emerging tool or architecture (combats stereotype that skillset is outdated), casual commentary on the latest Game of Thrones or Silicon Valley season (combats stereotype of bad cultural fit), or a true passion in solving the problems the company sets out to solve (combats stereotype of clock-in clock-out worker). It could also include adjusting your salary expectations more in-line with what your competition is expecting (if necessary).

The possibility that ageism is not the only cause cannot be simply ignored.


True. What we need is a "not hotdog" style app for tech interviews. Simply photograph the face of the candidate and wait for the app to tell you if they are too old for the job. It needs a hip name. Suggestions? (This was a bit of light humor about a serious and important issue.)


> Yes, and all sorts of underlying reasons are hidden by assuming "-isms" and victimhood at the first sign of disparate outcomes.

This is a loaded and contentious statement ("assuming...victimhood at the first sign").

And trying to address ageism is trying to solve the problem; the problem is ageism.

In your particular example, ageism is manifest because older workers tend to have families and financial obligations (both of which, by the way, contribute to society). That's not "perceived" ageism. It's simply ageism.


It is assuming victimhood, because it presumes immoral (or illegal) activity in what could at least as reasonably be totally moral and legal decision-making as a result of other factors.

If you choose not to work as much as someone else or made choices that require higher compensation, it is not an "-ism" for your employer to choose another candidate on that basis alone. We're going down a dark road claiming victimhood without evidence as quickly as people are.


No one claims ageism at the first sign of it... you typically spend the years between 35 and 45 thinking "no way! that's so stupid!" until it finally hits you, and you still can't believe it's really a thing. It's a lot like racism or sexism, except it's a bit more comforting to know that it will come to its perpetrator with a swiftness that will blow their mind.


Never compete on price. Provide something that nobody else can and stick with it, otherwise it's a race to the bottom.

Those companies that hire the cheapest labour will get the quality they pay for. Let them, it's not your problem. There are enough who care about doing it right that they will eventually find you and will pay you according to the value you bring.


>There are enough who care about doing it right

If you know any of them, please list them. One real problem is that the ones who want to do it 'right' are few and possibly not easily identifiable so potential job seekers may not know where to search for them.


That depends on where you live, of course. The smaller cities have so few tech companies that it's hard to find any tech job, regardless of the price.


If you want to make money doing something you love, you have to be willing to do what it takes to do that thing. You either have to play in the market as it is defined, make the market change to the way you want to do things or be willing to relocate to make it happen.

Why do you think all the most highly paid actors and actresses are in Hollywood? Do you think they all started life there or do you think they moved there from all over the world because that's where their market is and that's what they love doing?


Amazon competes on price


Never take universal quantifiers literally.


Only a Sith deals in absolutes.


Not all Sith ;)


It's prejudice if you pre-judge how much people are willing to work or how much money they want based on their demographics.


It is, but parent comment stated it as fact--not a prejudgment.

If you are avoiding older candidates because of presumptions about them in terms of work ethic or compensation expectations, you are wrong. If you are advertising below-market salaries and expecting >40 hr work weeks and only attract younger people, there is nothing wrong with that.

Older candidates are not entitled to employment in the positions they want at the rates they want. They need to compete in the labor market with the rest of us.


> If you are avoiding older candidates because of presumptions about them in terms of work ethic or compensation expectations, you are wrong. If you are advertising below-market salaries and expecting >40 hr work weeks and only attract younger people, there is nothing wrong with that.

You don't seem to understand how discrimination works: the advertisement you describe discriminates against older workers because of how lives typically evolve with age (e.g. ability to sustain long hours, family or other obligations, etc.). The two scenarios you describe aren't objectively independent or even clearly separable.


We're just getting into a pointless gray area when that argument is made. Doesn't having work hours that don't match with school hours discriminate against parents? Isn't a position with on-call responsibilities inherently discriminating against people with more familial commitments?

I don't think it's the employer's responsibility to tailor the job to match a candidate's chosen lifestyle, or that not doing so means you're discriminating. I think people need to be responsible for their own decisions, and if that means you're less competitive relative to another candidate, that is not the employer or the other candidate's fault.


Funny you'd mention on call. It's a major hassle for the allegedly hot positions of devops/SRE/sysadmin, anyone who got a clue leaves the domain pretty quickly.


You don't seem to understand how discrimination works - any separation of people according to some criteria is literally discrimination; however, illegal discrimination is discrimination done according to certain protected characteristics (including age) without a legitimate business need (e.g. casting actors for a particular commercial).

Offering below-market salaries for 80 hour work weeks isn't illegal age discrimination even if it correlates highly with age; those are legitimate criteria that affect work performance, and thus are valid, just as a requirement that workers be able to lift a certain weight is legitimate if appropriate to that job role, even if it would disqualify almost all women from applying.

Disqualifying an old man because you think that people of his age won't work 80 hour weeks for a poor wage or can't carry the required goods is prejudice and illegal discrimination. However, disqualifying an old man because you know that this particular man won't work 80 hour weeks for a poor wage and can't carry the required goods is fair game, it's a reasonable evaluation of a potential employee.


Legal discrimination can occur in explicit, discrete instances, such as your example about this particular man. Illegal discrimination can, too. But illegal discrimination can also occur in a systemic, implicit way.

Systemic bias exists. One manifestation of that in this industry is the signaling in certain job descriptions and hiring practices. We needn't point to explicit, specific examples (e.g. "hiring manager told me point-blank 'sorry, we do not hire people over 30'") to see a broader effect.


As long as the market has been sufficiently de-skilled, by the belief that the job can be taught by weeks-long bootcamps alone and that a 25 year old has seen enough crap to be called "senior", that will be a valid view.


Perhaps there is some middle ground, and a fresh perspective with modern tools at a lower price might be more valuable in many circumstances.

And perhaps "years spent in industry" is not as universally correlative with cost-effectiveness as some claim.


But that's not the legal and societal POV, is it? Would you say that applies to race, sex, gender, and so on?


Of course. I've never seen someone argue they were being discriminated against on the basis of race, and that they know this because someone else offered to do more work for less money and were chosen.

I agree it is currently popular to assume "-ism" simply from disparate outcomes, but I don't think it's right (in both the "correctness" and "morally proper" definitions of the word).


>This mythology leads directly to horribly created software that will either ensure many terrible security breaches in the future or the continued proliferation of "magic" throughout a product.

Does it matter if neither the product's sales nor the execs will suffer from producing it?


Right, those examples, albeit good for convincing a kid to go into a 5-year college program, are rarely the reality in the IT business. Quality has a price that most companies are not willing to pay, and it might even deter them from tapping into a new market.


> One of the issues is using the word "engineer" for any of the brogrammer types.

The problem you’re actually describing is completely orthogonal to whether someone is a “brogrammer”. The actual issue is that most programmers learn neither computer engineering nor computer science. Businesses typically have a very high time preference, so they don’t care and are willing to hire people who “know how to code” and that’s basically the extent of their knowledge.


Ironically, working longer hours makes one less productive (in engineering).


>>Somehow we have confused people who stitch together software from pieces of "magic" that they do not understand with engineers.

Exactly, this is like saying carpenters get obsolete every few years because wood pieces with new joints come in every few years.

>>It normally boils down to those people being willing to work longer hours for less money.

And naivety: you can easily sell stories like 'work for great exposure' to newly arrived kids.


> One of the issues is using the word "engineer" for any of the brogrammer types

In Canada the term 'engineer' is regulated at the provincial level, and a person can be prosecuted for labeling themselves as an engineer if they are not registered and licensed with their province's engineering association. For example, in Ontario, in order to join Professional Engineers Ontario and call yourself an engineer you must meet strict academic and work experience requirements[1], pass an ethics exam, and pay a yearly membership fee. Everyone who graduates from an accredited engineering program in Canada has to take an ethics course and has these rules drilled into their heads. I know I did.

But then I graduated, and half the people I run into at hackathons and conferences did a two-week bootcamp and have "Software Engineer" in their job title, and proudly introduce themselves as such.

I call myself a software developer because although I have a software engineering degree I'm not registered with the PEO and it wouldn't benefit me if I was.

[1] http://www.peo.on.ca/index.php?ci_id=1848&la_id=1


> For example in Ontario in order to join Professional Engineers Ontario and call yourself an engineer you must meet strict academic and work experience requirements[1], pass an ethics exam, and pay a yearly membership fee.

Sounds like a racket, especially given that the PEO is a private organization. I’m not sure why this is supposed to be a good thing.

> Everyone who graduates from an accredited engineering program in Canada has to take an ethics course and has these rules drilled into their heads.

It bothers me that people believe that A) ethics can be taught like any other engineering class and B) ethics should be prescribed by some professional association.

> But then I graduated and half the people I run in to at hackathons and conferences did a two week bootcamp

I agree this is a problem, but most of the actual engineers I meet in real life (i.e. people who did a 4-or-more year accredited engineering program, including some who are from Canada) don’t actually have a particularly impressive level of rigor that would set them apart from most of the college-educated programmers I meet.


In Canada the engineering associations are self-regulating bodies; that is, the associations serve a legal purpose because of the technical nature of engineering. The thinking is that politicians don't have the technical background to regulate the profession, and so having provincial bodies allows engineers to regulate themselves. Hence the ethics part.


The value of the system shows up in things like civil engineering, where it is required that an accredited engineer sign off on the design. That engineer has personal liability if the building fails. The engineer's boss cannot simply order the engineer to sign off on the design; when the engineer says something must be done a certain way, it has real weight.

The US has the same basic system, all that is different is how strictly the word "engineer" is regulated. And that varies by state.

Before that system was adopted, bridges fell down an average of once a week in the USA. We have a lot more and bigger bridges now, but not nearly as many fall down.

Read https://www.amazon.com/Professional-Software-Development-Sch... for the case for doing the same thing for software development.


> Sounds like a racket, especially given that the PEO is a private organization.

Dunno why you need the qualifier there. There are public and nonprofit rackets, too.


You can become a software PE in the United States, but it is absolutely a racket. One of the requirements for a PE is you need to work for another PE. This leads to a situation where new exams, like the software discipline, have 20 people sit for an exam in one year. The transportation discipline had 1,800.


> One of the requirements for a PE is you need to work for another PE.

That's also a requirement in Canada (Ontario at least). Work experience has to be completed while working under a P.Eng.


Canada, along with a couple other nations I do not recall, is an outlier in this respect. As an American EE degree holder who has been employed as an EE in the past, I think the attachment of "engineer" to shitloads of titles just to fluff them up is ridiculous. However, I think the Canadian position is equally ridiculous in the opposite direction.


Ethics exams are the stupidest kind of exam. If you have no ethics, you'd have no problem cheating to pass.


They don't care if you know the material.

They care about having a legal lever to manipulate so that there is proof that you should have known better.


The English word "engineer" is quite difficult for French people, because in French we have the term "ingénieur" (usually meaning "ingénieur diplômé"), which is something like a diploma from the big universities in other countries.

A better translation of engineer would be "technicien".

French "ingénieurs" come from "Grandes écoles", see https://en.wikipedia.org/wiki/Grandes_%C3%A9coles


The link suggests that what is regulated is "Professional Engineer", not "engineer".


Here are a couple more links from the PEO website that are more explicit about the term "engineer".[1][2] Admittedly I don't know if the claims here are accurate or not; the Professional Engineers Act [3] is too legalese for me to decipher.

[1] http://www.peo.on.ca/index.php?ci_id=2075&la_id=1#whypeng [2] http://www.peo.on.ca/index.php?ci_id=2075&la_id=1#aftergrad [3] https://www.ontario.ca/laws/statute/90p28#BK36


True or not true: Engineers in Canada wear a ring made of steel from a bridge that collapsed?

Found it: https://en.wikipedia.org/wiki/Iron_Ring


does the ethics test preclude doing work on rayguns? asking for a friend.


Probably; engineers are required to adhere to a strict code of ethics that puts the public interest first. Rayguns tend to be developed by people with evil ambitions. If I were a professional engineer and my boss asked me to work on a raygun, I would be ethically required to refuse and potentially to report him to the relevant authorities.


> my boss asked me to work on a raygun, I would be ethically required to refuse and potentially to report him to the relevant authorities

So Canada has no arms industry? I call shenanigans.


> So Canada has no arms industry? I call shenanigans.

Whether or not Canada has an arms industry is irrelevant to the question of whether or not it's unethical to build weapons.

Having an ethical obligation and acting on it are two different things. People can argue about the philosophy around the ethics of creating weapons. It could be argued that every engineer working on creating weapons is violating the ethical requirements of being an engineer.


> It could be argued that every engineer working on creating weapons is violating the ethical requirements of being an engineer.

If that were true, wouldn't the professional body expel them from membership? Or perform some other form of sanction? Which it hasn't - ever, to my knowledge.


maybe their engineers are called advisors?


> and have "Software Engineer" in their job title, and proudly introduce themselves as such.

Personally I don't see a problem with this. That's their job title. Whether they have a P. Eng and can legally sign documents is another matter.

You are correct that technically they cannot walk around saying "I am a Software Engineer" unless they are a Professional Engineer. But I don't see the harm of someone saying "I work as a Software Engineer in X company" because:

- that's their job title

- the company chose the title and hired them for that role knowing their qualifications

- to my knowledge there are no legal restrictions on what title a company can give a role

If you want to go after someone calling themselves a "Software Engineer" when they're not a P. Eng I would argue you should go after the company for advertising an "engineer" position without hiring a P. Eng.

Edit: my argument here is that professional engineer licensing bodies should go after companies hiring engineering graduates to fill an "X Engineer" role the same as they would go after an individual falsely claiming to be a P.Eng.

It's stupid to allow companies to hire someone without a P.Eng to an "Engineer" role and then chastise the person for calling themselves by their job title.


That’s how we got to where we are today. You don’t get to call yourself a doctor or nurse because you have a box of band-aids in your pocket.

The meaning of engineering on the other hand has been watered down to be meaningless. There are no professional standards and no control on what companies represent their staff to be. It sounds prestigious as a title, but that’s about the end of it.


> The meaning of engineering has been watered down to be meaningless

For some disciplines like Software Engineering, I agree.

For other disciplines like Civil engineering, the term has not been watered down. No company would hire someone to certify designs who was not a P. Eng because they are required to employ a P.Eng for this by regulations/law.

This problem has come up because the government/PEO has not reacted to the changing engineering landscape and don't require positions like "Software Engineer" to be filled by an actual P. Eng.

Companies made the title worthless. I don't see how it's fair to fault the person in the position because there is a double standard applied to individuals and companies regarding the term "Engineer"

Apply the standard both ways. People cannot call themselves an "Engineer" unless they hold a P.Eng and companies cannot hire anyone but a P.Eng for an "X Engineer" position.


> to my knowledge there are no legal restrictions on what title a company can give a role

From [1]: Becoming licensed gives you the right to use “P.Eng.” after your name and “engineer” in your job title. Under the Professional Engineers Act, you may only use “engineer” in your job title if you hold a P.Eng. licence.

So yes, I'd agree that companies shouldn't be using engineer in the job title unless they're hiring a P.Eng, but that doesn't absolve the employee of liability.

[1]http://www.peo.on.ca/index.php?ci_id=2075&la_id=1#whypeng


> So yes, I'd agree that companies shouldn't be using engineer in the job title unless they're hiring a P.Eng, but that doesn't absolve the employee of liability.

So, would you agree that if we are going to be strict about who can use the term "Engineer" there shouldn't be a double standard allowing companies to call a position "X Engineer" and then hire someone who isn't a P. Eng for the position?

If Engineer is a restricted term for individuals, then it should be a restricted term for employers as well. Employers having a position with "Engineer" in the title should only be permitted to hire people with a P.Eng.

> Under the Professional Engineers Act, you may only use “engineer” in your job title if you hold a P.Eng. licence.

The company sets the job title. So if they want to hire someone who isn't a P.Eng to the role, that person is supposed to tell the company to change the role title to exclude the term "Engineer" ?


Yes, I thought my agreement was clear in my previous comment. Companies shouldn't be using 'engineer' in the job title unless they're hiring a P.Eng.


TIL I can say "I work as an attorney at X company" even if I never went to law school, as long as that's my job title. Hint: I can't.


When you say "brogrammer" it makes people not want to read the rest of your comment. Please be civil on HN and in general in the world.


Are we really this thin skinned? It's a funny portmanteau not a slur.


Given the context, though, its use here is acceptable.


60 year old dev here. I have been living on the cutting edge of tech for 30 years, and I have many scars to prove it.

New is usually not better if you actually want scalable, maintainable, and monetizable results.

I disagree that many areas of my knowledge have a half-life. My deep understanding of network protocols hasn't aged. My deep understanding of SQL and NoSQL databases hasn't aged. The principles of distributed systems haven't changed, but they have been given more tools. When, where, and how to optimize hasn't changed. Debugging, naming, and design, whether it be functional or OO, haven't changed.

Sure there are new tools and methods, but I say the fundamentals are the same.


More to the point, in my experience training more than a few college grad devs, what I look for in hiring isn't any particular language or technology, I look for someone who knows how to think like a programmer. That's something incredibly hard to teach, but if you can find someone who already has the thought processes in place for how to turn intent into code, that's 90% of your training done right there, and the remainder can happen largely with googling, stack overflow, and good code review from more experienced devs.

I'm not knocking coding bootcamps or colleges; I've had hires from both. I'm more so saying that the idea that a programmer's skills are isolated to the code they can currently write is a little shortsighted.


> Sure there are new tools and methods, but I say the fundamentals are the same.

Absolutely. Folks need to get their heads around software engineering as a practiced discipline. Your education is about the fundamentals of how you think and approach problems, even if the tools themselves change over time.


This is a sensible perspective, but the job market is actually the opposite of sensible (in a way).

Paradoxically, the (again, sensible) attitude that the fundamentals are the same with a different coating will distance one from chasing the latest fads... which is actually what the [majority of the] market looks for.


The job market for database and networking skills is pretty different than the market for web apps.

There are places where you get a job because you want to rewrite everything, and places where you get a job because you didn't suggest to rewrite anything.


I agree on the market being crazy.

Does any employer in the world interview people with real world skills?


I wish companies saw it this way. My company is downsizing and I've been interviewing like crazy. 5 years ago when my tech stack was hot I had to fight off companies, now I feel they are just humoring me.

I make it clear in my cover letter that I am capable of picking up new technologies and languages with ease, which if you look at my resume and my personal projects on github is pretty plain and obvious -- I have 15+ years of professional experience in multiple languages and using all kinds of technologies. But they look at my current job/tech stack and say "hey, you don't match ours. sorry".

I even applied to dozens of jobs in the HN Who's Hiring, got a couple phone interviews, and nothing past that.

I finally got a job at a place that was using my stack and wants to convert it to something more modern.

This has definitely been a strange interviewing experience for me. I'm nearing 40 and have to wonder if ageism is becoming an issue for me.


I'm under 30 and I've had the same experience. Every place I've applied to felt like they had a checklist of hundreds of items and a no on any of them was an instant disqualification. Even at places where I had a friend recommend me and could find out their reason for not giving me an offer, there was no consistent reason and it was usually just one person saying no so their team sat for 6-12 months looking for people.

It's like every business is now completely terrified of a wrong hire and will only take the "perfect candidate", but at market rate.


This has been my experience as well; It's as if every company is looking for a unicorn programmer... when their domain isn't any more or less unique than any other domain.

One of the "interesting" side effects is that it seems like interviewers are starting with the assumption that the person interviewing is an idiot who knows nothing and then works from there. So any slight misstep or misremembering of something puts you the fail box immediately.

My theory is that part of this is the broken culture of interviewing that we have as an industry. This "tricky CS puzzle" crap that originated at GOOG and MSFT is so prevalent that it's nigh impossible to actually succeed. More than once, it's obvious to me that the people DOING the interviewing couldn't pass their own interview...


Exactly. My favorite flavor of those is the one where the interviewer appears not to stop until they find something you cannot solve, which is perfectly fine for an interview, but they still seem to treat that failure as an inability to program. One place I interviewed at wasn't doing much more than CRUD, and we got into our 4th CS question of that day, and my 6th with the company so far after their phone screen. They gave me some question that was solved with a flood fill. I admitted I had never encountered that before and muddled through it, getting an answer that could not handle a few edge cases, which I pointed out before they even brought it up. After that point they cut that phase of the interview short, and the next guy asked me some things that were answered off my resume, and they said they'd call me back, which of course they didn't.

I could easily have been denied for other reasons, but it's one of numerous interviews where I go through 1-n CS questions and, whenever I get to the first one I can't solve, the interview is cut short. It's just a giant waste of everyone's time.


Ugh. I have a pretty decent job, and about 20 years of experience, so I have no need to job hunt. But if/when I do, I fully intend to adopt the policy that I will get up and walk out of interviews that do whiteboard coding puzzle tests. Having been trained to conduct such interviews at Google, I have no desire to participate in them. I can see their use for large companies like Google or Amazon that get thousands of resumes a day and can afford the false negatives if it means fewer false positives. But I can't see their utility at all for small companies hungry for good talent.


True. And it doesn't make sense given "at will" contracts everywhere.


Lie on LinkedIn about knowing Python after doing one simple Django project 1). Watch the desperate recruiters fill your inbox.

1) actually I didn't lie, people I knew just kept +1'ing Python on my LinkedIn profile because I talked about it sometimes while I was doing a one off project in Python


If you're capable of picking up new technologies and languages with ease, then why don't you?

For example, learning a new programming language every year for fun is a decent exercise that makes you better with your "current work tool" as well, keeps your tech stack perpetually up to speed, and isn't that hard - the second language takes significant time, the 15th (for 15+ years of professional experience) brings lots of things you've already seen, so you can do something useful on the first day and a reasonable project in the first week.


Because it would be nigh impossible to learn every tech/language for every job you are considering interviewing for.

Would you spend a week learning a new tech stack for an upcoming interview, with the knowledge that they might not hire you? Only to restart the learning process again next week when you interview at another shop?


You don't need to learn every tech/language that the interviewers want, you need to learn a language that the interviewers want. If you have learned 5 languages in the last 5 years, at least one or two (unless you've been really weird in selecting them) would be fresh and popular, and there would be a bunch of different employers that are recruiting for one of those 5 languages.

The big problem is putting all your eggs in one basket, which is bad if that one technology goes out of demand (as the grandparent painfully experienced). It's not something you do for hiring in a week before interview, it's something you do all the time in the years before as a part of your natural interest and "sharpening the saw" - if you spend the last 5 years working with technology A, a person who is "capable of picking up new technologies and languages with ease" would have picked up technologies B, C, D and E during that time anyway, and can go out to employers who want B truthfully claiming that they learned B back in 2015.


> This sort of mythology is NOT helping. An experienced Software Engineer has valuable skills that do not age at all.

Not only is the claim "now it’s five years at most, and, for a software engineer, less than three" an unhelpful mythology, it's laughably incorrect to the point that I can't imagine anyone knowledgeable espousing the view with a straight face.

It stems from, on the one hand, "software as popular culture," which is a great way to choose bad tools, and, on the other, the conflation of surface knowledge (the syntax of a new language, the API of a new hot library, etc) with deep knowledge (design principles that transcend language, the ability to evaluate a library based on those principles, etc).


> Not only is the claim "now it’s five years at most, and, for a software engineer, less than three" an unhelpful mythology, it's laughably incorrect

It's narrowly referring to the "framework fads" that do seem to sweep through our profession on about a three-to-five-year cycle.


Yes, but when Framework B goes sweeping through the world, Framework A doesn't actually go anywhere. People keep working with it.

If software engineers actually burned down all the old code every three years and literally everybody had to learn this new thing in order to do anything at all, this claim would make sense. But it's all additive. Even as we here on HN are shrieking and hooting about Framework B and the way it shines a unique light on the world of programming and how no thinking person could possibly resist its charms, thousands of people keep chugging along on Framework A for years afterwards. In no small part because they've heard it before, and remember that somewhere around the 2 year mark of Framework B's run you can expect well-written and cogent articles about "Framework B Considered Harmful", "Framework B Is Permanently Ruining An Entire Generation Of Programmers", "Why I'm Switching Back To Framework A And Regret Ever Listening To Hacker News" and "Oh Shit, I Bet My Business On Framework B And All My Developers Have Turned Into Feral Wombats And They're Biting Me Ow Ow Ow".


Not to mention how much Java is written in Ruby, C written in Perl, and so on. Where the round-hole syntax of one language/framework can be hammered into the square-hole of another, it will be.


Going off topic fast, but I've felt recently that this speed has slowed again.

On the frontend, React has been king for quite a long time now, with no obvious contender in sight. Of course there's stuff like Vue but the difference isn't so big that people are throwing away their gigantic React codebases to restart anew with Vue (things that did happen a lot with jquery, bootstrap and even angular based frontends). The new kids on the block may be better, but only by a relatively small margin.

Similarly on the backend, Rails has been unfashionable for quite a while now, but there's no obvious single new framework that people are flocking to. Sure, there's many hip options, like Elixir/Phoenix, vanilla Go, and bunch of "serverless" frameworks, but just as many people just start their new projects on Rails, Laravel, and ExpressJS, because that shit is productive and it just works.

I feel like, decades after the desktop UI world, the web as a platform has pretty much matured. The core concepts aren't changing that much anymore. I think that maybe something fundamentally new has to happen again for this cycle to truly restart - just like the web as a software platform was once fundamentally new. We'll get new programmers who can reinvent all these ideas again in confusingly subtly different ways. It'll be fun!


I suspect WebAssembly may provide the start to that new cycle you are (or are not) looking forward to.


I've certainly heard people toss around the myth of "expiring expertise", but even the people who believe that talk on the scale of decades. I honestly have no idea where Friend got "less than three", it's not a claim even the most ignorant ageists seem to make.

Sure, there are frameworks that turn over in <3 year timeframes, but even utterly reckless JS cowboys don't pretend experience in an old framework is worthless.

I suppose I shouldn't be surprised, though, because the article also offers: "With the advent of the cloud and off-the-shelf A.P.I.s—the building blocks of sites and apps—all you really need to launch a startup is a bold idea." It feels like the only skill acknowledged in the entire article is putting a basic JS frontend on a website.


Nothing shows a software expert like putting periods between letters in an initialism.


The people pushing this viewpoint are self-interested. Vinod Khosla's quote is missing a punchline: "Young people are just smarter" [except for me].

The only timeless thing about ageism is the willingness of old men to snooker young turks into things, whether that is getting blown up on a battlefield for king and country or getting exploited to make guys like Khosla money.


You'd think someone who writes for such an august publication would be a little less of a chump, wouldn't you?

(ETA: Unless you're familiar with the output of such publications, I suppose, at least enough so to validate the "Gell-Mann amnesia" concept firsthand. I can't say they serve no useful purpose - one doesn't care to contemplate packs of feral journalists roaming the streets, shaking down sundry passersby for tweed and elbow patches - but really, one is well advised not to take them overly seriously.)


One would expect people to have some familiarity with the concept of "half-life" before accusing others of being chumps.


I do, but I have to confess I'm not all that much seeing the degree of applicability that would seem commensurate with such a trenchant response.


If you are 40 and you are still thinking about your skills in terms of "I can code", then I have very bad news for you. There are hundreds of people who can code just as well as you who are younger, which means that they are going to replace you because you are just not that good or that irreplaceable.

Is there a way out? Sure:

1. If you actually are that good of a software engineer, start a My Code As Service company - it is the "402 PAY_ME_A_TRUCKLOAD_OF_MONEY" response. There are gobs of companies out there that pay gobs of money for rock star performers because there are just not that many of them.

2. Create - or buy - your own lifestyle business. You will never make millions of dollars but you should be comfortable.

3. Stop pretending that coding matters. Start managing those that produce code. Being a technically competent manager is a great way not to get fired.


Coding does matter, but most companies can only extract a limited amount of value from code, due to various dysfunctionalities in how they operate.

Being a good coder in such a company could be problematic, because they won't be able to recognize ability or benefit from the difference in skill to an average coder. These companies will treat programmers poorly, will try to save costs, outsource, etc.

I don't get why you claim that younger people will replace 40-somethings. It's not like the older devs will break outside of the warranty period.


Because I'm not seeing younger developers complaining that they are being replaced by older ones. What I am seeing is older developers complaining that they are being replaced by younger ones.


If they want a big salary and can't deliver enough value for the company or the company can't extract enough value from their skills (see above why) they might be let go, but this happens in countries with poor worker protection, like the US.

Companies in EU countries have the opposite problem, they can't easily dismiss employees, even when perhaps justified.

I wouldn't want to work in the US, it's too darwinian. A company shouldn't be able to dismiss an employee just because there's someone cheaper available.


Yep, but perhaps to phrase differently: once you get to a certain point, you build a reputation and business network. People usually want to work with you because they know what to expect (in terms of professionalism, code quality, etc). There's also value in that, whether as a programmer, manager and/or consultant.


If you are 60 and just won the Nobel prize and you are still thinking about your skills in terms of "I can win a Nobel prize", then I have very bad news for you. There are hundreds of people (https://www.nobelprize.org/nobel_prizes/lists/age.html) who can win Nobel prizes at ages younger than you.

Or: this argument is meaningless because it's assuming more supply (of someone like you) than jobs (and perfectly informed employers for that matter).


I'm not seeing those winning Nobel prizes complaining about being fired from a job because they are old.


I think the "half life of knowledge" misses so many parts of what makes a SW developer good. Sure, there are new languages, libraries and frameworks to learn and keep up with. But the fundamental parts of programming don't change - it is still sequences, loops, if-statements. Also, quite a large part is independent of the language - how to break the problem down into parts, thinking through the scenarios that can occur, good naming, consistency in the small and large.

Not to mention everything else that isn't programming per se - domain knowledge, talking to people, negotiating, in which order to develop things...

More details on what I think makes a good dev: https://henrikwarne.com/2014/06/30/what-makes-a-good-program... and about programmer knowledge: https://henrikwarne.com/2014/12/15/programmer-knowledge/


Remember when we were all becoming functional programmers and we put loops and if-statements behind us?


I'm 31 and I've been hacking software professionally for 15 years. I've debugged issues in the last month that I couldn't have done 5 years ago with 10 years experience, let alone when I was 21. As I grow more experienced my responsibilities increase in both breadth and depth.


I've worked in C, embedded and server-side, for about 15 years of my 17 year career. Companies I've worked with recently are having trouble finding C-capable developers, particularly C developers who have the first clue about 'modern' developments like CI and CD, or git.

C is a language invented in the 70s. If you kept up with thread APIs when they became usable in the late 90s, you could quite easily have had a 40 year career in one tech stack, and be looking at a bit of consultancy by now (if you haven't moved into management or other areas).

So even without keeping current on frameworks and languages, the idea that knowledge becomes obsolete in our field after just a few years is wrong. You do need to keep an eye on wider trends and newer tools, but still.

That said, there is still ageism in this arena. And I'm breaking out because, really, there are better things to be doing in 2017 :)


> Companies I've worked with recently are having trouble finding C-capable developers, particularly C developers who have the first clue about 'modern' developments like CI and CD, or git.

As is always the case when a claim is made about "companies having trouble finding developers" of some sort: Are they actually having trouble finding developers despite offering top-of-market salary, remote work and gold plated benefits? Or are they the usual "we think we offer above average salary with great benefits!" companies who get no interest from developers and conclude there's a shortage?


Honestly I don't know, I'm a contractor and the place I'm just finishing with now is having trouble. The pay on offer for permanent staff here could well be terrible. The work certainly is, which is why I'm moving on.

I tend to agree with your viewpoint, particularly when it comes to arguments about the need to mass-import developers from overseas. Are you struggling to recruit due to a genuine shortage? Or are you trying to get people with good experience to work for you for £30K pa?


Yup--my guess is that most companies complaining about a shortage of talent can be addressed with: "Well, you offer terrible work and/or terrible pay. What do you expect?"


Paying more isn't really a global answer. Sure, I can pay more to grab you from someone else, but then that other employer is short-staffed. Remote work may be an answer for connecting lonely employees and employers, but remote work has its difficulties.


It depends - in the UK in particular a lot of salaries for tech folk are very low. So those with smarts and ambition either leave the country or head into (middle-)management hell.

A few more well-paid positions might make a difference to the perceived value of technical staff in the country's small to medium businesses.


Would you mind sharing some contact information for these companies looking for C developers? I have been working as a contractor myself, and while I currently seem to have no problem finding projects involving javascript, companies actively looking for C developers seem to be a little scarce in my area.

I suspect you are from the US, and I am from Europe, so this might be a problem. But still, if you are willing to write me an email, my address can be found through my github account that is linked in my profile.


I'm based in the southern UK right now, and a place I am leaving today is looking for C folks. They seem open to contracts. The work is not that interesting, being mostly small pieces of maintenance. The money is OK though.

If that's interesting, reply in the positive here and I'll ping you the details of the company and the agency I've been working through. They have a bunch of people from around Europe, and AFAICT one travels in from Germany each week.


Sure, that sounds interesting, and I guess just asking is not going to hurt anyone. Since I am from northern Germany southern UK should not be completely impossible to reach.


Such as?


Honestly? I've had more fun coding Java (8) and Python lately, for server-side stuff. I'd also like to get paid to do some Rust and/or Go at some point.

It's all subjective, of course, and partly I'm just bored of doing the same thing for so long.


Also draws a poor comparison between e.g. a 20-years-experience coder who was writing C++ until a year ago and someone fresh out of college who's been writing node.js nonstop in the three years since getting their degree. There may be one you have more fun playing ping pong with, but (and I say this as a youngster myself) the professional choice is the one where the value for dollar is best and the code is something you won't dread maintaining.


> the professional choice is the one where the value for dollar is best and the code is something you won't dread maintaining.

That's very true, but might not necessarily favor either of your two examples. The choice between "20yr C++ veteran and 3yr node coder" is not always obvious or clear-cut, and the quality differential does not, in my experience, clearly favor either side based on that information alone.

I've seen people who left school and coded like crazy for a few years on flaky and fad-oriented stacks--because that was where they could get employed in a hurry with zero experience and a lot of debt. I've seen those people emerge from those environments with an incredibly solid understanding of software design principles and how to do things professionally while balancing time/money/legacy/human/etc. concerns. They learned those lessons not by refining their craft over 20 years of working on an old stack, but in just over 2 years of working on a stack where every day they had to ask their "rockstar" colleagues questions like "Why is this simple thing so hard to do? Why do you keep writing the same kind of bug over and over? What alternatives exist to these tools/strategies?" and not getting good answers. These folks didn't code much outside of work, but they did read about other tools/practices in their free time. By the time they left, they learned excellent techniques (and professionalism) out of acute, concentrated discomfort at what they were made to work on, even while everyone else was taking shots of the piss and calling it brandy.

(And yes, I've seen people in that same situation internalize the cancer and become "X. It's the future!" zombies.)

And I've seen 20+ year veterans who were extremely proficient in a very few (one or two) methodologies, who tried to apply those methodologies to everything--even use cases where it didn't make sense, confused their colleagues, and directly induced bugs ("Making a chat app prototype/mockup for management next week? JavaScript doesn't have CORBA support? Let's write low-level TCP code in Node so we can emulate CORBA using a library I'll write for you. Thrift is too new, I don't care if it's already supported; WebSockets are the 'wrong model'; you need a proper brokered architecture for this . . . proof of concept . . . wait, why is my TCP reading code corrupting data because of off-by-one errors in my read loop?").

I've seen very senior programmers whose "breadth" of tools understanding consisted of re-implementing very specific design patterns from e.g. Java or C in a dozen languages, rather than learning to use the tool that was correct for the job.

I've seen extremely experienced people whose expert understanding of a few things concealed intense insecurity about their knowledge in other areas, and who defensively tried to fit many square pegs in round holes--anything from "we need to use this email collaboration tool I'm super good at that everyone (happily using $other_solution already) has never heard of", to crazy implementations of object/factory patterns in functional languages, to bizarre architecture review and management practices. These people were all competent, and very experienced. They were also scared--not of learning new things, but of being inferior in any single way to their colleagues, even after decades coding. That's a shame, since those folks had the most to teach others, if they'd taken time to have a sense of perspective and/or face their fears.

(And yes, I've seen people with that experience be refined into amazing teachers and stellar coders over the years--people who could get deep understanding of a totally unfamiliar tool in a week, be teaching others to use it in two, and be teaching others when not to use it in three.)

TL;DR the choice is not that simple. But I haven't seen a clear correlation either way between time-spent-in-industry alone and professionalism or output quality. Every industry has old deadweight and young cocky rockstar-wannabes.


To clarify: yes. I've seen veteran coders produce piles of C++ templates of templates and Java reflection and configuration nonsense that seemed to solve no problem I've ever been aware of. I've seen technical leads write code where every single field in a class has a one-letter name. And I've seen clear, thoughtful code from people not too experienced (although never a total novice).

But that's my whole point. Age discrimination isn't the only reason 'technical half-life is 3 years' is bogus (not [only] morally outrageous but wasteful of the discriminator's own dollars) -- it's also that we should have actual standards. And not just any standards! Things like code cost and quality vs price. I feel like we've gone a bit through the looking glass because explicit knowledge of how to program well is maybe 15% of the actual craft.

[Unrelated side note: I've heard of people using flash cards to try and learn programming. This is deeply weird to me! But I think it also reflects this 'half life of programming knowledge' way of thinking.]


Even languages and frameworks don't change that much outside of certain niches. Django and Rails have been around for over a decade. Go, a "new" language, is eight years old. I've been working for 5.5 years with a Python-based platform called Odoo (purely by chance of employment) and I don't see it going away any time soon.

A colleague of mine also recently got a very good offer for helping maintain a codebase in Informix-4GL.


I think you have that backwards. It's inside certain niches that people are still building stuff in Rails and other decade(s)-old frameworks. Now it's node.js and react, to name a couple of examples. By 2020 something else will probably be the hotness. By 2025, those frameworks will still be around, running established systems, but won't be very interesting to any new developers.


Even Node.js will be nine years old next May, and it's just a runtime for a 22 year old language.


This is beside the point. Smart companies hire engineers that understand principles and concepts.

Hipster companies hire by buzzword of the year. Stay away from those.


C# has been around since 2002. 15 years. It's changed quite a bit since its introduction. C# is just a wee tot compared to C.


I believe that sentence to be broadly correct. First and foremost, it is making a comparison more than an absolute statement of fact. And (without having first-hand experience), it doesn't seem obviously wrong to say that someone who left for an extended vacation in 2014 would have more surprises waiting for him today than a mechanical engineer who did the same from 1926 to 1929.

Note also that the statement is talking about the "half life", indicating an understanding that there is a part to expertise that is less susceptible to aging. One can criticize that the concept of half life is misapplied, because the underlying process isn't stochastic, i.e. that half of the remaining knowledge will not again halve in another half life. But then one would be splitting half hairs, which of course is one of the engineering skills that never go out of date.
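Purely as an illustration of what the framing would imply if taken literally (not anything the article works out): a "half-life of knowledge" T means the fraction of your expertise still relevant after t years decays exponentially,

    K(t) = K_0 \cdot 2^{-t/T}

so with T = 3 years, only about 25% of what a software engineer knew would still apply after 6 years, and roughly 10% after a decade. Whether the decay really compounds like that is exactly what is in dispute above.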


It wouldn't be wrong to say that I took an extended vacation from early 2015 to mid-2017, if not from the industry entire, then certainly from modern fullstack dev - spent the time working with best-of-2008 because reasons.

Coming back, there weren't all that many surprises. I went from first exposure to ES6, Babel, Webpack, protobuf RPC, and Kubernetes, to shipping, inside a month. And I am not in any way special. So I hope I might be excused for feeling I have some basis for the statement that I think you're vastly overestimating the scale of the issue here.


> and we've stayed current over the years

I think this may be the key.


Kids playing with building blocks that adults made, trying to find a configuration that strikes pay dirt. If everyone over 35 in the tech industry disappeared, the world would come unglued fast.


Yep, that engineering half-life is so much bullshit. Electromagnetics hasn’t changed much in 120 years. Maybe a new transistor topology comes out every 20 years.


This sort of mythology is NOT helping.

Unfortunately it's partially valid. Job ads in particular are (presumably for want of some other way to whittle down the pile of incoming resumes) considerably biased towards frameworks and gizmos that have been around 3 years or less.

Yes there are new languages and frameworks we have to learn, but if we apply any effort ...

You'd think a lot of these places assume we're all utterly incapable of "learning", the way they phrase some of these requirement lists.


> "Dan Lyons, a fifty-one-year-old Newsweek reporter, gets his first shock when he’s laid off. “They can take your salary and hire five kids right out of college,” he’s told"

I wonder to what extent the above is the reason behind senior workers having a harder time finding/maintaining their jobs. It almost sounds like a Keynesian liquidity trap. People expect their salaries to continually grow over the course of their career, but at some point, the value they bring, relative to a fresh grad, may not justify that salary level. Other industries may sidestep this problem since work output often depends on connections and reputation, both of which keep growing in your 40s and 50s, but engineering is different in that people's abilities grow exponentially in the first 10-15 years, but not as dramatically afterwards.

If people were willing to continually update their salary expectations, accept pay cuts when necessary, and signaled this to their current/prospective employers, I wonder if this problem would mostly disappear.

As an employee myself, I fully encourage all my colleagues to negotiate hard for every dollar they are worth. But at the same time, I wouldn't begrudge any employer who's trying to get the most bang for his buck.


> but engineering is different in that people's abilities grow exponentially in the first 10-15 years, but not as dramatically afterwards.

This is only true if you're doing it wrong. I think, at least for most people, their skills and the value they bring tend to grow steadily and linearly throughout their careers until they retire. It's just that, after a certain point, companies believe they'd rather have another fresh body than a more skilled professional. So, skills and salaries grow linearly for a few years, at which point companies are willing to pay no more, salaries plateau, and people jump ship to other careers. I'd still be coding today if I could count on even a few % year-on-year salary growth, but I hit the same plateau we all do and moved on to other things.


The problem isn't ageism, the problem is capitalism. For a long period of time in the (not too distant) past, one of the main purposes of a company was to pay employees. Corporations in the U.S. have continually shifted goals to eschew all goals other than raw profit and growth (leading to more profit). It is EXTREMELY rare to see a tech company which offers its employees more than they are worth on the market or spends more money than is absolutely necessary to keep its employees on benefits.


>For a long period of time in the (not too distant) past, one of the main purposes of a company was to pay employees.

I don't really disagree with your sentiment, but I don't think that statement has ever been true.


How do you signal that you'd be willing to work for low pay though? That would look really weird on a resume.


You signal it with a college graduation date close to today's date.


>> With the advent of the cloud and off-the-shelf A.P.I.s—the building blocks of sites and apps—all you really need to launch a startup is a bold idea. Silicon Valley believes that bold ideas are the province of the young.

Well, right now I don't see any "bold" ideas coming out of SV. What I see is everyone doing "X with deep learning", or "Facebook for Y", or "Uber for Z but with deep learning". That's, like, the opposite of bold. It's everyone trying to milk the same cow before its teats fall off.

But even so - there's an obvious survivorship fallacy at work here. If SV is throwing a ton of money at graduates and school leavers to come and work in its startups, it kind of makes sense that the successful startups - few and far between as they may be - will be made up of younger programmers.

Anyway, older programmers don't have to go to SV and join a startup. There must be millions of companies in the world that use software, most of it legacy - from big corporations to small mom-and-pop shops. Anyone who knows how to debug and refactor can make a comfortable living these days.

In fact, "Pourvu que ça dure" - long may it last.


> What I see is everyone doing "X with deep learning"

Agreed. Even Google is guilty. Press releases from them that don't mention "Machine Learning" are becoming rare. "Machine Learning" is the new "cloud" buzzword.


Google is also responsible for creating many of the success stories for deep learning. So I think this might be a little unfair to them. They created the bandwagon more than they jumped on it, and they've gotten phenomenal results so far in general. (This seems to be a general trend for Jeff Dean's career overall.)


I did a little experiment a while ago: I made a two-word update to my LinkedIn profile adding the phrase "Machine Learning" for a week just to see what happened, changing nothing else. Inundation would be a good way to describe the result. Tech hiring is so stupidly trend-following it's not even funny.


First, ageism is not just a SV phenomenon. For example, see the very young White House aides in Clinton, Bush, & Obama administrations.[1]

Second, the tech companies that have a reputation for favoring 20-somethings will still hire older workers for certain positions.

- Even though Mark Zuckerberg said, "younger people are just smarter", he hired an "old" person like Sheryl Sandberg to be COO at age ~38.

- Bill Gates was ~32 when he hired Dave Cutler age ~46 to architect and develop Windows NT.

- Google hired Eric Schmidt ~46, Guido van Rossum (Python) ~age 49, and Peter Norvig in his 40s

- Netflix hired Adrian Cockcroft in his mid-40s

The SV companies will exhibit "reverse-age-discrimination" for executive management and architect positions. Why? (And the obvious answer is "experience".) But then, why is "experience" not valued for the non-architect programming positions? We actually do discriminate against young people for many roles. I leave readers to sort out why we value experience selectively for different situations.

[1] https://news.ycombinator.com/item?id=15350873


If I were to play the devil's advocate, they hired people who could fill the actual role requirements, or who had the right image they wanted in the company. The experience required to fill a COO role is going to be significantly higher than the experience possessed by a college graduate.

The experience bar to have Guido van Rossum's image is pretty high too. ;) But how many Guido van Rossums can a company afford to hire; how many do they need to hire?

I think the problem is that for the lesser jobs the minimum viable candidate is chosen because there's a relatively low perceived cost for choosing the wrong cog. Even Google, with their "Better to skip 99 good candidates than hire one bad one" mantra won't be impacted in any appreciable fashion by a less-skilled hire.

In fact, any candidate with greater than the minimal viable skills (i.e. someone in the industry for 10+ years) is usually seen as being too qualified (thus, too expensive) for the role being filled. Why hire an architect to move gravel? They'll just be unhappy with the job, and immediately seek to leave the role they were hired for.

At least, that's the perception.


It's worth noting that the median age of an American is 41. That we're discussing people in that age range as counterexamples kind of proves the point.


>median age of an American is 41. That we're discussing people in that age range as counterexamples kind of proves the point.

Don't take my previous post as a complete dataset. It was only coincidence that I happen to list programmers in their 40s off the top of my head. Here are some more in their 60s:

- Google also hired Vint Cerf (TCPIP) at age 62.

- Amazon AWS recently hired James Gosling (Java) age ~62.

- PayPal hired Douglas Crockford (Javascript book author) at age ~57.

My point is that SV companies (and other industries besides tech) will actually discriminate against the young 20-somethings for certain roles.


> will actually discriminate against the young 20-somethings for certain roles.

Uhh that's not age discrimination, that's creating a job role "Chief Internet Evangelist" and hiring the best person for the job, "Vint Cerf". Do you think anyone other than the creator of the TCPIP protocol could fill that role, regardless of age?

Also, isn't there also an implicit age discrimination against younger people in any job spec when you say "X years experience"?


> isn't there also an implicit age discrimination against younger people in any job spec when you say "X years experience"?

Yes there is, and it's lazy and wrong. Figure out how to test for the skills you need; how many years someone's been using something is only very weakly correlated with how well they know it.


> Figure out how to test for the skills you need;

If you know how to systematically do this for all hiring positions, I literally have thousands of businesses that would pay you serious money to help them.


>Do you think anyone other than the creator of the TCPIP protocol could fill that role regardless of age?

Not to argue with you, but if we want to have fun with the question... it's not "Chief TCPIP Evangelist" but "Chief _Internet_ Evangelist".

If "internet" is the key word, we don't have to be limited to an arbitrary point in the network stack and pick a person from it. If we go lower and more foundational to the "internet", which is "packet switching", maybe Leonard Kleinrock[1] is a more appropriate "Internet Evangelist". If we go higher in the stack to http, maybe Tim Berners-Lee is the better fit for "Internet Evangelist".

If we really embrace "internet evangelist", maybe the best person is none of the technical guys but a 19-year old college dropout from Africa. Conceivably, that type of person would have more to gain from a better internet so his brand of "evangelism" for internet's future growth would be more effective than any of the older white guys from 1st world countries.

>Also, isn't there also an implicit age discrimination against younger people in any job spec when you say "X years experience"?

Sure. One could also argue that there's stealth age discrimination when your job requirements list "Rust programming language experience" since more 20-year olds than 50-year olds will learn that language.

[1] https://www.youtube.com/watch?v=dRlZcKUp2vo


> If we really embrace "internet evangelist", maybe the best person is none of the technical guys but a 19-year old college dropout from Africa.

Ok, now you're just being contrarian, but I'll play along. Maybe you're right, maybe the 19-year old college dropout from Africa is the best candidate for the position. So while you play armchair head of HR, care to explain to me how you unanimously align the merit of every candidate in the pool with respective open positions in corporate entities?

> One could also argue that there's stealth age discrimination when your job requirements list

So then shouldn't it be illegal to list # years of experience as a requirement?


>Ok, now you're just being contrarian, but I'll play along. Maybe you're right, maybe the 19-year old college dropout

The point wasn't to annoy you. I used the college dropout as an extreme example because when Vint Cerf gives talks under the banner of "internet evangelism", he doesn't actually use any of his technical TCPIP specific knowledge to do the cheerleading. (But I'm also not saying the Cerf is the wrong guy for that role.)

>explain to me how you unanimously align the merit of every candidate in the pool with respective open positions

This is actually impossible to accomplish. The problem is:

(1) desirable characteristics are multidimensional

(2) undesirable characteristics are also multidimensional

(3) "fairness" of evaluation and outcome cannot be unanimously agreed upon from the employer _and_ from the candidate _and_ from the outside non-participants that compile statistics on job acceptance/rejection interactions

(4) any qualifications with a non-age & non-years-experience label such as "Rust expertise", or "culture fit", or "on-call weekend rotations required" will be _perceived_ as "age discrimination" or other discrimination. This is related to (3), that "fairness" cannot universally be agreed upon.

All items above are also subjective. We are compressing multidimensional factors into a single outcome of "accept/reject". The multidimensions remain subjective and possibly unable to be articulated but the rejection of a candidate is an objective fact and is directly observable. There is no difficulty articulating the binary outcome that the candidate was accepted/rejected.

>So then shouldn't it be illegal to list # years of experience as a requirement?

I think you interpreted an unintended idea from my "Rust" example. I'm saying that even if an employer specifies a desired attribute in the job listing with no malicious intention of age discrimination, the actual candidate pool will be skewed and be _perceived_ as age discrimination and thus -- it can be argued by some that it _is_ "stealth age discrimination".

For example, if a job listing specifies "candidate will have enterprise resource management expertise with at least 2 previous full life-cycle implementations from fit requirements gathering to production go live", the candidate pool will heavily skew towards the 30-somethings and 40-somethings. Complex ERP implementations can take ~5+ years so asking the candidate to have 2 of them under their belt will be near impossible for 20-somethings.

Likewise, a COO job listing requiring "previous profit & loss responsibility of at least $10 million" will heavily skew towards 40-somethings and 50-somethings. Very few 20-somethings will have leadership responsibility over a $10 million division -- unless he was a young founder of his own company like Mark Zuckerberg. (And obviously, 20-year old founders do not answer others' corporate job listings so they won't be in the candidate pool.)

Even if employers leave out words like "years of experience" and instead, substitute "expertise" or "previous responsibilities", the world can always find an argument for "unfairness".

My comment you replied to was just observing an unavoidable perception rather than making any recommendations on how to write job postings.


> ...will actually discriminate against the young 20-somethings for certain roles.

Sure. They won't seem "authoritative" or something vague like that. They'll be passed up for leadership roles because they're too "green".

But you can have discrimination against young 20-somethings and the 40+ crowd at the same time.


> First, ageism is not just a SV phenomenon. For example, see the White House aides in Clinton, Bush, & Obama administrations.

And the office of the president itself. The minimum age is 35.


So? By definition, aides (aka SPADs, special political advisors) are junior political wonks who want to make politics a career: they do a PPE at Oxbridge and work for a pittance for a party or an MP/Congressman/Senator for a chance to climb the greasy pole.


>by definition aides aka SPADS (special political advisors) are junior political wonks

There must be some misunderstanding. Are you from the UK and unfamiliar with the USA?

To be clear, there is no "by definition" that the office aide sitting next to the President of the USA must be a junior 20-something worker. There is no "junior" in the definition of that role. Theoretically, the gatekeeper desks could be staffed by a 49-year-old or a 70-year-old person who wants to work at the White House. My point is that the 20-somethings will be favored over older workers in _non-SV_ jobs like that. People keep making the mistake that ageism is mostly specific to Silicon Valley. It isn't. It's common everywhere in non-computer industries.


Tactical vs strategic.

At the implementation level, software throughput is higher when you are younger. Validating/killing ideas and making money at large scale do not require experience, not that much.

At the strategic level, things change.


The code I wrote in my 20s was pure garbage. The code I wrote in my 30s was embarrassing. The code I'm writing in my 40s is perhaps passable. I'm hoping that by the time I get to my 60s I will be able to write good code. I'll probably retire just as I'm hitting my peak as a programmer.


In my 40s and feeling the same. It's the sad irony of life that just as we get truly good at this stuff, we either retire or move into management.


So in other words, "You either die a hero or you live long enough to see yourself become the villain." :)


A challenge of our increasingly advanced technologies.

Some of this we can work around with abstraction, but sometimes I wonder if an extension to productive working years will be a critical step to keep advancing science and technology. The more the existing body of knowledge grows, the longer it takes to study & train up to the forefront of development.


I don't understand this mentality. Don't you think you're being hard on yourself? How do you define "good" code?


A few weeks ago I attended a conference for startups.

Met one that was doing R&D projects for big corps and used "young" freelancers for it, "because they know all the new tech!"

But what I heard was "because they're cheap!"

In my experience, young people won't hire older people unless they are really the cream of the crop.

The only companies I worked for that had >40 or even >50 year old people were those with >40 year old directors.


I always love to hear about cultures where old people are the most respected of a society. I think such cultures are happier, because they ensure that the future looks bright, no matter how old someone is.

But ... such cultures might be in conflict with the information society we have built.


Not really.

Ageism seems largely concentrated on the Silicon-Valley style society.

As far as I am concerned, ageism does not exist at my workplace. Old engineers, even software engineers, are sought out because they bring possibly valuable experience.

The simple truth is that ageism is a result of not valuing the lessons of an experienced engineer because they're not up-to-date enough.


"Ageism seems largely concentrated on the Silicon-Valley style society."

I saw ageism many years ago at a big aerospace firm in the L.A. area. It was understood, and even said openly, that young engineers would get the call to work extra hours and weekends if needed, as well as the occasional graveyard shift when a unit was in thermovac testing. Older engineers were not usually called upon because they had "paid their dues" as young engineers. Colleagues at other big aerospace firms said it was pretty much the same for them. Ageism for sure, in hardware development, and not in Silicon Valley.


That's not ageism; that's a gentle form of hazing, and to be found in almost any profession or trade. An extreme example is what medical students go through in the early years of their study; a mild example is how, when I worked nominally as a cashier in a grocery store as a high school kid, the more experienced folks ran me from pillar to post for my first couple weeks' worth of shifts, doing everything from stocking to bagging and hauling to breaking down boxes and cleaning up spills, including one especially memorable spray of vomit from a kid who'd eaten too much of the extremely wrong thing.

The point is to find out whether you're game, and whether you'll work hard when you need to. These are important traits in the makeup of a worthwhile colleague, and those who would invest considerable of their own time and effort in mentoring you want to make sure they won't be wasted. This is how that's done. And, yeah, it's a pain in the ass for a while. But once you've proven that, as it was once put to me, "you ain't no scrub", that's an end to it.


Nah, as the older engineer, I can say with conviction that it is simply taking advantage of younger colleagues. The older engineer is using his connections, status in the group, or friendship with decision makers to avoid the uncomfortable shift. There is no other point. Occasions to find out whether you work hard tend to arise naturally, without putting all the weekends on younger colleagues.

Also, while mentoring is an important part of a senior's work, it is not "own time and effort". It is paid, on-the-clock time and effort. I am not mentoring overtime, for sure.

Hazing is not rational. Its main benefit is that incumbents feel good about the power trip they are allowed to go on. Plus a bit of good feeling when someone else has to go through the uncomfortable thing you yourself went through in the past. Nothing more, nothing less.


Paid time and effort that could be better spent than on mentoring someone who'll make no good use of the knowledge you're trying to impart, then.

Part of learning your craft is learning how not to get owned, too.


What does someone not using knowledge have to do with younger employees having to do all the weekends due to hazing and "unspoken rules"? Anyway, if you are not willing to explain, then your senior salary is dead weight to the company.

Another part of learning the craft is to recognize when someone else is being owned. It is a bit harder.


One wonders whether a senior engineer who does so recognize has put a word in the ears of the juniors involved, if he feels they are indeed so hardly used.


And I'm sure the older engineers will be more likely to be called upon when the situation demands experience, no?

I fail to see the harm in "discriminating" a bit against the younger engineers by having them collect more experience.


Having done my share of babysitting units through thermovac I can say with some authority that the main experience is in personally observing the effects of sleep deprivation and mind-numbing boredom.


"Older engineers were not usually called upon become they had "paid their dues" as young engineers."

Old engineers were not usually called upon because they just said "no" when asked, so management quickly learned to ask the young engineers who said "yes" instead.


Setting ageism aside (which is certainly real), perhaps it’s also related to what the company is optimizing for.

For example, it’s now clear that companies like Twitter only optimized for number of supposed real users.

I’m sure they had no appetite to hear lessons from an experienced engineer.

“Hey, what if this platform becomes a cesspool — a potent vector for hate and fraud, infested with fake accounts?”

“Shut up, we’ll solve that problem later, after we get big fast, assuming the world still exists.”


"You'll be working here to build a massive, deliberately addictive, privacy invading societal ill full of fake news and skinner-box games."

20-something: "Sign me up, bro! This entry-level salary is more than twice what my parents make combined!"

30-something: "I'm torn, but I still have student loans to pay off..."

40-something: "Will I be proud to tell my kids I built this?"


the young have student loans, but the old don't have, say, mortgages or medical bills?


That is certainly also part of the SV "Get big get rich" mentality.

Some corporations certainly do value building a safe and sound baseline of quality to stand on where they can service customers reliably and to utmost satisfaction.


I wish this was true in India. In India, we love to parade around saying respect our elders. But when it comes to Indian IT companies, it literally is a pyramid. I used to wonder, and still do, where all the guys with 10+ years of experience go. In almost every team I have been on, it's predominantly freshers. In India, once you have 8+ years, I have heard the number of calls one gets from recruiters is far lower. And if you are not a team lead or something by then, they may look at you as though something is wrong with you, even when you are there to solve problems they cannot.


Maybe they have too many people?


It's true that having too many people gives the companies an advantage. I think the below reasons also contribute:

1. Good chunk of outsourced work doesn't need much knowledge or experience.

2. Customers who outsource do so because they want to save money. This means they will say they cannot pay more than x dollars per hour per resource. This will automatically eliminate experienced guys because they would expect more salary.

3. Freshers will literally spend all their time in office for really low pay.


Maybe the only thing they care about is keeping salaries low?


I would be interested to hear stories about whether it changes as demographics changes. If there is a declining birth rate, does culture move toward the middle aged? I would like to think so, but I've read that Japan is still a youth culture despite aging population.

Maybe the opposite is true, as the article implies. When mortality is high and there are fewer old people they acquire value through scarcity.


I've met plenty of people up in age who can adapt and learn new technologies. There is no impairment from age, at least for most people until their 70s, that really stops them from learning new things or adapting to change. I've worked with a music instructor in his 60s who runs his own website and actively learns new web tech just because he thinks it's cool, as an anecdotal recent example.

It's in the same cultural structure as the general anti-intellectualism movement. The older you are, the more memories and experiences you have, the more sunk cost you have in the skills you already acquired, and the more resentful you are at having to acknowledge their obsolescence.

It becomes really easy to just say no to adapting to changing in-demand skills when you have some amassed capital. That is the entire argument Trump made regarding coal miners - obsoleted workers who don't want to change and a president who promised to make the world stop changing for them.

In practice there is legitimate ageism, especially in startup culture. But there are super strong parallels between ageism and sexism - there is the structural and the intentional. With each:

Explicit Sexism: Radical Tumblr feminists, "Incels", "Women belong in the kitchen", actively choosing to not hire an equally skilled person because they are a certain gender, or actively and intentionally paying someone less because of their gender for the same job.

Explicit Ageism: Old people can't learn new tricks, "millennials are ruining X", not hiring someone due to just age.

Structural Sexism: Women going into less in-demand or lower paying careers, leading to the average woman making less money per hour spent working than the average man.

Structural Ageism: Wanting to pay less for a job, regardless of skill, thus discriminating against those with lots of experience (ie, older people).

The former are just obviously evil; the latter are consequences of the way we prioritize our society. Getting rid of the former is generally easy because you can just denounce sexists and ageists. Getting rid of the latter requires whole cultures to shift, which is much harder to do.

But ageism isn't that unique among most other discriminations by physical characteristics. Humanity runs the gamut on ways to make other people's lives worse due to circumstances beyond their control.


I prefer cultures that respect senior members, but make decisions based on the merits of the options proposed.


There can be plenty of ageism in those cultures as well. It just works the other way around.

After all, ageism is discrimination and discrimination can work in every direction.


A good way to counter this is to become an expert in some combination of "something old + something (somewhat) new". Like knowing bog standard java well, but also knowing how to deploy/use it in AWS well. Or being a C expert that also understands git in a deep way.


Or just pick a good technology going forward. Experts in C and Java are still in demand. There's no urgent need to branch out from those core competencies (well, C devs should probably have some C++ skills too). Old+New is an interesting and likely beneficial career plan, though.


I love C, but haven't done professional work in it for ten years - where are people looking for C programmers?


"pick a good technology going forward"

That's hard to pull off though. Who knows which technology will go out of fashion in a few years? It may become fashionable again a few years later, but in the meantime you will have a hard time.


> Who knows which technology will go out of fashion in a few years?

Tech that is in heavy use and has been in heavy use for a decade is much more likely to be around in two years than tech that was invented two years ago. If the older tech were to decline, rarely would it drop off the face of the earth quickly. It would more gradually shrink in market share.


I used to be a hardcore C++ guy. The going was really good until around 2004 or so, and then suddenly nobody wanted C++ anymore. That lasted a few years and suddenly demand picked up again. It's definitely a gamble, with factors that are outside one's control.


>> Zuckerberg once observed, “Young people are just smarter,”

When he's old that will change to "older people are just more experienced".

I mean "it will change" as the Seven Commandments changed in the Animal Farm: you won't be able to find the original quote anywhere on the internet, anymore; only the retconned one. Zuck will always have been wise beyond his years.

You wait and see.

(I exaggerate)


It sounds like we are seeing work become sport, with all the positives and negatives sport has. Maybe seeing it as ageism isn't helpful. Ageism doesn't exist in sport because the abilities you need can and do decline with age. I wonder, for all the talk about "I'm just as good", if people really are; it's honestly ridiculous to assume a forty-year-old is as elastic or possessed of the resiliency or endurance to keep reinventing their entire knowledge set every few years.


It's been said elsewhere in this thread, but there is a distinct difference between the knowledge you accumulate in years of engineering and the constant cycle of learning the new framework.

Anyone can learn a new framework. Making good architectural and design decisions generally takes experience.


Well, I am 37 and in my Junior year of a Computer Science degree. I'm not gonna lie, this freaks me out a bit. I do live on the east coast though, hopefully that will help some.


I'm not kidding here…

If you're eligible for a clearance, move to MD or VA and work for a DoD contracting company.

I'm 29, but most of the people I work with are 40-60+. Ageism simply doesn't exist here and the salaries (6+ figures for new grads) and benefits are excellent.


If you don't want to live in the DC Metro, you can also move to any city adjacent to a major military base. Huntsville, NW Florida, San Diego, Clarksville, Waukegan, Ft. Worth, etc.

Salaries are lower, but traffic is better.


Great advice. I'll look into it. Thanks.


I graduated at 36 (21 years ago). Admittedly, I very wisely picked the best time in history to pick my major (Computer Engineering, and that was because there was no local school offering Chem Eng, and I saw a physics student playing with fractals and thought, "wow, that's cool"). Of course, I am pretty sure that kind of mindset is why I do OK; I have always been fascinated with just about everything, which makes learning fun.


Trust me, it's not a problem. I got a CS degree at 35, now I'm 46, and I look "young" for my age (30s, probably). I theorize this leads everyone to assume/perceive I'm younger than I actually am. In my location (Midwest, metro of a major city) senior developers are in hot demand, so I've never perceived ageism to be a problem.


Good move. LinkedIn doesn't show your age but does show the year you graduated. I have a customer where my bosses are 20-somethings who were a bit surprised when they met me IRL and saw a guy in his late 30s walking in. Graduating in 2012 really helped, I guess.

I'm thinking about getting a master's sometime. Graduating in 2012 makes you look like an old person ten years later, doesn't it?


So do I, and I haven't fallen foul of it yet.


As the bulge in older people progresses and ageism becomes worse in an irrational way, it may make sense for older people to come together and form companies of their own, focused on selling to older people, who, of course, have the wealth (if not always the income).


You bring up a good point: Despite older people having all of the wealth in the world (and likely a great deal of the income), few startups seem to be courting them as customers. Everyone's target end-user seems to be that mythical 15-25 year old middle class Ideal Consumer.


I'm less certain older people have such a big chunk of income (from labour, at least). A lot of wealth is tied up in real estate and pensions. Not everybody climbs all the way up the pyramid to make the big bucks; but people do accumulate assets.


Could it be because older people are also much more unwilling to try something new? Neophilia seems to be a feature of youth.


I don't have any numbers on this and would welcome any, but it seems true. How many young people are willing to sign up for $hot_new_website_of_the_week vs older people? My personal experience is that older people realize that the website will likely disappear within the next 2-3 years, so they ignore it until it actually catches on. Younger people seem willing to sign up for anything new.

How many young vs old people used Myspace or Facebook when they first came out? How many young vs old people use Instagram, WhatsApp, etc.?

Once again, I have no numbers, but I'm guessing it's heavily skewed towards younger people.


It's a chicken-and-egg problem: I'd propose old people are not trying $hot_new_website_of_the_week because they are not in $hot_new_website_of_the_week's target market, and they are not in $hot_new_website_of_the_week's target market because they are old.

I consider myself an early technology adopter, have been all my life. But as I get older I'm less interested, not because I'm older, but because new technology companies are actively choosing to ignore my needs as a customer and my interests.


As you age, you become less motivated by trends and fads. The more seasons of fashion nouvelle you've seen, the more you recognize the Next Big Thing as being mere Pied Piper ephemera.

Soon enough, the major cultural inflection points of leaving elementary, HS, and college will fall behind you. Your job and family take center stage, and your free time shrinks along with the positive reinforcement you get from lightweight social immersion. Then the bloom is off the rose of those latest startups as your net-social focus shifts into more of a spectator mode, posting family photos on Facebook for those now distant friends of yore while caring for your kids, and then caring for your parents. Before you know it, 30 years have passed since you were tuned in and turned on by your last net pet rock.

In many ways, SV startups are truly a young person's game.


I think the reason for tech ageism is simple: generational conflict, peppered with historical perceptions of how older tech folks work (stubborn, "slow" to adapt, too hierarchical, and other common views, as if all the old folks were baby boomers trying to step into tech for the first time just today). It takes time for this prejudice to spread in waves over the years, so only recently has this ageist view become critical in tech, I guess. It will also take some time for the affected old folks and the young ones to realize that the current "old" people were actually young back in the 90s, so ageism will slowly stop being a problem in, say, 10 years from now (I believe peak ageism is behind us). But perhaps I'm being naive, and for lots of 20-somethings out there being over 40 in tech is indeed a career death sentence.


> “Those holding more negative age stereotypes earlier in life had significantly steeper hippocampal volume loss and significantly greater accumulation of neurofibrillary tangles and amyloid plaques."

I'm a bit disgusted that the article asserts this is somehow causative - that ageists age worse for some sort of karmic reason.

The screamingly-obvious interpretation of that data is that people who see bad examples of aging hold more negative stereotypes. If your whole family has a history of early-onset dementia, you'll probably view aging as a horrible process. And, surprise, you'll probably have rapid cognitive decline just like all the people who shaped your opinion. Just about every age-related form of decline has a major genetic component, and yet Friend insists that fear of aging is 'karma'.


I'm a bit disgusted that you're asserting that the article asserts a causative relationship when it does no such thing.

Here's the paragraph in question, in full: Karma's a bitch: the Baltimore Longitudinal Study of Aging reports, "Those holding more negative age stereotypes earlier in life had significantly steeper hippocampal volume loss and significantly greater accumulation of neurofibrillary tangles and amyloid plaques." Ageists become the senescent figures they once abhorred.

Nowhere does it imply any causation. The reference to "Karma" isn't the author professing his belief in a cosmic power's intervention to establish justice among the people. It's just a rhetorical device to highlight the finding.


Ageism is exacerbated by technology.

Pervasive communications technology has increased the number of choices that we have. We are suddenly all in competition with many more people. That means that superficial signals like age and culture rise in importance just to reduce the search space. It's Tinder-ification.


I think it is a mistake to think that ageism only describes a preference for young over old. The exact opposite has been/is the case for many industries in many countries.

Ageism is bad, regardless of age.


I agree, but it is important to make a distinction between ageism and roles genuinely requiring a certain amount of experience.

I wouldn't want a college graduate to be the one architecting the infrastructure for our company's golden goose.


> With the advent of the cloud and off-the-shelf A.P.I.s—the building blocks of sites and apps—all you really need to launch a startup is a bold idea.

With the advent of operating systems and the standard libraries - the building blocks of applications and widgets - all you really need to launch a startup is a bold idea.

I'm not entirely sure how I feel about trivialising tech startups - sure, some of them are trivial. At the same time, I don't really like that the venerable "idea people" are being encouraged even more.


"I'm an 'ideas guy'; all those details like implementation cost and monetization strategy and stuff are best left to others, so people in my position can have more time to think up the next cool thing!" <finger guns>


If the tech industry weren't mostly located in SV, with all its housing problems, it wouldn't have nearly as much ageism.

Of course the 20-something who's running a rat race to pay overpriced rent on a poorly maintained dump with no hope of ever owning real property resents the 50-something who owns the place. Of course that trickles down a little resentment for all "older" people.


This is utterly baseless speculation. Who thinks about how old their landlord is? Why does ageism exist outside of CA?


Ageism flows both ways.

I recently had a landlord demand that I get a guarantor for signing a lease even though I make well over 40x the rent (the typical requirement). It's clearly because he (an elderly man) considers me (a young guy in my 20s) untrustworthy/unreliable.


Is it just me, or was there a generation for whom it was hit or miss whether they regarded learning entirely new technologies as a routine part of the job? Among people who would be over, say, roughly 60-65 years old right now, it would be pretty cool to meet someone who kept abreast of developments in the field and was excited when they spotted something that could be an improvement on the tech they currently worked with. Those people were exceptional, and they were always pretty smart. The norm was to want to spend the next ten years working on the same thing you spent the last ten years on.

I think people younger than that (people under 60 now) have internalized the ethos of always being ready to roll with change. Who moved my cheese, all that stuff. Even if it's hard for them, they at least accept it as fact and give it lip service. It's been a huge cultural shift to an attitude that everyone has to spend their whole lives learning and adapting. That attitude is native for my generation; in school we read about the Rust Belt and the collapse of American manufacturing and we all said, "What was wrong with those people? Why didn't they adapt, move to a different economic sector, and move where the jobs were?" That's the way we were programmed to think. My father's generation understood the Rust Belt differently. They all hoped they would have done better, but failing to adapt was a normal response that they could easily sympathize with. That generation was not on notice to always be ready for change.

I think this cultural shift is going to result in a decline in ageism. The older engineers I've worked with in the last ten years have behaved like any other engineers, except they happen to have a lot of experience. Unlike many older people I worked with early in my career, they aren't cranky about their favorite tools declining in popularity. They don't drag their feet learning the team's new tech stack and rationalize their refusal by predicting widespread failure and a mass return to last decade's technology. There was a time when that was the way older people were supposed to act and there was no embarrassment about it. In a younger worker behavior like that could be treated as an attitude problem, but in an older worker, it was tricky, because social norms said they were entitled to act that way.

Now the expectations for older people have utterly changed. I can't imagine a 55-year-old engineer pulling that shit in 2017, so there's no reason to be afraid of hiring them.


>“The irony of man’s condition is that the deepest need is to be free of the anxiety of death and annihilation; but it is life itself which awakens it, and so we must shrink from being fully alive.”

I'm less afraid of death than of not fully living my life.


Why do people who write these stupid articles think it's alright to assume that younger employees are less intelligent based on their age, but not to assume that older are out-of-touch or have misplaced priorities based on their age?


I never understood ageism. We all get older; it's a fact of life. So why look down on older people and write them off?

I wonder how millennials will talk when they themselves turn 50. Maybe they'll turn the tables and 50 will be the new 20.


Ageism <-> Burnout

Change what you call it and the conversation changes. It depends on the narrative the author wants to push. Those two things are related.

The middle-aged software dev you burned out and then tossed aside has the children most likely to become software devs. You spoil your reputation with the kids before they're even old enough to apply at your company.

You've also increased the chance that, due to unstable home life thanks to layoff, the kids won't ever become software devs in the future. Then you complain about needing H1Bs. Those things are also tightly related.

The Zuckerbergs of the world are too immature to see cause and effect. They only see next quarter's results while driving society to ruin.


The idea of professional senescence beginning with the fourth decade of life seems to originate in academia, where publication of original research is the figure of merit for professional value, and does indeed appear, in the aggregate, to peak late in the third decade.

But academia is a profession more unlike than like our own. We do, after all, call ourselves not scientists but engineers. While the role of an engineer may include the invention of novel applications or even novel theory, it need not; what we do revolves instead around the application of ingenuity to the solution of real problems, and I see no reason to assume that facility of ingenuity or skill in its application are dependent upon youth. On the contrary, the accrual of experiential knowledge which comes with a long period of practice in the field makes a canny engineer more capable, rather than less so.

Not to say that there's no place for genius, of course - but, again, genius is hardly the sole province of youth. And ours is a young field, as the professions go; ours is the third or perhaps the fourth generation in all human history where the phrase "software engineer" has even had a defined meaning, and so it's reasonable to expect we may still be somewhat bound by the traditions of the theoretical fields in which our own so recently originated - we haven't really had time to establish traditions of our own, and the field is something of a free-for-all as a result. Combine all that with one of our preeminent centers of practice by happenstance being a place which has fetishized youth and beauty for well longer than our field's been around - and which has also, separately and for quite a few more years, made an industry of exporting its own culture to our country and the world - and it's not hard to see how we might end up with the kinds of attitudes here under discussion.

There is, though, cause for considerable optimism in the fact that Silicon Valley is not the reliable metonym for software engineering culture which it is so often taken to be, whether by the self-congratulatory among its own inhabitants or by those naïve or ignorant enough to believe the hype they expound. Sure - the field was in considerable part incubated there, and largely codified there to the extent it has been codified at all. But that a thing was invented in a place does not give that place an exclusive, permanent claim on defining the nature of the thing.

Sure - if you're forty or fifty and you want to work for Google as a software engineer, you're going to have a bad time, not least because you'll probably be interviewed by infants who may find you reminiscent of the fathers in their relationships with whom they are not yet fully past the adversarial phase. But there are many places to work other than Google, and many more preferable places to work and to live than Silicon Valley. There is no reason for anyone, but especially anyone who knows better, to subject herself to the programmer's equivalent of the Hollywood talent mill - often, as we have lately been hearing in the news, complete with "casting couches" for those who happen to suit the sexual fancy of one who happens to hold some power.

Come and join us out here in the real world! We'll be happy to have you among us, and you can make a life for yourself here. Believe it or not, we even have some houses that sell for under a million!


Silicon Valley, in particular, has always seemed to me to fetishise the pseudo-legendary brilliant young school leaver overachiever superstar. Anyone who hasn't burned out by the time they hit 40 is clearly either lazy or an underachiever, and therefore not the sort of person we would want to hire.

The Valley's most cherished myths are amongst some of the most societally detrimental.

(Disclosure: I'm 40 and work in tech. This sort of thing gives me waking nightmares.)


How about y'all stop complaining about everything


Would you please stop posting like this, read the guidelines, and then comment civilly and substantively?

https://news.ycombinator.com/newsguidelines.html


It was a substantive comment, and I agree I could have been nicer. The negative emotional overtone was a part of the message.


I really hope that the ageist title was on purpose.



