On Getting Older in Tech (corgibytes.com)
499 points by mindfulgeek on Dec 9, 2016 | 417 comments



I'm going to apologize ahead of time: this might be a ramble.

63 year old white guy with little hair and a lifelong beard that is now white. A tad overweight as well. I feel for so many people expressing angst about ageism. I've seen it elsewhere but not where I work now.

I suspect that faster-growing companies and companies in tech centers, mostly on the coasts, see more pronounced ageism.

My last job was at a bank in Richmond VA and there was clear ageism in IT when I got there. I moved from IT to compliance for two years; I was very successful and never observed ageism.

At 53 I moved to a web development manager and developer role in Higher Ed. I took a gigantic pay cut to work in Higher Ed, if you factor in bonuses and options. But I got to send my oldest to college at a selective school for free, a $200K after-tax benefit.

I don't look back. My life is so much better now with a 40-hour work week and being 100% in control of how we architect our web and backend ecosystem. I spend more than 40 hours because I love learning, but I choose when, where, and what I learn and work on after 40 hours.

I read these comments from people that are 36, 40, 40+ and shake my head. That is not old. 63 is not even old. I plan on working till I am at least 67. I love my job and I especially love the people I work with.

I find these days I spend less time coding and I end up with better applications because I think through the design before coding.

I'm going to ask around in Richmond VA and see what ageism exists in industry here and report back.

Keep learning, try as hard as you can to stay in shape and engage in critical thinking. Good luck to each of you in staying employed and staying happy.


"I find these days I spend less time coding and I end up with better applications because I think through the design before coding."

^A golden nugget buried in an insightful ramble.


And you don't have to be a certain age to learn that lesson and put it in practice.


Absolutely right. In my job, however, I find that my seniority gives me time to think more. It depends on your circumstances, the work culture, and your relationship with your boss.

I report to the CIO and he has little need to know details. He cares about long term progress and the big picture. He could care less about how we get there as long as we are taking care of people and doing it honestly.


Totally tongue in cheek, but you're never too old to learn that you should have written "He couldn't care less about how we get there..."


Definitely, but the key is to actually THINK and reason about it, not just worry/stress about it!

The great irony is that I've seen teams try to be agile, which they seem to think means code > design. It often ends poorly.


Thanks buddy! 50+ here and optimistic I can remain gainfully employed, but nervous about it. It's been my experience as well that the kind of companies you might see parodied on "Silicon Valley" have a more pronounced age bias, but there are lots of other places that need IT/web and have no such issues.

I was in a room full of developers when I read aloud the story about the guy who started oldgeekjobs.com, who defined old geeks as over 35. The room erupted with the loudest "F U" I've ever heard.


Yup. I was like no Fing way when I read that one. Keep up the optimism.


lol


Was it a pay cut, or really just electing to convert less time into money?

So many salaries look and sound impressive until you actually do the math on what it costs you.


It was both. With bonuses and options + salary I took about a 50% pay cut. My hours went from an average of maybe 70/week to 40/week. My kids, now grown, thought I was a grump. Now they enjoy me being around. Same for my wife. Most importantly, I was having health issues that fewer hours and less stress allowed me to get under control.

As I said, I have never looked back. I don't miss the money because I don't miss the stress.


It's called compensation for a reason. Sounds like you were compensated for missing out on your family and for trading years of healthy life. Not so tempting when phrased like this, but we often only realise after it becomes "normal".

Also, for many there is not much choice. Sounds like you had a top 5% job. Could you exist well without the situation your former life gave you, such as owning your own home and not paying rent? Or would you be forced back into a bad family life to provide for that family?

EDIT: I'm having to reply here as I've hit the fake HN "you're submitting too fast" after doing 2 or 3 posts. The truculent censorship on this site when one hits specific topics sucks.

epalmer: Ok, good for you, and congratulations in finding your equilibrium!


I would have been fine. I've owned two small businesses that paid well at the time. I worked a lot of hours but I had fun with those companies.


I'm in Richmond VA as well and in my late 30s.

I do worry about aging out of this industry and my long-term plan is pretty similar to the path you have taken. My hope is that I can participate in the agency/startup ecosystem for another 10 years before I have to find something else to do. Hopefully that will still be in technology, and will probably be in Higher Ed or at a Nonprofit for less money but more intrinsic rewards.

I haven't observed specific instances of ageism towards me or others (people being passed over for promotions or treated differently than younger employees), but I also haven't worked with many people 50+ since I started in this sector. I'm not sure how much of that is that people in that age group are looking for a more balanced lifestyle and how much comes from the companies that do the hiring.


I find these days I spend less time coding and I end up with better applications because I think through the design before coding.

At the risk of going off-topic; every new/kinda-new/wish they were new engineer can learn volumes from that single statement.


How do you learn a volume from a single statement? It's a good thing to think about: design up front is a tool for increasing quality. But the whole field of agile design is questioning the universality of that. The "volumes" are filled with all of the little things that tell you what is worth thinking about and what isn't. They contain many more statements on the subject.


>I read these comments from people that are 36, 40, 40+ and shake my head.

I'm 47; maybe there's hope for me yet.


> I read these comments from people that are 36, 40, 40+ and shake my head. That is not old. 63 is not even old.

Thank you for this. You made my day. I just entered my 20s and am constantly in fear of ageism later in my career.


48 yo here, started coding when I was 14, so that's 34 years of building things.

Here's where I'm at.

I look back at my career and I can tell you about great projects that I got to be part of, awards and plaudits that I won, big paybacks from projects that went well and literally saved the company. Those are all nice war stories to have.

But I can't point to any of it and say, "I made that" because - and here's the kicker - it's all gone.

Software is ephemeral. One day your client does an upgrade, and then the thing that you spent years building and curating like a baby disappears. It isn't mothballed and put in the basement where visitors can walk by and see it. There's no photo of you standing by the thing that you can hang in the hallway and see every day. Your creation just completely vanishes without a trace.

All those years I've also been a musician and recording engineer. I've made a few dozen records, none of which amount to anything anyone else would care about. And all told, I'm sure I earned more money in one year of my IT work than in my entire music career.

However, here is a collection of my work that I can point to and say, "I made that." It's a creation that I can reflect on years and years down the road.

I take much more satisfaction in my musical creations than from my software creations, even though I was much more famous and valued as a software architect.


Torvalds, 46, can point to Linux and say "I made that, over half a lifetime ago".

Stallman, 63, can point to GNU Emacs or GCC and say "I made that in the 1980's". Not necessarily the most recent version of it, but that hardly matters.

Gerald Sussman and Guy Steele can point to Scheme and say, "we made that".

John McCarthy was able to point to Lisp and say "I made that" right up to the day he died, and we can continue to say it for him.

Make the right stuff; then you can bask in it for longer and be a kind of living saint to a few generations after you.


That's like saying da Vinci is representative of artists or Prince is representative of musicians. The average person who doesn't achieve software legend status will see their creations vanish as the previous commenter described.

I've worked in this industry for a long time as well, but I'm not someone you've heard of. I too have built some very cool things that I'm proud of, a number of which no longer exist. Meanwhile, a shed that I built myself 30 years ago still stands, and I can still point to that and say I made it.


If you make something when you're 20, and just keep maintaining it until you die at the TTY prompt at 85, then all your life you were able to point to it and say "I made that (and am still making it better)". This is the case even if that work isn't well known. Perhaps nobody else will point to it for you after you're gone, but while you are here, you can say that.

Things you hacked up in the past are gone because they solved a narrowly defined problem which no longer exists, and even before that happened, you already abandoned those programs.

That this happens is almost inevitable, as part of making a living. All those programming da Vincis who are known for something also worked on lots of things that are now dust.


Why are you concerned about longevity of your creations? Surely impact matters more than the time they exist? Your software might only last a few years, but if it's used thousands of times, that changes the world more than a record that's heard a few times a year but lasts for decades.


Dude. Open source software.


Even jQuery which is one of the most famous open source projects will be forgotten in 10 years or even less.


Nope, it'll live on, and the code will be available for all to see and study, which means it'll end up as a case study or research material in a textbook or a Master's or PhD thesis.

One day someone may try to make an emulator for IE5 and ES4 just to run jQuery (like we've seen done with the 6502 and C64).


jQuery is _still_ in html5 boilerplate, which I consider to be a good simple starting point for content-based websites. https://html5boilerplate.com/


But it seems jQuery won't be in HTML 9 Boilerstrap, so I don't know how long it will live: http://html9responsiveboilerstrapjs.com


I disagree. Software with that level of popularity doesn't disappear.


"Unsurprisingly I now use React for most of my coding instead." - John Resig, creator of jQuery

https://twitter.com/jeresig/status/726058698989277185


Heck, even software with the equivalent level of unpopularity (PHP) doesn't disappear.


That can indeed be part of a legacy, but many companies are not comfortable with open sourcing their proprietary software. A lot of exciting work happens in that space as well.


I'm 46, and I totally get this. My other thing is I'm a painter (though I make all my money from software).

I imagine when I kick the bucket all my software will be long gone, but I will leave a bunch of artwork that will survive for anything from days to centuries, mostly on its own merits.


I also took up painting. 10 years of writing software, and most of it was replaced or deleted. Weekends spent learning a new tech that became obsolete in 5 years are gone too.

On the other hand my paintings will stay forever on my walls and my children's walls.


Impermanence. Everything goes away. As software developers we just get to see the whole lifespan of a project more often. Not many other professions get the privilege of building something from nothing, watching it grow and change, then slowly pulling it down and retiring it.


I like this aspect of software development. I've not seen the permanent and ephemeral aspects of things so cleanly expressed in any other field. The fundamentals don't change but the software artifacts expressing those fundamentals do. Not sure why I enjoy that dichotomy but it tickles my brain the right way.


I don't find this post inspiring, I find it sad.

It's partly self-encouragement, partly PR. The fact that it even exists is proof that the author is facing some issues, no matter how confident they would like to appear.

That recipe to stay current looks tiresome. Listen to two podcasts, two webcasts, subscribe to four magazines, teach courses, go to one conference per year, blog regularly, read blogs, follow the latest web trends. Your reward: you are still employable.

And why is it that most older people answering on these threads are so passionate about learning and about new technologies and the latest and greatest javascript frameworks. Do they really enjoy having such ephemeral knowledge and basically competing with anyone that's finished a bootcamp or not even that?

It all seems fake. Like they're trying to put on a brave face while at the same time being scared and trying to convince themselves that all this new and shiny tech that they work with is awesome.

Why not have an honest conversation instead of pretending that learning some thing or another will make everything ok in the end?


Nah. The fact that this post exists is evidence that ageism is a thing.

What makes you think older people are passionate about the latest and greatest JavaScript frameworks? That is not my experience at all. New frameworks are just the same old stuff in a new wrapper. We've seen it all several times over. Like, backprop was a cool thing in the late 80s.

I, too, find this post sad, but probably not for the same reasons you did. This post, that makes me and you sad, doesn't make it any less of a fact that the young people in this industry think you are a useless dinosaur if you don't know what the new hot thing is about.


> Nah. The fact that this post exists is evidence that ageism is a thing.

Indeed, ageism exists and it's a form of discrimination. There are societies (e.g. Japanese, Korean, central/southern European) where ageism works the other way around, at the expense of younger people.

In both cases it's still discrimination and we should acknowledge the problem and fight it instead of trying to look/act younger or older.


It's tough to communicate this accurately. I've noticed a trend of engineers self-evaluating as old on HN saying things along the lines of "I am having more fun than ever learning about the cloud/TypeScript/webfoo". Several people on all these "old in tech" threads.

I have the feeling these posts are not at all representative of the domain, and neither are the proposed solutions to just learn more, but it's all I've seen so far.

More of the same thing is not bringing us any closer to solving this very real issue.


OK, that's probably where our experiences differ. In my experience, the more experienced techies see that there's nothing really new in the new technologies. Many realize that to stay current, they must learn the new tech. But rarely do they really love it, because it's really a step sideways rather than forward.

I suppose it's a form of competition that forces new technologies to pop up all the time. I just wish people would take a really close look at what's already there before spending their prime bestowing on the world yet another framework or language.

By the way, I'm 40. I don't think I'm old. I think Scheme is superior to JavaScript.


> think you are a useless dinosaur if you don't know what the new hot thing is about.

Welcome to the technology industry.


I realized this way back in the mid 1990s when suddenly everyone wanted a "webmaster" with 10 years of experience, when suddenly HTML - a document markup language - became the most important "programming language" one could know.

Since then I have studiously avoided specializing in any technology. As soon as I feel like I've spent enough time in a particular stack to start to "know" it, I move on. I have refused to be pigeonholed into any particular tech.

The coding/language skills are the least important skills I have, and I intend to keep it that way. However, I can present an extremely long laundry list of technologies that I have built solutions with - the length of the list, not the presence of any particular TLA on it, is the key to demonstrating my learning ability.


Hi! I've been here since the late 80s. What do you want to learn?


Tell me how you've survived in technology without learning new technologies!


Not sure why you think I haven't learned new technologies.

I have learned them whenever I needed to. Most of the new technologies are not that special. That makes them easy to learn, but also kind of annoying, because I can see that they're just repeating a mistake I already saw 20 years ago.


I obviously don't think you haven't learned new technologies in 25+ years. It seemed like the claim to ageism was that the kiddies expect you to know the hot, new technology. Requiring knowledge of new technologies for candidates isn't exclusive to veterans. And it's definitely not a requirement of the other 90% of companies that are using older technologies.


I thought you thought that, because you said it. Perhaps I missed a nuance, English is not my strongest language.

My point, going up this discussion thread a few clicks, was that new technology is not always better than the old. Ageism comes into play when one's opinion about the new tech is dismissed just because one has some gray hair.


> Do they really enjoy having such ephemeral knowledge and basically competing with anyone that's finished a bootcamp or not even that?

I'm not old, I've just passed "il mezzo del cammin della mia vita" (meaning I'm 36), and I have to say that the latest and shiniest JavaScript frameworks are pure and utter shit, and I say that as a guy who finds languages like PHP decent enough. I've started working on a project where the "npm install" thing took an hour and a half and generated 260,000 files (?!?!?). One of the countless modules in there is called Chevron (?!?!), with a capital "C", and I sincerely don't know what it does or what it's supposed to do (to be clear, Chevron has no relationship to said project). It's just baffling.


Heh. I'm right there with you. I've not done web development in over a decade. Yesterday, I decided to look into React. To generate a hello world took over a minute of pulling down and building dependencies. Things are bizarre and inane now. All I want is a .js file to link in my HTML and for that .js to be editable.


Almost every ageism post I've read on HN ignores the hundreds of thousands of other jobs that exist outside of [hot SV tech companies] (avg age at FB is 28? no way!), and also doesn't mention those people who are to the right of the bell curve and working at companies like that (because their experience actually is valuable).

Some of these posts are real ageism claims, but most are "I'm old, I'm scared, how am I going to survive with a highly valuable skillset?" It's fear mongering. If you're concerned with job security, go find a job with security in government or in some monolithic non-tech company based in Go-Fuck-Yourself, GA. The "Hi, Fellow Kids" hipster-posing BS isn't going to land you a job at Facebook no matter how old you are.


I'm old, I'm scared ... some monolithic non-tech company

Wait, your solution to the problem of ageism in tech is "get out of tech, old fogey"??


Not so much "get out of tech" as it is "stay in tech, but not at a company that specializes in tech".

Non-tech companies need specialized software, too. When I was in college, I knew multiple people who had internships working on internal tools for a major bank. I imagine those banks also employ senior people to work on their internal tools, customer-facing web portals, etc.

And it's not just banks. I once interviewed for a job working on e-commerce stuff for Neiman Marcus. I'd imagine that other retailers like Walmart, Target, etc. need people working on their portals.

And if you want to work with technology but get out of programming, everyone needs IT.


Not so much "get out of tech" as it is "stay in tech, but not at a company that specializes in tech".

That's not exactly a solution tho' is it? Why go to a company where you are a cost centre, just because you reach a certain age?


They tend to be more mature environments, free of the "brogrammer" culture that's hostile to older people, women, LGBT people, etc. There's no attitude of "let's have a cool, hip office full of cool, hip young people". It's an environment where "culture fit" isn't used as an excuse to discriminate against older people.

It actually turns out that more corporate environments are friendlier to marginalized groups than a quirky freewheeling startup.

Job security is great. Big, established juggernauts don't have the kind of churn startups have... there's no worry about "what if the VCs don't go for another round of funding?", and the markets are well-established and slow to change. And if you go into defense contracting or public sector, you might even have lifetime employment.

The work environment is probably going to be nicer. Traditional corporations don't do open offices and don't require engineers to work 60+ hour weeks. Some of us would prefer to do 9-5 in our own cubicle. Banks are also especially generous with PTO (and remember that the "unlimited" PTO you get at startups is a scam)... I'm just going to quote a friend of mine on Facebook when I decided to post a general question of "how much PTO do you get?":

> I used to work for a bank, and they're notorious for giving tons of time, but when in my first position, I had 2 weeks paid vacation, 10 holidays, 10 sick days, and 2 WTFever days. When I was rehired further up the food chain, I got 4 weeks paid vacation, 10 holidays, 10 sick days, and 2 WTFever days, and I could buy an extra week off by lopping a week of pay off my annual salary. If I'd stayed longer, climbed more, I could max out at 8 weeks paid vacation with all the rest of it.

Not all of us care about doing interesting or ground-breaking work. We just want to stay employed so we can fund our lives, and we want a work environment that doesn't make us hate ourselves and want to die.

Honestly, I'm pretty happy at my employer -- we're a tech company, but the environment is very corporate (we're a telecom), and it doesn't feel like a startup at all. The work environment is highly praised, we're ranked as one of the top work environments on Glassdoor, and half of my team are graybeards. I don't want to leave here, but if it ends up happening anyway, I'm giving serious thoughts to pursuing public sector work after this.


I think he's trying to say that ageism isn't as much of a big deal as people make it to be (outside of SV's alternate reality) - everyone in the industry deals with the constantly shifting landscape that engenders feelings of job insecurity for programmers young or old. It's just tougher on older techies that haven't carved out their niche. If you're trying to fit in and compete with fresh bootcamp grads working new frameworks at age 60, you're probably due for retirement - there's a reason why the traditional career path was to engineering management positions for senior engineers. Either management, or something you're really good at that isn't the most hip new tech but still in high demand.


I really dislike this constant caveat that people keep placing against SV. "Well it isn't a problem outside of SV!" I keep seeing this appear in comment after comment on this article.


I guess you're allowed to dislike a fact, but any particular reason you dislike this particular fact?


Because it isn't a fact? There's definitely an age bias outside of Silicon Valley as well.


That too, but also, a lot of us outside the valley have to deal with the unfortunate consequence of everyone looking towards SV as the "trendsetters", or sort of the ones who set the "meta" for "modern" software development, if you will, especially since most major conferences are out in the SF/SV area. At my own company, directors and above take frequent trips out west.

What this all boils down to is: if SF/SV is going to be seen as the beacon of software development, it is almost worse to me if ageism is being exemplified there. To give a rather rough analogy, consider if Washington D.C. never hired from an underrepresented group, such as women or minorities. It's sort of like, "well, maybe I halfway, sadly, expect that to happen in some small town somewhere, but c'mon! D.C.! Everyone's looking to you!" -- Same kind of thing.


Maybe, but maybe it's less pronounced? It's hard for me to say, as I've never lived or worked in SV (hello from North Carolina). But FWIW, I am 43, and I don't feel like ageism has been a problem for me. I just went through a job search and had no problem landing a new gig in short order.

OTOH, to be fair, I am obsessive about learning new stuff, and I've been working with some "trendy" stuff the past few years (all big-data, hadoop, storm, kafka, etc. stuff) and I've been doing a lot of machine learning / data science MOOCs over the past year or so. So my skills are a good match for what there's demand for. But that would be valuable whether I was 20, 30, or 80.


Agreed. Deliberately ignoring the whole world outside FB/Google is an alarming sign that the idea is just to get blog reads by piggybacking on the latest bombastic statement by someone famous in the software industry.


> ignoring the whole world outside FB/Google

The whole world outside of GAFA looks to those companies for "how to do software development right" - no matter if they are right or not.

I've watched more than a few well established companies start to ape Google's hiring practices, or Amazon's churn, or Facebook's development practices just to try and attract new young developers.


I've been following the technology my whole career. I am still employable, but it gets old.

My advice to younger devs is to focus on general computer science fundamentals and application development skills. Those are the only skills that stay with you and grow as your career advances. The technologies always change so don't memorize them. Keep a reference around instead. Memorize things like design patterns and sorting algorithms.


Sorting algorithms? You serious? No one uses those in their career.


Just to prevent rampant over-generalization in an article whose subject is the topic of bias: We've implemented various special-purpose sorts at least three times in the last five years. Here's one example: https://github.com/efficient/cuckoofilter/blob/master/src/pa...

Yes, this is all in a very high-performance (sometimes insanely so) context, but it does happen. Most of them were like this - unrolled special-purpose versions derived from a sorting network. Some were for GPU.
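
If you've never run into one, here's a minimal sketch of what an unrolled sorting network looks like (my own illustration for this comment, not the code behind that link): a fixed sequence of compare-and-swaps for exactly four keys, with no data-dependent control flow.

    #include <cstdint>
    #include <utility>

    // Illustration only: an unrolled sorting network for exactly 4 keys.
    // The same fixed sequence of compare-and-swaps runs regardless of the
    // input values, which is what makes these attractive for hot paths
    // and GPU code.
    inline void cswap(uint32_t &a, uint32_t &b) {
        if (a > b) std::swap(a, b);  // compare-and-swap primitive
    }

    inline void sort4(uint32_t v[4]) {
        cswap(v[0], v[1]); cswap(v[2], v[3]);  // layer 1
        cswap(v[0], v[2]); cswap(v[1], v[3]);  // layer 2
        cswap(v[1], v[2]);                     // layer 3
    }

The real versions are bigger and tuned, but that's the shape of the thing.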


While this is for the most part true, questions regarding sorting algorithms and data structures (debug this left-rotate function that operates on binary trees) still come up during interviews.
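
For reference, the left-rotate in that kind of question is the standard balanced-tree primitive. A minimal pointer-based sketch (illustrative names, no parent links or balancing bookkeeping):

    struct Node {
        int key;
        Node *left = nullptr;
        Node *right = nullptr;
    };

    // Left rotation around x: x's right child y is lifted up, x becomes
    // y's left child, and y's old left subtree becomes x's new right
    // subtree. Returns the new root of the rotated subtree.
    Node *rotateLeft(Node *x) {
        Node *y = x->right;  // precondition: x->right must not be null
        x->right = y->left;
        y->left = x;
        return y;
    }

The "debug this" version typically has one of those pointer assignments missing or out of order, and you're asked to spot it.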


I mostly agree, however there are exceptions that prove the rule. Some engineers are working on standard libraries. (someone has to write that code!) Some people are taking advantage of their data set to write special-purpose sorting algorithms that blow generic algorithms out of the water. (guilty!) Some people are putting stuff together in interesting ways that requires them to understand and sometimes even re-implement the standard algorithms to take advantage of internal data structures or other interesting effects. (It's bad practice, but knowing the order a map will iterate things in can be helpful if you don't expect it to change -- and make sure to have a unit test that proves it!)
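
As a hypothetical flavor of "taking advantage of your data set": if every key is known to fit in a byte, a counting sort runs in linear time and leaves a general comparison sort behind on large inputs. A rough sketch:

    #include <array>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Hypothetical special case: keys are known to fall in [0, 255], so an
    // O(n) counting sort beats a generic O(n log n) comparison sort.
    void countingSort(std::vector<uint8_t> &keys) {
        std::array<std::size_t, 256> counts{};  // zero-initialized histogram
        for (uint8_t k : keys) ++counts[k];
        std::size_t i = 0;
        for (std::size_t v = 0; v < counts.size(); ++v)
            for (std::size_t c = 0; c < counts[v]; ++c)
                keys[i++] = static_cast<uint8_t>(v);
    }

Nothing exotic, but it's exactly the kind of thing you only reach for if you know the algorithm exists and know your data well enough to trust the assumption.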


You need them for those whiteboarding tech interviews.


I misread that as "waterboarding tech interviews", and I thought it was commentary on how whiteboard interviews feel like torture.


and the latest and greatest javascript frameworks

There are two kinds of learning in play here: learning how to do new things, and learning how to do the same things a slightly different way. Learning a new JS framework is definitely the latter, when 99% of the time the end goal is a CRUD application that could have been done with anything from the last 20-odd years. Because that is all the vast majority of apps are at the end of the day...

The former is what older people should be doing, leveraging the experience gained as a springboard.


I agree - I started at 38 (41 now) and work on a team as an iOS dev where we have mobile, backend and machine learning engineers - ALL - under 30. My interview was blind, so no one knew I was old - or black. They couldn't discriminate if they wanted to. That said, I couldn't imagine using all the tools on this list to stay relevant. I find reading HN, Apple docs and related blogs, books on Swift and iOS, and just thinking about tech time-consuming but enough to keep me up-to-date or even ahead of my teammates. I do appreciate what he said - I don't imagine FB would hire me - but fuck them. Do your own thing then. That's why I got into tech - to stay employed long enough to create for myself. Still, if you have the right experience, someone will hire you if the big names (or startups) won't.


I know, right?

It's things like these that make me kinda regret going into tech. I should've gone into accounting or something like that instead.

I'm 32, and while I'm an excellent mid-level developer, I don't think I'm ever going to be principal or possibly even senior material. And everything I've heard from everyone is that by the time you're in your mid-40s, you had better either a) go into management or b) hit principal or architect level, or you'll be unemployable. I'm probably going to end up unemployable in 20 years, and that frightens me.


I kinda agree in the sense that I think that a lot of the tech jobs out there are not "difficult" in a way that requires years and years of experience to perform reasonably well at.

So maybe someone older should look for jobs that suit their abilities rather than a job that, as you said, anyone who went through a bootcamp can perform. Hopefully with age comes some kind of niche specialization or skill that can't be obtained easily.


"Hopefully with age comes some kind of niche specialization or skill that can't be obtained easily."

The big issue I've seen played out multiple times is a hiring manager, dev lead or anyone making a hiring decision not being able to tell the difference between what requires experience/skills/specialization and what doesn't. It leads to the all-too-common situation of people with the required experience being passed over in favor of the (generally younger) "hey, this person can definitely do it - they know all this computer stuff!" (slight exaggeration). It gets even more complicated when the person without the required skills tries to bluff their way through (deceptively or just out of sheer desperation).


> And why is it that most older people answering on these threads are so passionate about learning and about new technologies and the latest and greatest javascript frameworks. Do they really enjoy having such ephemeral knowledge and basically competing with anyone that's finished a bootcamp or not even that?

Nobody enjoys having their hard-earned knowledge become obsolete (hence the "X11/bash/sysvinit/etc were good enough for me so let's never improve them" crowd).

But the fact is computer technology does change fairly quickly. You have to keep learning to keep up.

If you're really worried there are definitely some technologies that persist for longer than others. If you really hate re-learning stuff then I'd stay far away from the web and javascript. Stick with things like Java, Go, C#, C++ & Rust. Those aren't going away any time soon. Ruby, Docker, React, etc... I give 5 years max. Then you'll have to learn something new.


    > Ruby, Docker, React, etc...
Just for the record, Ruby is 21 years old. Ruby on Rails is 10 years old.


Also, I wouldn't include docker in that list. It's evolving rapidly into a core component of many platform frameworks.


3 web development technologies :D

First tip for elders: don't be in web development.


I am doing just that (but Go and Rust don't belong on that list). However, if a lot of people are doing things radically differently it starts limiting my options too.


Can you explain what honest conversation there is to be had here? That "old" people should be looking for jobs outside of tech? I am trying to understand your point of view. I can't even start to imagine how hard this gets when you are "old", have a family to take care of and, for god's sake, you'd like some kind of stability in life.

I am 30 by the way. I feel way more prepared than 5 years ago. I think experience is important sometimes.


This could probably only be solved by changing the forever young, fashion-oriented US software development culture.

Staying current is just mitigating the problem, not solving it.

And staying current should anyway mean growing one's knowledge, not replacing it every X years or investing as much time as the author recommends.


> And why is it that most older people answering on these threads are so passionate about learning

...because older people are told all the time, even on HN, that they're too old to learn and that they don't keep up with technology.

It's not surprising that they pre-empt these doubts by saying what they do to keep learning.

> Why not have an honest conversation

What's the honest conversation?


Well, an honest conversation would involve acting like a regular human, and not a lean mean learning machine ready to go head to head with any younger developer.

Because most people want to live their lives, not jump on a learning treadmill.


Life IS a learning treadmill. Have you looked at the state of employment today? Unless you don't care about income at all, you have to be constantly learning, because things are constantly changing. Even outside the employment space, I'm amazed at how fast the world is changing. My little girls get music playing by talking to a tiny box in our kitchen. Do you realize how crazy/cool that is?


> acting like a regular human,

In this thread we see people saying that older people are unsuitable employees because they have families, and would prioritise those families over work.

We have people saying that older people are unable to learn new tech.

When those people stop being discriminatory the older people can stop being super human.


I'm not sure if you intend this, but your comment comes across as a bit like "God, it's so embarrassing watching these old guys try to stay trendy by using the latest JavaScript frameworks, it's like seeing an old guy in skinny jeans riding a skateboard. They're not doing it because they enjoy it, they're just trying to act like the kids".

Who says using the latest javascript toys is only for the kids? Are you really worried that if you see old coders using the tools you like, that maybe that means they're not as cool as you thought? Is that really how technology works?


I'd say it's normal and healthy. Many industries in the UK, for example, now have what are called CPD ("Continuing Professional Development") obligations. To stay current you have to earn a particular number of points each year by attending courses or other public events. Even something light like a public lecture might count, for example, for an architect.

https://en.m.wikipedia.org/wiki/Professional_development


I'm 47 and don't believe it's a "macro-domain" issue. Maybe in some micro domains, like getting hired at Facebook and Google, that is the case. Software is eating the world and in my experience, there are still way more jobs than competent people. I have an older friend who does C#/.NET/webapp consulting. He has an endless stream of opportunities and never works for less than $70 an hour, which is a great wage for the southeast region where he lives.


> That recipe to stay current looks tiresome. ... Your reward: you are still employable.

It does make pursuing a career in programming seem dubious, doesn't it? There's an old saying about "clawing your way to the top", but sometimes it feels like I'm clawing my way back to where I started. I feel like I'm expected to have instant answers to any question and immediately comprehend any technology (regardless of its level of documentation or comprehensibility). I actually do enjoy learning about new things (and old things!), but most of the time, we're not paid to learn, we're paid to instantly know, and if you don't instantly know, there are 20 guys lined up around the block who are ready to take your place as soon as you admit that you don't know how to set up pass-through SSL using the undocumented firewall product that was installed last week that you don't have credentials for and we don't have time for you to waste reading documentation because there's a customer deadline and there's no slack in the schedule. The only other career I can think of that you have to put this much ongoing personal effort into is entertainment; it's like we have all the downsides of entertainment careers without any of the upsides.

On the other hand, I can't imagine doing anything else - every other job (except maybe astronaut) looks murderously boring to me.


Not a bad post at all and it definitely hit home for me.

I'm 43 and have been doing professional development for 20 years (actually 20 years). I moved permanently to Saigon just 1.5 months ago. I'm teaching the Pivotal software engineering process (agile / extreme) to a 100 person consultancy full of really smart ~20 year olds who didn't know or understand process at all.

I keep up on all the latest tech and I have a youthful mind and body (most people think I'm in my 30s). I'm the oldest guy in the company and the only American here. This has quickly led to a lot of personal mentoring on many levels, not just software, but life in general. The culture in Vietnam is strong and my team wants to learn from me. It is very exciting and new for all of us. It has been an amazing experience so far and I look forward to the future.

The best additional advice? Just be nice. It is so simple. The culture here is to never raise your voice or get mad in public, so I've taken it to the other extreme and I just smile and laugh a lot. Even when the servers are melting down. Viet are shy and have poor personal communication skills. By being friendly and nice, they have learned to trust me, and that has opened them up a lot. It has infected my entire team and improved morale almost overnight.

Being older has a lot of advantages. I'm loving my 40's way more than my 20's. Cheers! =)


Some pretty sweeping generalizations about the culture after only 1.5 months. I live in a country neighboring Vietnam and there are similar perceptions of the culture here, yet in my experience these ideas are baseless and often ingrained in expats well before ever learning the language or sometimes even before entering the country.


"these ideas" -- which ideas?

Yes, there is a ton of racism and generalizations among expats. As soon as I got here, I was added to a few private facebook groups where expats vent steam over the craziness of this country. I'm actually not a fan of it because it is honestly very racist, but I want to know both sides of the story. Much like democrats read republican news.

The language is hard and learning it is going to take me years. I'm trying my best, but when I say something as simple as 'một' (the number 1) to someone, they rarely understand me. Vietnamese also want to learn English (my company has a full-time English teacher) and will not help me.

So instead of language, I've focused on learning the culture first and in 1.5 months (also this isn't the first time here, so it is more like 2.5 months) I think I have a pretty good handle on a lot of it. I make 1-2 (or more) new friends daily thanks to the friendliness of the people and because I'm out there networking like crazy. I have over a hundred friends here now, both from business and personal.

If I stop liking it here, I'll leave. I don't see that happening any time soon though.


You made two generalizations in particular:

The culture here is to never raise your voice or get mad in public

This stereotype also exists about Cambodia, the country I've lived in for 4 years now. I can tell you that it isn't true about either Cambodia or Vietnam, since I've seen plenty of people in both countries get mad and raise their voices in public. The same happens of course back where I'm from in the US, and if I had to guess it happens at about the same frequency but I wouldn't rely on faulty human memory to make that judgment.

Viet are shy and have poor personal communication

This is just a ridiculous thing to say if you've never spoken to these people in their native language. I have actually gone through the arduous process of learning the language in Cambodia, and I can assure you that speaking in a foreign language clumsily and constantly being worried about your inability to express yourself will make you more shy than you are in your native language. And I have to mention that famous park in Saigon where foreigners are literally swarmed by Vietnamese people eager to practice their English... is that shyness? I never talked to so many strangers in my life as the couple of nights that I sat out there.

And actually, please do not think I'm vilifying you or calling you a racist or anything. I can tell from your post that you're a nice person, whereas in both Cambodia and Vietnam there are an outsized number of expats who are just straight-up mean, bitter, and racist. These people unfortunately end up influencing the perceptions that new expats have.

I'm only encouraging you to come in with more of a blank slate, and to not delude yourself into thinking that you can know anything about the culture after 1.5 months. I'll be more willing to hear out generalizations after you've been there for years, learned the language, and traveled all around the country. Have you even been outside Saigon yet? I've been to Saigon, and it's not representative of the rest of Vietnam.


I feel like you're making generalizations about me without knowing the full story. You've taken what I've said out of context and quoted and commented it based on your own experiences.

I've been here longer than 1.5 months from multiple trips here. I've also been to Cambodia. I'm not pretending to know everything about the culture. I've travelled to other cities than Saigon. I drive a motorbike as well as or better than locals. I'm working here daily at a company full of (wonderful) Vietnamese. I'm not an idiot and can form my own opinions. I'm not an English teacher (which is another unfortunate stereotype in itself). I'm not bitter or mean or racist. I know about that park. I'm working on learning the language.

I still stand by what I said.


Given the topic of the thread, it's clear that your comment was upvoted because it's inspirational to do new things at the (oh so advanced!) age of 43. But I want to address it along a different axis.

> Viet are shy and have poor personal communication skills.

When I imagine how such a statement might land with your fellow HN users who are Vietnamese, let alone, say, a Vietnamese elder, such a claim is painful to read.

> I'm the oldest guy in the company and the only American here. This has quickly led to a lot of personal mentoring on many levels, not just software, but life in general

I have a similar reaction here too. Other dynamics leap painfully to mind—a vast power differential and violent history—that don't have to do with "life in general".

We don't want a gotcha discourse in which well-intentioned people get scourged for saying things; I don't mean my comment that way. At the same time, civility means more than personal politeness. It includes respect for others different from oneself. That is a profound thing with many levels, each of which challenges us to awareness. Every one of us has these challenges, of course. They're just easier to see in somebody else's case.


I could have said it another way that was more civil and polite. I won't argue that. I have a very direct personality that I don't feel like I should subdue for HN. So I went with painful, but honest. #inmyopinion

Overall though, my experiences have been extremely positive. I literally wake up every day happier than the last, all because I live here now. They are happy and friendly people. They are young and full of energy. If I can help it, I'll never go back to San Francisco. Vietnam certainly isn't perfect by a long shot, but I love it anyway.


That is the same in the Philippines. Public shame is a big no no. You always want to handle things in private.


Thanks for your inspirational story. It seems there is hope if you are willing to step out of your comfort zone.


Thanks! I don't even know what my comfort zone is anymore. =)


Take acid and go to crazy festivals if you want to stay young. No really, do it once at least.

You need to bathe in youth from time to time in order to experience it - it's fantastic.

Of course you need to keep up to date, try to use your wisdom to understand which technology/language is going to survive the test of time.

For example, C/C++ is going to stick around for a while; make sure you're up to date (C++ 14 and C++ 17).

Pick technologies with steep learning curves, don't try to compete with 20-year-olds doing JavaScript bootcamps - go five steps deeper.

Broaden your horizon - read poetry, listen to all kinds of new music, watch experimental movies, travel around, talk to foreigners, eat weird food.

Study physics and philosophy, psychology and economics.

Have lots of sex - your wife will love you again :)

You have kids ? Great! Learn from them - everything. Try to teach them what they study at school - see if you can figure out a better explanation. Notice how much new stuff you learn about the subject, about yourself and your kid!

We're all getting old(er) every day - as we age this process seems to accelerate - and one day we will be no more.

But inside us lives the kid, the 20-year old, the 30-year old. It's still there, it can still be crazy and fun, we just need to remember to go on a date with our younger selves. All the rest will follow.

At least that's what I'm telling myself :)


> use your wisdom to understand which technology/language is going to survive the test of time

I usually use statistics. If you select a random point in the lifetime of something, there is a 50% chance that you are closer to the middle than to either the start or the end. Thus: always assume you are roughly in the middle of the lifetime. In other words, if some technology is only one year old, assume it will be dead in another year.


That's a great rule of thumb. You should give it a catchy name so people remember it.

When I don't have good insight into whether it's worth my time to learn some new tech, I'm going to try to apply this rule.


This is the Lindy Effect[1].

[1]: https://en.m.wikipedia.org/wiki/Lindy_effect



So that means C looks pretty safe... 44 years. C++ is 33 years.


Nice piece of wisdom right there, thank you!


> Have lots of sex - your wife will love you again :)

Not in my experience

... oh, you meant with her.


"Take a mistress. Your wife will believe that you are with your mistress. Your mistress will believe that you are with your wife. And you can spend more time hacking code."


I'm young but this was invigorating to read.


> Pick technologies with steep learning curves, don't try to compete with 20-year-olds doing JavaScript bootcamps - go five steps deeper.

Wanted to call that one out, don't think I've thought about it before, but it's good advice.


100% agreed, thanks for this!


> Have lots of sex - your wife will love you again

Unless she finds out.


Trust me. If she finds out, she will love you more.


Elephant in the room, IMHO, is the technical interview process.

In software engineering roles at big/desirable/fast-growing companies, the interview process favors faster (by definition, younger) minds. Both young and old are put through the same or similar coding interviews at many of these places, and the faster coders, who are often younger, get the job.

You can't fix ageism without fixing the interview process. Being jovial, healthy, nice and culturally sensitive are necessary and useful things to keep your job after you join, but the gatekeeping itself is biased on the other side, which reduces the intake to a trickle.


36-year-old checking in. If there is any perceivable gap in speed between a younger and older software engineer, that gap can EASILY be made up with experience. After over 10 years in the industry I can honestly say interview questions aren't that original. If anything I have a huge advantage because I've heard all of them before: answer a question that demonstrates you understand polymorphism, plus Algorithms 101, and that covers most of the questions I see right there. Yes, it's insanely stupid to interview like that; no, I don't think my age does now or ever will put me at a big disadvantage.

In my experience older people get cut out based on not being "a good cultural fit". This has been discussed ad finem on Hacker News because "cultural fit" leads to all kinds of discrimination: racial, gender, age, etc.

At every job I've had, we put a person through a series of interviews, then we have a group meeting and we vote. There is no quantifiable evidence that this person actually interviewed the best. It comes down to how people feel in a room. That is the issue, not the speed at which a person can give answers. I've seen people voted down for all kinds of illegitimate reasons, and with age I think it came down to fear in some cases. A lot of software teams don't want to hire the best person they can find. They want to hire someone who is pretty OK, but will also make them look good. Yes, sometimes people don't get the job because they are too good. Am I going to hire someone who makes me look like an under-performer, or who could get promoted ahead of me? Bingo: bad cultural fit.


the interview process favors faster (by definition, younger) minds

Algo-on-the-whiteboard interviews favour people who have recently been cramming for their final-year CS exams - by SHEER COINCIDENCE they happen to be in their early 20s...


I would even say they favor people who have been cramming algo-on-the-whiteboard type of questions. I mean, there are several businesses built around this (CtCI, leetcode, ...)

I know some North American universities have adapted to the practice and are now preparing students, but I assume this is relatively new. My algo and DS classes weren't about cramming at all.

Now to be fair, reasonable companies will focus on higher-level, systems design and architecture type of questions when interviewing seasoned engineers. Or at least, they really should.


Since everyone knows what they're getting into with a tech interview, I don't see why it's such a bad metric. Many companies purposefully give you a rubric/criteria to study. It's a good way of measuring whether a candidate can take the time to learn/prepare a specific set of knowledge, and then work through problems in a way that includes the interviewer (ie. other devs if hired) in the steps to solve the problem.


> It's a good way of measuring whether a candidate can take the time to learn/prepare a specific set of knowledge

And how isn't that a terrible metric? Most of my current coworkers would fail as they simply don't have the time.


> Since everyone knows what they're getting into with a tech interview, I don't see why it's such a bad metric. Many companies purposefully give you a rubric/criteria to study.

It's often so broad that to really cover everything that might come up you've got to have time to make the studying a part-time job.

And even then you might get hit with one of those "you almost have to have seen the trick before" questions, like detecting a cycle in a broken linked-list with O(1) memory.
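
(For anyone who hasn't seen that trick: it's Floyd's tortoise-and-hare. A minimal sketch, with illustrative names:)

    struct ListNode {
        int value;
        ListNode *next = nullptr;
    };

    // Floyd's tortoise-and-hare: one pointer advances by 1, the other by 2.
    // If there's a cycle the fast pointer eventually laps the slow one; if
    // not, it runs off the end. O(n) time, O(1) extra memory.
    bool hasCycle(const ListNode *head) {
        const ListNode *slow = head;
        const ListNode *fast = head;
        while (fast != nullptr && fast->next != nullptr) {
            slow = slow->next;
            fast = fast->next->next;
            if (slow == fast) return true;
        }
        return false;
    }

Obvious once you've seen it, nearly impossible to derive cold under whiteboard pressure, which is the point.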


Why interview in a way that's unrepresentative of the actual work?

The only reason is that you want to build an environment where the work is secondary, such as wanting to hire a bunch of bros to go drinking with and help you spend all that sweet VC cash...


Exactly, it's the choice of evaluation criteria.

Meanwhile, experienced developers have their brains tuned towards on-the-job skills that are harder to cram (like an instinct for edge-cases) while "shelving" the stuff that you don't need.


I'm not sure it's because the interview favors faster minds. When you're 25, and good at what you're doing, there is a good chance you simply don't see what a 40-year-old has to offer. Never mind that the day they turn 40 themselves, they'll be able to bring far more value to the table because they'll have 15 years of additional experience.

If there is no one with 20 years of dev experience within the company, there is no one to speak up for this, and the cycle continues.


Why do you think often faster coders are younger? With many more tricks up their sleeves, a more experienced coder will be faster. Or are you saying compared to literally older people, but not experienced?


I don't think the assertion has any basis in objective data in the first place but is based on a personal feeling. I don't see the value of trying to discuss reasons for something that only "exists" based on spurious subjective personal claims?


Young coders can code that 10,000 line bad idea they had really fast, while older ones can recognize that a problem isn't new and avoid the 10,000 lines altogether.


Cognitively, people do slow down after 21 or so. It is likely not a linear process, but I'm sure a quick Google of the research will confirm this. It is not, as suggested by the other comment, a spurious feeling, but a well-known cognitive fact about humans. And experience in real things doesn't help with random tests, especially if you're 20 more years away from your CS finals.

Whether this translates into a perceptible disadvantage in code tests is less likely to have been tested, but given what we know, and given the other arguments above, it is quite possible and indeed likely.


At least one study suggests that unlike athletic abilities that do peak at around 20, fluid intelligence is more complex and may peak all the way up to 50 years. http://m.pss.sagepub.com/content/26/4/433


Technical interviews don't necessarily favor faster coders. I have been on both sides of technical interviews and what counts most is that an applicant asks the right questions and chooses the right approach.

What you're definitely not looking for is a candidate that hacks some solution together quickly but with badly structured code, makes a lot of unconfirmed assumptions and doesn't listen to advice.


> technical interview process

Even before that is the technical screening process. If the company has standardized on Angular 2, your experience with Dojo, Ext-JS, jQuery and even Angular 1 is considered irrelevant by the screeners - you may as well have experience in medieval basket weaving; they won't even call you. If the company has standardized on Groovy, your experience with Java is equally irrelevant. If the company has standardized on MySQL, your experience with Oracle is irrelevant. If the company has standardized on Linux, your experience with Solaris is irrelevant. And on and on it goes...

There's a perception (which I've already seen repeated 10 times in this thread, and I'm only halfway through it) that the "new thing" always completely replaces and invalidates the old thing. But that's almost never the case. Angular uses jQuery. jQuery uses Javascript. Javascript uses the DOM. Groovy uses Java. Hibernate uses JDBC. AJAX uses HTTP. HTTP uses TCP/IP. They all use the OS. And when X uses Y, Y can go wrong in ways you didn't expect if you just assumed that Y became meaningless when X came along.

So we have this environment where the hiring managers are shooting themselves in the foot by looking for style over substance and anybody who tries to bring it up is dismissed as a dinosaur with a case of sour grapes.


> let’s look at the average age of IT workers at well-established companies. Facebook: 28. LinkedIn: 29. Google: 30

I had an interview at Google a couple months ago and noticed that most people were pretty young. When I asked the person who was in charge of taking me to lunch about this, he said that it's probably because there are just many more CS graduates now than there were before, and that Google would very much like to hire senior people as well, but they're much harder to find.

I wonder how much of what he said is true vs ageism.

On the other hand, I wonder how much of a natural bias there is against older people if they have to go through the same interview process because it felt like a mental marathon to me. Although the interview only lasts a day it took a couple days for me to recover.


> ...and that Google would very much like to hire senior people as well but they're much harder to find.

> I wonder how much of what he said is true vs ageism.

Of course they want to hire more senior engineers now, it would likely help win their court battle[1] over it.

[1] https://tech.slashdot.org/story/16/07/02/0438216/age-discrim...


Also Atlassian says the same. They said last week that they just can't get senior people in Sydney. Personally I feel older people are not as good at the little puzzles they set as an entry test.


I applied to Atlassian many years ago, and what struck me was their refusal to entertain non-Java devs crossing over. If they want more seniors, they should consider C# or other-language devs who could cross over.


That is actually a really hazardous approach. Almost every job I've had as a software developer has introduced me to a new language, and as someone who's been writing code for 10+ years, there's hardly a large struggle in learning a new language or framework, especially if it's just another C-like imperative language.


I've heard this so many times before about "just another language"... Learning the language itself? Yes, that can be done damn fast, plus everybody had some Java-like language in their studies. But knowing that means almost nothing; we're talking about a very junior-level resource.

Do you know the gazillions of frameworks used to achieve everything these days, their integration tricks, the various app servers, CI toolsets and so on and on?

I mean, if you are senior in something, are you senior also in language XXX, meaning I give you a spec, we talk, and you deliver a proper maintainable solution, leading the dev team and managing all the issues and bumps along the road? If not, then you come in as the junior I described; nobody has time to babysit you for weeks/months, and you are not willing to take a junior salary, but that's what you are to the company.


I'm sorry. No actual senior person who is senior at Java or C# takes months of babysitting to switch to the other. The beauty of a senior person over someone who is junior and knows only JS is that the senior person should have used many languages over their career. Picking up a new language and/or framework is what senior people should be doing best. Most of the factors that go into a proper maintainable solution are language agnostic. Good design, DI, IoC, and all of the 12-factor app suggestions are language independent.

Even when I was back in college many years ago, only the first class taught a language. From that point forward the teacher of each class said we're using language X and suggested a book if you needed help learning.


I'm not the person you're replying to, but can you really argue that someone senior who doesn't know a language and its associated frameworks is equivalent to someone who does know those things or deserves to be paid the same?

Sure, it's true that many things about good design are language agnostic. On the other hand, frameworks and languages can actually limit or enable what you can do, and that lack of familiarity with them has the potential to lead to mistakes.


If everything else is equal, then obviously knowing the current language I need will push that person ahead. IME, though, the language(s) a senior person knows is the least important part of whether they are a good hire.

My reasoning for this is that we are always learning new languages and using new frameworks. Why would I let a better person go when the language is probably going to change, or worst case they pick it up in a couple weeks just by looking at the existing code base? One case where I would deviate a bit is if I was hiring for a functional programming position. In that case I would prefer experience with some functional language, but that is not much different than wanting OO experience for a Java/C# position.


If the more senior programmer is of the same calibre and lacks skill in a language, then 100% yes. The idea of discounting proper experience is a clear example of the Dunning-Kruger effect.


Oddly Google's and Atlassian's software have been getting steadily worse over time but I'm sure that's just coincidental.


I can't really speak to Google's software because there's so much of it and it does so many different things (much of which I try my hardest to avoid), but I've been using JIRA for a long time - probably on and off since about 2010 - and I actually think it's a lot better now than it ever was.


Largely agreed. We're still running a Confluence version from 2010 at one organization that I'm a board member of, and I must say that it doesn't fare well with today's more mobile-oriented ecosystem.


The Google process is also veeeery long. I guess it's not unusual for an older developer to start it and get other good offers in the meantime. Another benefit of age is being able to pull contacts as a way of getting jobs, reducing the number of "meaningless" interviews.


His reply couldn't sound more scripted


It didn't sound that way to me, but maybe I'm just too trusting. For what it's worth he was probably in his mid 30s and had a family.


Your response couldn't sound any less sympathetic.


I wonder how many more people are graduating with CS degrees in the last few years compared to those who graduated two decades ago. I'd imagine there's a lot more today with how popular it is becoming. That's not even counting all the CS grads that have since moved up the corporate ladder/switched roles or careers and aren't applying for dev jobs now.


Actually the average age of software engineers at Google is much higher than 30.[1] It's the non-SWEs that bring the average down. The cited source in that article lists average ages for all employees, not just technical ones. This is unfortunately an overlooked distinction when writing about the industry.

[1] source: I work for Google.


That is a recurrent point that Uncle Bob tends to make. In my opinion, it sounds reasonable. e.g.: http://blog.cleancoder.com/uncle-bob/2014/06/20/MyLawn.html


I don't know your definitions of older/younger, but I interviewed at Google NYC recently, and my perception was that most people were in their 30s, with a few older and much older folks. Maybe a factor of the teams I was talking to, but it was clearly an older bunch compared to FB.


I can't speak for NYC but this is also my experience at Google Zurich and London.


Did they give you a job offer?


I did! Why do you ask?


You left the ending off your story! Nice ending too.


Haha! I didn't feel it was relevant to the comment but thank you! I'd rather be a math genius though! Maybe one day I'll be able to at least study some more math


Age will matter more in a loose job market (more people than jobs). In a tight market as it is today, your skills are front and center, not your age.

That said, the OP is correct. You're only as good as your last two years and even that's pushing it. If the tech changes, you have to adapt with it.

I'm 53 as of yesterday (the 8th). I started with PDP-11's in the 80's, then VAX's, then PC's, BASIC at first, then C, then Visual Basic, then ASP, then C#/ASP.NET, and now I'm deep into AWS (Lambda, DynamoDB, Redshift), NodeJS, AngularJS 1.x/2, ReactJS, and I'm still learning new technology all the time.

A lot of developers will transition to management and it's on my mind, but I'm also still drawn to solving problems at a code level. And there are always new toys to play with like Angular and React. Now we have .NET Core and all of its interesting avenues.

If you actually care about being a good developer, you will continue to work.

As long as there are jobs. Nothing will help you if the job market contracts. Then I do believe hiring becomes age-oriented, with us older devs labeled "over-qualified".


> Age will matter more in a loose job market (more people than jobs). In a tight market as it is today, your skills are front and center, not your age.

The fact that people are still claiming there is a shortage of tech workers demonstrates that that isn't true. The industry wants young, cheap workers who come ready-made with the trendiest skills; then it wants to ditch them, rather than retraining them or allowing them to accrue seniority, and hire new ones who will work 80 hours a week for free soda and "stock options"...


Devs should train themselves. If you're waiting for someone to train you, you're not going to last in software development for very long.


Some rhetorical questions: How does a person know if they've trained themselves well? What if part of the training involves interacting with other people?


What other fields operate like that? Doctors, lawyers, accountants, etc all get CPD on the company's time and dime.


What companies fit this mold? Do you work at one or worked at one in the past?


>Age will matter more in a loose job market (more people than jobs). In a tight market as it is today, your skills are front and center, not your age.

You'd be surprised. The original post (TFA) gives the median age at major companies, for example, and it doesn't work like that there.


I'm not convinced that there are more jobs than people in the tech market, especially now that people are starting to wise up to the fact that jobs are being automated away and want to transition into more stable markets like ours.


Younger people are not smarter. They might learn new things faster. On the other hand, older people understand related new things much better because they already have a large context (experience). The biggest difference between my current self (44) and my younger self is that I spent much more energy on my projects when I was younger. I created results much faster at the cost of limited consideration. Now, I can still burn for a while, but not as long as when I was younger.


I used to be able to burn longer when I was twenty, but I'd spend that fuel on shit I've learned doesn't matter like 100% test case coverage, setting up continuous integration servers, and load testing against massive amounts of traffic I'd never see.

I could go on and on about the stuff I'd do I don't do anymore.

It's not that those things don't have their place, or that all young programmers are guilty of premature optimizations, but there's no difference in terms of productivity between doing nothing and doing things that don't matter at all. It took me a while to learn that, and I still catch myself wasting time.


Do you ever get asked about this? e.g. what you don't do anymore as a result of learning from experience

Either in your current role / in interviews / on the street?


not that i can recall. why do you ask?


It seems like you have learned a tremendous amount throughout your career that others could benefit from, that's why! Not learning from you flies in the face of wisdom (1)(2).

I'm not afraid to generalize this either. I see so many people and organizations repeating the same mistakes, their own and others', over and over. I envision an organization built around the idea of learning from others' experience. It's a near-mythical creature, but it could run circles around its peers, who would doggedly pursue finding out its "secret" and promptly discount it when they hear "learn from others' experience / history."

This turned into a mini-rant, but it's because I am incredulous that you and others like you have learned so much yet the potential of that knowledge is so rarely tapped into.

(1)“Fools learn from experience. I prefer to learn from the experience of others.” ― Otto von Bismarck (2) Why Don't We Learn from History by B.H. Liddell Hart https://www.amazon.com/Why-Dont-We-Learn-History/dp/09850811...


I find that - very broadly speaking - old guns are better at strategy and young guns are better at tactics. The energy and interest of the young lead them to find local maxima more readily, and the experience of the old helps you find the real maxima rather than the local ones. Or put another way, young'uns will get you out of the rabbit-hole more quickly, but old'uns will take you down fewer rabbit-holes. Both mindsets work well together.

YMMV, caveat, caveat, caveat, etc, roughly speaking, &c


Would you mind expanding a little bit more, from your perspective, on what falls into tactics and what falls into strategy?


Generally what marklgr said. By tactics, I mean being up to date with the latest lib for -foo- and being on top of what the cool kids are doing, chasing down the rabbit-hole to find that one setting to improve db performance. Individual, almost atomic tasks, especially those in a changing landscape.

Strategy is more about planning, forethought, and knowing ahead of time which avenues are worth pursuing and which aren't. As a very simplistic example, a tactician might spend a day down a rabbit-hole and improve a db's performance by a small but significant amount, and a strategist would not have spent that day, knowing that that db is going to be turned off in two weeks. Not the best example, really (and it implies that you're only one or the other, which isn't true at all). There is also overlap between the two.

Perhaps a military analogue - tactics is about how to take the forts (shorter timeframes, clearer objectives, obvious goals), and strategy is about determining which forts are worth taking (long-term timeframes, murkier objectives, sometimes unclear goals, concern for secondary effects). The difference is also in scope, I guess. Being a good tactician benefits from energy and focused interest, something which the young'uns tend to have more of (caveat, caveat), and strategy benefits from forethought and experience, something which the old'uns tend to have more of (caveat, caveat, &c)


In that context, to me, tactics is writing & debugging code, compiling stuff, setting things up to work with other libs--ie. mostly hands-on tasks.

Strategy is more about planning, architecture, knowing about the real pros and cons of different solutions eg. libs, tools or languages.


At 37 I'm much faster at debugging today than I was at 27. And 10 years ago, I was consistently one of the fastest debuggers in a company of 15-20 programmers. Debugging's always been one of my strongest programming skills.

So debugging, no, definitely not. I can spot root-causes of bugs now even easier than I used to be able to.

And one of the big new skills is that I'm much more able to 'guess' where a bug is in someone else's totally new code-base than I previously was able to.

In fact, I also write code faster as I get the general gist of it done much faster first time.


Some people specialize, and get more and more efficient at the same specific tasks; but I believe it's more common to widen one's experience in various areas (languages, tools, platforms, non-functional requirements etc.), leading to some lack of practice in the hands-on tasks one used to do everyday in the past, and also making it more difficult to commit everything to memory. On the other hand, one generally gets a much broader picture.

For instance, I used to be able to perftune UNIX and DB (mostly Oracle on Solaris) pretty well, knowing various kernel parameters by heart and all the tools of the day (tkprof, Cockroft and co). Now I guess I could still get by after some brush-up, but I'm not anywhere near as efficient as I used to be on that specific task. And I still write code, but I don't know every single method of every API anymore. But now, I can design full solutions, from choosing the hardware to setting up platform, languages, monitoring, high-availability, backups, security etc. Not because I'm smarter, but because I've been exposed to all that along the years.


Breadth of experience often turns into strategy. Lack of experience tends to show up as when you only have a hammer, every problem looks like a nail. At this point in my career I've written production code in many languages/frameworks and was even a DBA at one point. I've also worked in many types of businesses. Now I can look at a given problem and provide a range of possible solutions. Sometimes those solutions do not involve writing any code at all.


I've seen several examples where the younger developer (often me) comes up with some really clever code to make a certain thing run really fast, and the older developer coming along and pointing out that my underlying premise is wrong.


I agree with all your sentiments. I think the key thing is that younger people tend to equate efficiency with lines of code written. As you get older, you realize that pausing long enough to consider all implications leads to better solutions. Fast and hacky is fine for prototypes but (hopefully) eventually it needs to scale and be stable and supportable.

I also think that the Zuckerberg quote also points to the startup mentality. They want 22 year olds because you can get them to work 100 hour weeks against the promise of an IPO, not because they are "smarter". Hard to convince a 40 year old with two kids to do the same. There has already been a shift to more traditional CS companies (IBM, etc.) as the lack of IPOs has soured some people on the startup dream of being employee #3 at the next Uber of XXX.


Agreed. My current self can understand complex concepts faster than my younger self. Understanding the innards of a distributed database is easier now than ten years back.


> Younger people are not smarter. They might learn faster new things.

I was a young person once (no, seriously!). I _thought_ I was learning things fast back then. When I got older, I realized I was trading speed for depth.


Young people:

    1. Learn faster (better memory)

    2. Can keep their attention focused longer.
Number 1 actually becomes a bane with old people: my relatives over 70 have developed an active reticence to learn anything new. I even theorized that they are instinctively protecting their limited amount of functional short-term memory for vital tasks.


    > Learn faster (better memory)
And yet, when I needed to learn XSLT (and then React), I was able to simply inhale it in one go because of my experience with similar technologies and functional programming, which comes from having been doing this quite a while.

Younger and quicker developers took a lot longer over it and to find their feet because there were many new concepts there for them.

As you get older, there's less that you haven't already learned.


Let's put aside the unsubstantiated claims about biology (and old doesn't mean 70 yrs in this context). Jobs that rely heavily on attention span and learning speed can also quickly get boring and unfulfilling--the more experienced folks probably know to steer away from that.


Article mentions "RPG" many times: "RPG back-end", "RPG developer". For those who wonder what it it (like me), it is not a "Role Playing Game", it is IBM RPG language: https://en.wikipedia.org/wiki/IBM_RPG


How the author failed to explain this abbreviation is beyond me. Who in their 50s doesn't know that you spell out important concepts and key terms, then shorten them in parentheses?


Maybe it didn't occur to him that people wouldn't know what it is. If you were writing an article about programming, would you explain what BASIC stands for?


No of course not, everyone knows it stands for 'Badgers Are Super Intelligent Creatures'.

:)

(Beginners All-Purpose Symbolic Instruction Code... which, now that I type it out, obviously cheats. From here on, it shall be known as BAPSIC.)


STFU.


Please comment civilly and substantively on HN or not at all.


I spent my first 5 years out of college as an RPG developer, and even I was like "wait, can he mean THAT RPG?" I'm not even old, I started that job in 2005!


Although it is more fun to imagine he meant 'Rocket Propelled Grenade' back-end... (Etc)


"First appeared: 1959" and it looks it. I don't think the author is doing himself any favours by going on about such an archaic language!


...and that's exactly the attitude the author is pointing out.


No, I agree with the parent commenter - but it's not that the language is old that's the problem.

If the friend mentioned lost or left his current job, he'd probably (?) struggle to find another. It wouldn't be nasty ageism; it'd be "sorry, all your experience is with a technology that's irrelevant to us, one we haven't even heard of".

I say this going on the article alone - of course, for all I know the guy's an avid Ruby/Node/Elixir/whatever's-hot user for a range of awesome side-projects.


No, the author is actually saying that that attitude is fine. His point is that you need to keep learning and doing things that are relevant today, not learn today's technology and then stick with it for the rest of your career.

> I could tell you about all my accomplishments over three decades, such as replacing the use of a System/3 punch card system with the AS/400, writing a Cobol debugger, or…. Ah, I’m boring you. What you do care about are things I did in the last two years.

RPG is only relevant today in the same way Cobol is.


The feeling your comment conveys is not "keep learning". The feeling is that by talking about "archaic" stuff, the author isn't "doing himself any favours", i.e. he makes himself sound old.

That's actually his point.

Someone who has seen a technology mature and develop over decades, seen the hype trains come and go, projects succeed and fail, might be a bit wiser than someone fresh out of Stanford. He might have some ideas about how to build maintainable systems after working on some that are older than most developers.

If you don't see it, that's on you, not on the author for mentioning his particular "archaic" specialization.


One of the things that has made me old (52) and crotchety is that I learned Lisp very early in my career. That gave me the ability to see that 99% of "new" technologies were really just poor re-inventions of (parts of) Lisp. Even today, Common Lisp -- despite (or, as some would argue, because of) the fact that it hasn't been officially updated in decades -- is still not only a viable language but one of the best choices for many applications. But no one knows it because it's not the shiny new thing, and even young people still can't seem to get their heads around the fact that the parens are a feature, not a bug. And that makes me grumpy sometimes.

The good part was that I was able to build a very successful career while not having to suffer nearly as much pain as many of my contemporaries. The bad part is that now it's hard to find people to collaborate with. :-(


> Even today, Common Lisp is still one of the best choices for many applications.

Then why don't people build amazing and popular things with it? I don't mean one or two people building one or two things, but lots of people building lots of things.

"Nobody uses it" and "it's the best choice" can't both be true. And "nobody uses it" is approximately true. It's not a mainstream JVM language or CLR language, it's not an AWS or Azure or Google Web Services language, it's not used for the Linux kernel, Windows, Oracle, *BSD, SQL databases, or No-SQL databases, it's not used where Erlang is, it's not the research language Haskell is, it's not the fun esolang or the long-tail COBOL, it's not the new compiles-to-JavaScript, it's not the back end of 3D games or VR engines, and it's not behind Amazon's shop or used where Go is. It's not talked about in StackOverflow's most-popular-languages surveys, or the most profitable languages for devs to learn, or those most desired by employers. It's not an educational language like it once was.

Yet it's "one of the best choices".

I simply don't believe it. Any perceived advantages it has must, in practice, be a wash.


Too much fun to program in. Lisp makes it remarkably easy to extend the language into just what you want it to be...which means that Lisp programmers tend to spend most of their time extending the language into just what they want it to be rather than solving the problem. (Indeed, this is often touted as one of the benefits of Lisp...once you have a suitably well-adapted DSL, the problem solution follows naturally.)

Meanwhile, the Java or Go programmer is thinking "this is boring and ugly. Let me finish this problem as quickly as I can so I can go home." So they finish the problem as quickly as they can, and then it's done, and it ships. And people have an incentive to use it rather than tinker with it because nobody really wants to peek under the hood, and the people who do peek under the hood tend to be really dedicated and care a lot about the problem domain because why else would you put up with the language?
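
To make the "extend the language into just what you want it to be" point concrete, here's a minimal Common Lisp sketch; with-retries and flaky-network-call are made-up names for illustration, not anyone's production code:

  ;; A tiny DSL-ish extension: a new control construct that expands
  ;; into plain Common Lisp at macro-expansion time.
  (defmacro with-retries ((n) &body body)
    "Run BODY up to N times, returning its value on the first success."
    (let ((i (gensym)) (done (gensym)))
      `(block ,done
         (dotimes (,i ,n)
           (handler-case (return-from ,done (progn ,@body))
             (error () nil))))))

  ;; The call site then reads like a built-in feature of the language:
  ;; (with-retries (3) (flaky-network-call))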


This one I can believe.


A few notes:

* Google actively develops one of the Lisp compilers. Google Flights powers Orbitz, Kayak, etc. That's Lisp.

* There are several Lisp compilers in active open source development.

* There's a graph database written in Lisp called AllegroCache. It's good enough to support a business (Franz) for more than a decade.

* Another company (LispWorks) also exists and has a large portfolio of clients.

* Lisp has been used to make entire operating systems. Ones of the past, and ones of now. (Of course, an OS needs a community. But where are real OS's with GUIs in other languages?)

* Lisp has been successfully used in my own career for embedded systems to control satellite acquisition systems to, most recently, quantum computing. (At real companies.)

Just because there's not this huge buzz around Lisp doesn't mean no one is using it.


> But where are real OS's with GUIs in other languages?

There's this thing called Windows you may have heard of...

And, out of curiosity, what OS are you referring to?

> Just because there's not this huge buzz around Lisp doesn't mean no one is using it.

jodrellblank's claim was not that absolutely nobody is using it. The claim was, compared to how wonderful Lisp advocates claim the language is, relatively nobody is using it. If you compare the amount of software written in Lisp to the total amount of software written, and compare that to how wonderful Lisp is claimed to be, jodrellblank has a real point. And citing a handful (or several handfuls) of counter-examples does not refute the point at all.


I don't even code in Lisp, but I knew the Google Flights engine would come up, because it seems to be just about the only "serious" application ever written in Lisp. And it wasn't built by Google, but their acquisition ITA, which hails from Boston and MIT's Scheme reality distortion field. (And even MIT has stopped teaching SICP in Scheme...)


Emacs, ViaWeb, Macsyma, the DS1 Remote Agent, much of the autonomous navigation software for the Mars Rover before the Pathfinder mission... And I have built a number of small-scale web applications in Common Lisp. I'm running one of them in production right now.


And Yahoo! paid 49 million dollars of stock to throw the LISP in the trash and use something else instead.

It wasn't even worth them spending 1 million dollars - 2% of that price - on training people.

http://discuss.fogcreek.com/joelonsoftware/default.asp?cmd=s...


Right. And that decision was surely one of the reasons Yahoo is the runaway success it is today.


Is Autodesk still using it?


It might be that it really is a big secret, and people are quietly using it and gaining advantages from it.

If that were the case, I would still expect to see lots of gushing blogs of leaks from inside hush-hush companies and people desperate to learn LISP posting "how do I replace strings in files in Common LISP" on programming forums, and "I learned LISP and doubled my salary" on Twitter.

Yet what you really see is Steve Yegge and "my EMACS code at Amazon was replaced with Java years ago" and "Facebook working on a new JavaScript thing" and "Microsoft working on a new JavaScript thing" and "Apple working on Swift" and "Fog Creek compile VBScript to PHP" and "rPi comes with Mathematica and Python" and so on and so on.

I wasn't saying that no one is using it. I was saying "if it was the best - as claimed - then there would be a buzz around Common LISP, because huge numbers of people would be using it".

Your points about your career are pretty interesting.


Lisp is old and it's unlikely that you'll see hype anytime soon. But you can see that Lisp-derived languages like Clojure can generate minor hype. Other new languages may even contain substantial pieces of Lisp influence. Examples would be Julia or R.

Generally the industry has problems reusing old/existing technology. See for example the Javascript domain, where new frameworks for web development pop up every week and the lifespan of frameworks is measured in months.

Instead of using/enhancing existing tools, there is a constant pressure to develop new stuff. Or take Apple with Swift. Instead of using Scala, Standard ML, OCAML, F# or Haskell, they developed a new statically-typed functional language.

It's the NIH syndrome at work. Everywhere. But it's also that tools are complex to learn, so people start new with simpler tools, they grow over time and after some time they are replaced with other stuff. If something gets updated in some incompatible ways, it already causes problems: some users are lost, some users will only use the old stuff, some only the latest stuff and some will try to use multiple versions. See Python.

Full Common Lisp is just too complex for most developers, but it has a life in many specialised and niche applications: CAD, some AI tools, music, robots (like the Roomba), planning/scheduling (crews, telescopes, ...), Expert Systems, verification of software and hardware, some maths stuff, ...

Since Lisp is now taught at only a few universities, there are not many people able to develop with it. Even when it was taught, it was often only used to teach concepts like recursion, not programming. The younger Lisp programmers found it by themselves.

'Industry' often has no interest in diversifying its programming tools. Many enterprise software shops currently (still) use Java: standardised, broad industry support, ... You won't successfully propose to them to use Lisp, even if the application would be better in some way. For example, if the project fails, it certainly wasn't Java's fault, because all the others are using it too. If you used Lisp and the project failed, it would be Lisp's problem: not enough people, little architecture experience, tools not broad enough, integration story too weak, etc... Even if it were successful and in production, there would be a lot of pressure to rewrite it in some industry standard in the next product iteration.

> "if it was the best - as claimed - then there would be a buzz around Common LISP, because huge numbers of people would be using it"

There is no general 'best'. It's all relative to a domain, community, demands, legacy, fashion/hype, ...

Lisp is not more dead than usual. Yesterday there was a donation effort started for the Quicklisp library manager and it's now at $16606.37 .


Very good list of some issues that lead to "the new hotness" all the time. I had not thought of some of these; it's not totally brain-dead fanboyism.

One issue that (good) bosses have is that they cannot allow their business to be held hostage by one person. If that person quits (or dies), they have to be able to replace them. Esolangs are a hard sell on that basis alone, no matter how fit for use they may be.

I'd forgotten Clojure (and others!) - yes, Lisp is more popular than "things named Lisp". And, arguably, Clojure is popular enough these days that the argument in the previous paragraph doesn't really apply to it any longer. (Whether bosses know that is a separate issue.)

Then there's the claim by Guy Steele that, in creating Java, he dragged a bunch of C++ programmers "halfway to Common Lisp". (Lisp purists might concede some fraction considerably less than half...) Paul Graham says that the Lisp feature set is slowly taking over programming languages. Lisp may die but still conquer, or mostly conquer.


What Java mostly got from Lisp was parts of the memory model and managed memory, not so much on the language side. Smalltalk got that from Lisp too. Ruby also (the Ruby developer studied the Emacs Lisp implementation). Microsoft's CLR also. Actually, parts of the early Microsoft .NET CLR GC were written in Lisp and automatically translated to C.

But Java could not do a lot of things a typical Lisp implementation can do, sometimes with a good reason for that.

  * runtime compilation
  * loading of code (-> custom class loader)
  * garbage collection of code is sometimes difficult
  * saving/starting of memory dumps
  * fast startup times
  * tail call optimization
  * updating changed objects
  * calling error system -> Java has a terminating error system
For most of these problems some 'solutions' have been developed. On the language level the Java community experimented with some original Lisp stuff, too. Not much was added to the language, but at least they got lambda expressions (1958 in Lisp) and a first kind of multiple inheritance of code (around 1980 in Lisp via Flavors). Java will soon have a standard shell for incremental use ( https://blogs.oracle.com/java/jshell-and-repl-in-java-9 ) - that was around 1960 in Lisp. Now imagine when macros are added...

But generally knowing Lisp does not help you much with Java, since the language's OOP model is very different from the typical OOP in Lisp (see CLOS+MOP).
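
As a minimal sketch of the first item on that list -- runtime compilation -- any conforming Common Lisp can do this with the standard compile function, nothing vendor-specific:

  ;; Build, compile, and call a function at runtime.
  (let ((square (compile nil '(lambda (x) (* x x)))))
    (funcall square 12))   ; => 144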


>Nobody uses it, but it's the best choice, both can't be true

Sure it can. It is possible for both to be true, as long as developers don't pick languages rationally, which they probably don't.

There are a few biases at play. One is that people are only exposed to a subset of the languages that exist, which are those that are either used in industry, or are making the rounds in news. Another bias is that we like to pick languages based on familiarity. For example most of my college courses used imperative languages: Python, C++, and Java. From that experience I am quicker at thinking in terms of for loops than folds and maps. So when I try a language like Haskell or LISPs I think "That's neat!" but when faced with a deadline I switch back to something closer to my first languages.

It could be that I am the only one that thinks like this, and everyone else sits down and spends equal time on every language in existence, but I doubt it.

In any case at one point LISPs were popular, so why did they lose all of their momentum?


I also doubt it. But LISP dates back to 1958.

JavaScript is from 1995.

"I chose the language which exists" doesn't apply to the people in 1994 who could have had /thirty-five years/ of LISP experience (potentially) and yet still chose to write another language, in another language. And when Brendan Eich was in college around 1981, the tutors there could have had /twenty years/ of LISP experience, but weren't there to convince him that it was amazing.

Same with literally any language dating post-LISP, and I note that that covers most languages which are popular today.

"Programmers don't pick rationally" is fine on the small scale, but accross the entire industry, even among people who do love exploring programming languages, even among young entrepreneurial risk takers, even among companies in tough markets angling for any and every edge they can get over their competitors, in decade after decade, over problem domain after problem domain after problem domain, there is this empty howling wasteland of happy and productive people using not-LISP, writing world-conquering systems that do just-fine-thanks and the claimed benefits of LISP just don't seem to be making any noticable dent in anything.

Therefore, they are over-hyped.


I agree that claims for Blub (for any value of "Blub") are too often over-hyped. But don't make stuff up in the course of justifying your reasonable point against hype.

I met John McCarthy in 1977. In college, a professor (Ruth Davis, I think; SCU EECS department) brought in someone who taught Friedman's "The Little LISPer". By the time I got to Netscape, I had read SICP. I knew enough about LISP to be dangerous.

But as I've said many times, and Netscape principals have confirmed, the reason JS isn't Scheme is because Netscape did the Java deal with Sun by the time I hired on (after I was recruited with "come and do Scheme in the browser"), and that meant the "sidekick language" had to "look like Java".

There was no "chose to write". Netscape management gave the MILLJ order; Netscape source was C (mostly), JS ("Mocha") was supposed to go server-side via "LiveWire" as well as embed in Navigator, and I was a C hacker. These are the reasons for the new language and its first implementation language.


> But don't make stuff up in the course of justifying your reasonable point against hype

Where's the fun in that? ;) No, but really - I apologize for claiming there wasn't enough general LISP enthusiasm around when you were at college to affect you, and for not actually looking into the history of JS before making statements about how/why it happened.

JS design being partly a pre-made business decision - that's something I really should have known by now.


You are overlooking a crucial fact: very often people who want to use Lisp are not allowed to by their management precisely because "no one uses it" and so the experiment never gets done.

I was the lead engineer on the first release of AdWords. I wanted to write it in Lisp, but I was not allowed to, being forced against my strenuous objections to do it in Java. So now AdWords is a Java success story rather than the Lisp success story it might have been, not because Java is better (it certainly wasn't -- it was a disaster) but because my boss issued an edict.

This has happened to me many times in my career. The one time I was actually allowed to use Lisp in a project that I did not have direct control over (the DS1 remote agent) it was an overwhelming technical success. In fact, management tried and failed to get the software rewritten in C++, so this was actually a controlled experiment.


If that were the (only) issue, we ought to see Lisp used more in startups than in established businesses, because in startups you don't have some manager who doesn't know Lisp and is managing to minimize his perceived (not necessarily actual) technical risk.

But in fact I'm not sure that we see more Lisp use in startups, either. That could be because they aren't taught it in school. (Many startup founders haven't had the chance to pick it up on the job. For that matter, many programmers haven't had the chance to pick it up on the job.)

On the controlled experiment: Why was that? I mean, it has to be possible to rewrite anything in C++ (Turing complete, and all that - but possibly at the price of Greenspun's Tenth Law). Was it because the people who tried didn't know anything about Lisp? (Were they working from the source, or from the spec, or from the program documentation?) Were they just not as good programmers as the Lisp programmers (and don't say "if they were as good, they would have been using Lisp" - that wasn't their assignment). Did they have less time, less resources? Why didn't/couldn't they do it?


It's all a vicious cycle going back to the AI winter in the late 80's: DARPA stopped funding AI work, which meant that funding for Lisp work dried up, which meant that fewer people used it, which meant that fewer people learned it. Now, 30 years later, hardly anyone uses it, at least not directly. But it keeps getting re-invented again and again. You can't avoid re-inventing Lisp because it's part of the fundamental physics of computing. That is my frustration. People say that Lisp sucks, and then they proceed to re-invent it badly, not even being aware that that is what they are doing.

As to why the re-implementation of the Remote Agent (it wasn't the whole thing BTW, just the planner) in C++ failed, it was a combination of factors. This was twenty years ago (holy cow!) and C++ compilers were nowhere near as mature then as they are now. There was only one compiler available for the flight hardware, and it was pretty unstable. But mainly it was Greenspun's tenth: the application did a lot of dynamic allocation (it was an AI search algorithm), which is something C++ is particularly not well suited for. They essentially had to re-implement Lisp in C++, and that was hard. Of course it would have been possible given more time, but we didn't have more time. The Lisp code was working, so that's what we flew.


Supposing the AI winter never happened, what is your perception of where Lisp would be now? Dominant? Large niche? Small niche, but larger than present?


I have no idea, and it doesn't really matter because we can't go back. I really don't want to dwell on the past, except insofar as we can learn lessons that inform the future. The point I really want to make is that Lisp is still a viable option today (Common Lisp and Clojure in particular) and I would like to see people give it a fair shake going forward.


> Nobody uses it

This application is written in 7+ million lines of Common Lisp

https://www.ptc.com/cad/elements-direct/modeling

https://www.youtube.com/watch?v=mJGytRaNvec

The video shows how Eterna uses 'PTC Creo Elements/Direct' to develop their watches. There are many other clients of that in various domains.

Also:

https://www.youtube.com/watch?v=4rD0zmA-Trc


Good cite.

(Although the fact that some people can make successful Common LISP software could just be that those people are exceptional enough to make successful software in any language, it doesn't bolster the claim that 'Common LISP is one of the best choices for many applications' very strongly).


Inmarsat uses a Lisp application (G2) to monitor their satellites and to provide a high level overview of the status of all Inmarsat services in real-time.

http://www.inmarsat.com

If you would run a Cement plant, an oil pipeline or a coal terminal in South Africa, then you might also be using G2 to control and diagnose technical processes in real-time.

The product used, written in Lisp:

http://www.gensym.com/wp-content/uploads/Gensym-l-G2.pdf


> Then why don't people build amazing and popular things with it?

The reason is because "best" can have more than one meaning. Languages are usually only "best" at one thing, not for all things. In the case of many languages that people consider to be "best", they've been optimised for the development process (making it nice to write code in) rather than speed (C) or security (Ada) or a specific niche (R) or maintainability (not sure) or customisability (LISP). They're designed to make writing code more straightforward. Languages that are optimised to be easy to code with are the languages that tend to bubble up to the top when it comes to popularity contests because the more people who can access the language the more popular it's going to be.

There is an argument that Ruby is mostly LISP, so it could be said that lots of people use it very happily everyday.


I would go with that, in the sense that "here's a box of parts" is the most customisable way to solve a problem - yet not the best way for most people.

Except, even people doing exploratory stuff apparently aren't using Common-LISP.

Let me point to Slava from RethinkDB, who wrote blogs about how amazing LISP is and then went to found a new-software startup written in C++. Peter Norvig, AI researcher, who moved from LISP to Python.

And even your mention of Ruby - if "Common LISP is the best language", then why is Ruby even needed? Why aren't those people happily using Common LISP? Simply because it wasn't actually any better, and was in fact worse than Ruby by whatever metrics.


> There is an argument that Ruby is mostly LISP, so it could be said that lots of people use it very happily everyday.

How is that? I thought Lisp's "killer app" was its extremely powerful macroing system. Does Ruby have an equivalent?


It has class-level declarations, which do the same thing.


Sure, in practice people's beliefs about languages can become self-fulfilling prophecies. And the thing about self-fulfilling prophecies is that they are actually true. For example, Javascript is tremendously useful even though it is a horrible language from a technical point of view; it's just that the sheer weight of people using it because everyone else is using it results in enough infrastructure that you can get useful things done in Javascript despite its (lack of) technical merits, not because of them.


And when JavaScript was becoming a thing, where was the better alternative written in LISP and the community of people who like better software supporting it?

Nowhere, that's where. Like in approximately all problem domains, whatever perceived benefits LISP has, doesn't seem to make it all that desirable.


Please. You pick a lowest-common-denominator language like Java or Golang so that you have a large pool of cheap, semi-sentient code monkeys to hire from.

If you choose LISP, you're actually going to have to expend a little bit of effort on identifying, recruiting, and retaining actually competent people. That's a problem when you're looking to put butts-in-seats (as is the case in my industry: defense).


That just pushes the problem back a couple of decades. Common-LISP is from 1984, GoLang is from 2007.

In twenty years, Common-LISP with all the advantages it supposedly has couldn't make anything that a 'code monkey' could use? Why not? Is it awful at writing tools?

Or is it that the alleged advantages of Common-LISP don't really make any difference in reality because they are over-hyped or even non-existent?

Note that I'm not saying "why did they pick one language over another", I'm saying "Common LISP can be the best tool for the job - yet they waited twenty years to use a 'worse' tool to implement a 'worse' tool". That doesn't make sense.


You are writing this comment in Lisp-based software. Yeah, I know, Hacker News is simple software and could have been written in anything else, but, somehow, the author, who I've heard is very smart, thought that Lisp was the best pick.


And his secret to winning big was "write lisp software, have it bought out by another company who will rewrite it in a different language to make it popular and useful".

Really, if PG's "LISP, how to win big" was as compelling and important as he tried to state in that essay, every ycombinator company would be using LISP for their secret advantage over other companies.

Applicants would choose LISP because it's "better", LISP-based applications would be preferred because the applicants have an advantage, existing companies would be encouraged to use LISP because it would help them do more with fewer employees, and YC would become a LISPy community of alumni.

I have no proof that this is not happening right now, but would you bet on it happening right now in secret?


Your logic assumes that young founders/developers are very proficient in many languages (including Lisp) and that they have perfect information for choosing. In reality, they use whatever crap they picked up hacking in high school. On top of that, the tools are only a part of the ingredients. However bad your platform is, at least it works, while most of the screw-ups hide in business decisions - that's why every startup incubator's job is to help the youngsters not screw up business-wise. Tech is normally left up to the founders, most of whom have never heard of Lisp.


> And that makes me grumpy sometimes.

If you will excuse a moment of cheekiness...

Could be that you have cause and effect backwards here: because you are grumpy you are dismissing 99% of other language developers' work as rubbish. Could be that it's less than 99% and you are overlooking some great ideas.


That's not cheeky, it's a perfectly fair question. Yes, that is certainly possible. And there have been a few cool new ideas that have come along that are not easily subsumed by CL, like Haskell's type system. It's easy to implement Hindley-Milner, but actually using that information to inform the compiler, plus adding laziness as a core language feature, is much harder. But I think the jury is still very much out on whether or not Haskell is really a net win.

But the most popular language on github at the moment is Javascript, and there is no question that it is simply a very badly designed Lisp with C syntax. This is not intended to disparage Brendan Eich. He had a week to design and implement something, and under those constraints he did a pretty amazing job. But I can't help but imagine how different the world would be if he had used Scheme as a starting point.

Was there something in particular that you had in mind that you think I may have missed?


My take is that Language X usually has major deficiencies compared with Language Y for domain Z, for many values of X or Y: Lisp, Scheme, Smalltalk, Forth, Erlang, Haskell, assembler, C, etc.

Confirmation bias makes it easy to bind those variables to values that make one's own favorite language obviously the best and everybody else's infuriatingly, irrationally terrible. If unchecked then this leads to wildly false conclusions, such as that languages fit into a hierarchy of powerfulness ("Blub Paradox.")


What do you see as Common Lisp's "major deficiencies"? (Pick your favorite value for Z.)

This isn't a challenge, I'm genuinely interested in your answer.


I prefer not to. I think it would be more productive as a private thought experiment: what would you expect a Smalltalk hacker curse about if you forced them to use Common Lisp? a Haskell hacker? a Rust hacker? I think they would miss some really valuable things, and that you could easily miss the value of those things if you took them out of their original context (e.g. considering only whether Lisp would be improved by adopting those specific features.)


that should be pretty obvious: lack of library support.

there is the famous example of the reddit founders, who believed pg's lisp story, and built the first version of their site with it. it went so badly for them that they had to start over again in python.

... but i bet you are going to have a very plausible-sounding reason why it didn't work for them.


I have no idea why Lisp didn't work for Reddit. But Common Lisp has exceptionally good library coverage today, and with Quicklisp, getting access to it is virtually seamless.
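
For anyone who hasn't seen it, "virtually seamless" means roughly the following -- a minimal sketch, assuming Quicklisp is already installed and loaded into the image; alexandria is just a commonly used example library:

  ;; One form fetches (if needed) and loads a library plus its dependencies.
  (ql:quickload "alexandria")
  (alexandria:flatten '((1 2) (3 (4))))   ; => (1 2 3 4)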


okay, maybe i betrayed my biases there too much, but, i agree with everybody else: if lisp was such a great secret weapon, there would be a hell of a lot more visible success stories by now, other than just pg's original viaweb implementation, and some flight routing software.

plenty of other tech has come up from nothing in the last few decades, to wide adoption, and big successes. the fact that lisp hasn't is, in my mind, prima facie evidence that it is not nearly as great as its proponents claim.


> there would be a hell of a lot more visible success stories by now, other than just pg's original viaweb implementation, and some flight routing software.

There are, but you are ignoring them: either you simply did not do any research, or it is simply hidden.

The success of Common Lisp today is relatively small, but a careful reader could find a few interesting applications of it, like the scheduling system for the Hubble Space Telescope, the design software for the Boeing 747 (and other aircraft), the software for the Roomba, the software for DWAVE's quantum computer, the crew scheduling software for the London subway, chip design verification software at Intel (and other chip companies), ...

There are some old application platforms which survived. For example Cyc, an attempt to provide common sense AI to computers, is under continuous development since the mid 80s. The company Cycorp has 50+ employees, is very secretive and you need to guess who pays for it. Customers are among others the United States Department of Defense, DARPA, NSA, and CIA. They are using it for various applications.

Note also that prototyping software was for a long time an application area for Lisp. Have relatively small teams develop a prototype and make it a product once the idea is validated. Example: Patrick Dussud wrote the core of the first Microsoft CLR (.Net Common Language Runtime) garbage collector in Lisp. The code was then automatically translated to C (IIRC) and enhanced from there after some time. Lisp now is no longer used and the GC has a lot of new features, but the first working versions came from that Lisp code.


> if lisp was such a great secret weapon, there would be a hell of a lot more visible success stories by now

Not necessarily. There are other possible explanations of Lisp's relative lack of commercial success, not least of which is the fact that a widespread belief that "there must be something wrong with it because no one uses it" can become (and I think has become) a self-fulfilling prophecy.

But another important factor is that the Lisp community seems to attract people who are really good at tech but really bad at business. I think if someone (or, more likely, some pair of co-founders) could bridge that gap they could still kick some serious ass.


Just a data point: I founded a Lisp startup together with a bunch of experienced Lisp hacker buddies from the SBCL community. Sadly and reluctantly, we found Lisp awkward and ended up rewriting everything in C, and then never looked back.

These days I am developing such software with LuaJIT and that is working much better for me than either C or Lisp.

One thing I learned along the way is that many tales of Lisp heroism are actually anti-paradigms. Once upon a time when I read about ITA Software rewriting CONS to better suit their application I thought it was impressive; now I see it as a farcical workaround for having chosen an ill-suited runtime system and sticking with it (and generally an indictment of Lisp not providing a practical performance model for the heap.)

Lispers are too expert at spinning bugs as features. "It's insanely complex, every line could be an interaction with undefined behavior or a race condition or an unexpected heap allocation" becomes "suitable only in the hands of trained specialists, like a chef's knife or a surgeon's scalpel or a Jedi's light saber."

I feel like we need to have a shared "our emperor didn't have any clothes" moment with regards to Paul Graham's essays.

(I say this as somebody who does love Lisp and will probably do a lot more Lisp work in the future but only on a project that is a peculiarly good fit.)


(Funny feeling of being a Lisp hacker searching for catharsis in the Hacker News comments section... :-))


Don't search, go do something cool that all people will talk about!


Tangentially: I am working with LuaJIT these days and this feels really exciting to me. The compiler is a new kind of beast and possibly the beginning of a large Lisp-like family tree. Feels very "MACLISP" to me - exciting!


Some functional languages make certain behaviors implicit, such as partial evaluation and laziness. However, these work better if they are explicit. They work better because one of the two is severely confusing when implicit and the other potentially performs badly.

  C:\Users\kaz>txr
  This is the TXR Lisp interactive listener of TXR 162.
  Use the :quit command or type Ctrl-D on empty line to exit.
  1> (defstruct integers ()
       val next
       (:postinit (me)
         (set me.next (lnew integers val (succ me.val))))
       (:method print (me stream pretty-p)
         (format stream "#<integers ~a ...>" me.val)))
  #<struct-type integers>
  2> (lnew integers val 0)
  #<integers 0 ...>
  3> *2.next
  #<integers 1 ...>
  4> *2.next.next
  #<integers 2 ...>
  5> *2.next.next.next
  #<integers 3 ...>
Why would I want implicit laziness everywhere? The best of all worlds is to have expressions reduced to their values eagerly before a function call takes place.

When I don't want an expression evaluated in (what looks like) a function call, I can, firstly, make that a macro.

If I really want lazy semantics, I can have a decent vocabulary of lazy constructs that fit into the eager language. For instance for making objects lazily I have lnew, distinct from new.

Implicit laziness everywhere is academically stupid. You're drowning the execution of the code in an ocean of thunks and closures.

The best pragmatic approach is a compromise: keep everything explicit and visible, yet syntactically tidy and convenient.


> Why would I want implicit laziness everywhere?

Modularity; see the stone age paper discussed yesterday: https://news.ycombinator.com/item?id=13129540

> Functional programming languages provide two new kinds of glue - higher-order functions and lazy evaluation. Using these glues one can modularise programs in new and exciting ways, and we’ve shown many examples of this.

> This paper provides further evidence that lazy evaluation is too important to be relegated to second-class citizenship. It is perhaps the most powerful glue functional programmers possess.
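To make the "glue" concrete, here's a toy Haskell sketch (my own example, not from the paper): the producer is an infinite list that knows nothing about the consumer, and laziness lets the consumer decide how much of it ever gets computed.

    -- Infinite producer: all the squares.
    squares :: [Integer]
    squares = map (^ 2) [1 ..]

    -- Consumer: first square above a bound. Only as many squares as
    -- needed are ever evaluated, even though 'squares' is infinite.
    firstSquareOver :: Integer -> Integer
    firstSquareOver n = head (dropWhile (<= n) squares)

    main :: IO ()
    main = print (firstSquareOver 1000)   -- prints 1024

In an eager language you'd have to fuse the two, or reach for explicit generators or streams, which is exactly the modularity point the paper is making.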


The paper claims in its conclusion that it has provided evidence (what is more, "further evidence") yet I can't find any in there.

It argues that you can achieve a certain useful separation between two programs when one produces data for the other.

This can be achieved in a very satisfactory way with explicit streams (i.e. lazy lists). It can also be achieved with delimited continuations, coroutines, threads and often with plain lexical closures. Not to mention Icon-style generators.

Lazy lists can be incorporated into the language so that their cells are first-class objects and substitute for regular eager cells smoothly. (Thank you, OOP).

The paper is actually wrong there, because laziness alone will not provide the kind of separation in which g can begin executing and f then runs only when g demands another item. Not for an arbitrary f! Suppose f traverses a graph structure recursively and yields some interesting items. Lazy evaluation alone isn't going to let the f traversal behave as a coroutine controlled by g, proceeding only as far as g continues to be interested in further items. The author is attributing to lazy evaluation magical powers that it doesn't have.


> Why would I want implicit laziness everywhere?

For the same reason you want automatic memory management: so you can fob off the job of figuring out where the thunks should go onto the compiler, just as you fob off the job of figuring out where the calls to malloc and free should go. At least that's the theory. It seems plausible to me. I think it's an open question whether my failure to grok Haskell is due to a problem with Haskell or the ossification of my brain.
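To make the analogy concrete, a minimal sketch (a toy example of mine): you never write the thunk, just as you never write free(); the compiler and runtime decide what actually gets computed.

    report :: Bool -> Integer
    report verbose =
      let expensive = sum [1 .. 10000000]   -- never forced if verbose is False
      in  if verbose then expensive else 0

    main :: IO ()
    main = print (report False)   -- prints 0; the big sum is never computed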


It's not the same. Here is why: the program correctness doesn't depend on when (or even whether!) that automatic memory management happens. Lisp systems have been bootstrapped without having a working garbage collector upfront. Short-lived Lisp images run as processes in a conventional OS might never have a chance to collect garbage.

Laziness has precise semantics which has to unfold properly, or else things don't work.

Delaying evaluation is not the same thing as delaying reclamation. They are opposite in a sense, because we only allow something to be reclaimed when it is "of no value".


Not trying to troll you here...

To what degree do you use Lisp as an FP language? As a pure FP language? The forced purity may make Haskell a very different language.

And, to return to your original complaint: If you dislike new languages, I bet XML drives you straight up the wall...


> To what degree do you use Lisp as an FP language? As a pure FP language?

It depends on what I'm doing, but I generally write in an OO style more than a functional style. Real problems have state.

> If you dislike new languages

I want to be clear that this is just a general observation. I don't dislike new things because they are new, I tend to dislike them because they are generally bad. But they are not all bad. Clojure is cool. WebASM is very cool. The work that has been done on Javascript compilers is nothing short of miraculous (even though the language itself still sucks).

> I bet XML drives you straight up the wall

Kind of, but not really. Yes, I dislike XML because it is nothing but S-expressions with a more complicated syntax. But it doesn't drive me up a wall because when I need to deal with XML I just parse it into S-exprs, do what I need to do, and render the results back into XML.


I dislike XML because it's a nested tree of internal nodes and typeless character string leaves.

In XML I have no way to place "255" and "FF" in such a way that XML understands them to be the same object, of integer type.


Sure you do. <base10>255</base10> <base16>FF</base16>


Pure FP deals with state!


Sure, everything non-trivial is Turing-complete. But FP begins as a stateless paradigm and then tacks on state as a sort of a kludge while all the while seeming to be a little embarrassed about it, while OO embraces state from the beginning as part and parcel of the mental model that it endorses. I find the OO model has a better impedance match to my brain and the real world. Reasonable people can (and do!) disagree.


State is something that is best just embraced rather than "dealt with".


An analogy to that: OO doesn't embrace IO either. Yet weirdly enough that makes OO-based IO better than older languages that have built-in commands to write to disk.

Haskell doesn't have state built in, but you have multiple models to choose from, from simple folds to STM or the State, Reader and Writer monads, all of which serve different purposes and do different jobs well.
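For instance, a minimal State-monad sketch (assuming the standard Control.Monad.State from the mtl package), threading a counter through a computation without any mutable variables:

    import Control.Monad.State

    -- Label each element of a list with an increasing counter.
    label :: [a] -> [(Int, a)]
    label xs = evalState (mapM step xs) 0
      where
        step x = do
          n <- get        -- read the current counter
          put (n + 1)     -- update it for the next element
          return (n, x)

    main :: IO ()
    main = print (label "abc")   -- [(0,'a'),(1,'b'),(2,'c')]

It's just a library type; the "statefulness" is ordinary value passing under the hood.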


OOP absolutely embraces I/O. I/O begs to be OOP and makes, hands down, the best use case for illustrating OOP.


What I am getting at is that OOP languages like C++ have no IO commands built in; it is all delegated to libraries.

Haskell has no state support built in; it is all delegated to libraries.

So:

C++ has excellent IO support, but the language doesn't embrace IO at all.

Haskell has excellent State support, but the language doesn't embrace State at all.


C++ I/O libraries in fact depend on the sequencing semantics built into the language. If we make two calls to the library, they happen in that order; consequently, the I/O happens in that order. We can do wrong things like:

    f(cout << x, cout << y)
where we don't know whether x is sent to cout first or y.

I/O statements could be added to C++ (e.g. as a compiler extension). They would be straightforward to use; C++ doesn't inherently reject that the way Haskell and its ilk reject sequencing and state.


Haskell doesn't reject sequencing. f = g . h will require h is evaluated first.


h might not be evaluated at all! Consider

    factorial x = product [1 .. x]   -- defined here, since it's not in the Prelude
    h x = factorial x
    g x = 0
(but I agree that Haskell doesn't reject sequencing).
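A quick GHCi check (assuming the definitions above are loaded):

    ghci> g (h 1000000)
    0
    ghci> g undefined
    0

undefined throws if it is ever forced, so h's result really is demanded only if g actually uses it.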


Yeah, the IO monad will, but that isn't generally true of monads. In fact the Maybe monad will cease early on Nothing by design. So it is a brain shift.
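A tiny sketch of that early exit (toy example):

    safeDiv :: Int -> Int -> Maybe Int
    safeDiv _ 0 = Nothing
    safeDiv x y = Just (x `div` y)

    calc :: Int -> Int -> Int -> Maybe Int
    calc a b c = do
      x <- safeDiv a b   -- if b == 0, the whole block is Nothing
      y <- safeDiv x c   -- never reached in that case
      return (y + 1)

calc 10 0 5 is Nothing without the second division ever being attempted, whereas in the IO monad every action in the block would have run in order.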


> Haskell has excellent State support

Someone who understands where that support is and how to use it should rewrite atrocities like:

https://rosettacode.org/wiki/Assigning_Values_to_an_Array#Ha...


I don't think that's so bad; it's just that you are using a function instead of the usual built-in indexing operator [0]. Here's something a bit more convenient using Data.Array.Lens[1].

    arr ^. ix 2
vs the original:

    readArray arr 2


Failure to grok Haskell is probably due to the lack of easily accessible literature on it. Plus, with so few other people grokking it, there isn't as much osmosis available.


But I can't help but imagine how different the world would be if he had used Scheme as a starting point.

I've heard that he did in fact use Scheme, but the suits insisted on the whole curly-brackets-and-semicolons thing so they could call it Javascript and piggyback on Sun's marketing efforts.


To be fair, he only took some of the semi-colons.


He took an option on the semi-colons.


I'm certain lisper [parent] knows these, indeed may well have worked with PG in the past, but for those who don't, here's some useful linky: http://www.paulgraham.com/lisp.html


Isn't there a good, big community around Clojure now?


Depends on what you count as "big". If you look at languages used on github, for example:

http://githut.info

Clojure is doing OK (better than Common Lisp), but not great (worse than Haskell and Emacs lisp).


I bet more people are getting paid to do Clojure work on production systems than Haskell or Emacs lisp.

That's a really nice web site. It took me a moment to realize it's showing quarterly data (sadly last updated in 2014), and that the ranking is based on the first metric ("number of active repositories").


Not quite as much activity as Scala either. I think the good thing is that functional languages as a whole aren't going anywhere, it's just that the jobs for them are spread very thin over several languages.


Maybe because in Clojure you achieve more with less effort :)


Out of curiosity what flavor/dialect of LISP do you recommend? Common or something else?


I like Common Lisp, and Clozure CL in particular, but it doesn't really matter all that much. The cool kids seem to be using Clojure (with a J, not a Z) nowadays. Pick whatever works for you.


"That gave me the ability to see that 99% of "new" technologies were really just poor re-inventions of (parts of) Lisp"

"But no one knows it..."

It'll be really hard to work with a 23 year old with that attitude too.


Ageism is real in the Valley but it can cut both ways. In my current company we did a SWOT analysis and one of our advantages was "many old farts, and many former colleagues and friends".

Yes, if you haven't been using AWS and GPGPUs you will be of minimal use to us but it's really valuable if you have already made a bunch of mistakes on someone else's dime. And in your 50s you're probably an empty nester, and can easily put in a 50+ hour week when necessary (which it often is, but not all the time, in a startup) and get more done in 40 hours than the squirts do in 60+.

You need a mix of ages and backgrounds. A bias to youth is as bad as a bias towards time-in-grade, or any other such bias.


> Yes, if you haven't been using AWS and GPGPUs you will be of minimal use to us

> already made a bunch of mistakes on someone else's dime.

What is it with this industry's lack of interest in investing in people, while demanding so many hours that you can't learn anything new, at least nothing deep? Why not just make everyone a contractor, but pay them appropriately as a result?


I guess I phrased that poorly! I meant battle scars are important: not everybody will have them (some will acquire those scars in the future). In other words, experience is important.

Sometimes experience gets in your way ("No, a scripting language will be too slow for this") and other times it saves you a huge amount of grief ("Really, we had better plumb that now, it will save us a lot of pain down the road").

Not sure I suggested many hours (quite the opposite) or not learning anything new.


Maybe it wasn't phrased as intended, but my point is that few companies are willing to take someone on who hasn't done tech X, even if they are seasoned in a broadly similar tech Y and have all of the communication, teamwork, leadership, problem-solving and domain skills. Because they need to "hit the ground running".

The flipside is that of course employees don't hang around long in IT, as they need to keep moving to gain the experience that makes their CV look reasonable. So I guess it is perpetuated on both sides.


It makes sense to plug https://oldgeekjobs.com

I've taken a small break from working on it, so the traffic has died down some, but feel free to post your tech jobs for free for the time being.


If you're looking for help, drop me a line. As a young(er) engineer, I live in terror every day of what may come down the road.


That's kind of you Dan. Yep -- I remind myself that ageism is the one form of discrimination that eventually affects everyone. There might be more...I dunno, but it keeps me on mission.

I run a Medium publication at https://blog.oldgeekjobs.com. If you've got writing chops, I'm looking for contributors. That goes for anybody. Thanks again.


I would like to help you as well. I'll ping you on Twitter.


Feel free to e-mail if that works better. john@johnwheeler.org


I got in touch through twitter but will email you tomorrow morning. :)


What kind of contributions are you looking for?


Blog posts that can make the front page of HN and/or Reddit. Let me know if you're interested in collaborating


Any update on how the site is developing?


Traffic has been driven by HN. I haven't had any blog posts make the front page of HN for a month (i haven't written any). I've added some new features like sort by date and salary.

I've made it free to post, and I'm going to keep it that way while I get some more blog posts out.

I tried pitching to some reporters with no luck--I read a good book "This is how you pitch" by Ed Zitron, and I'm going to try some of that. Turns out I didn't know how to pitch at all.

Also got an open source project I'm going to try to incorporate into some of the blog material. It's not done, but I might as well plug that here too. Worked on it all last month.

https://github.com/johnwheeler/flask-live-starter

Thanks for asking!


Unsolicited suggestion: it's very interesting to follow what entrepreneurs are up to, the ups and downs of launching a business, so you should consider blogging about your progress. This is something the indiehackers guy does and it's fascinating to read.

Good luck with oldgeekjobs!



I'm 39. Ageism is real, but I'm gonna tell you to quit whining and grow a pair; what the hell is wrong with you if you cannot sell your strengths, wisdom and experience?

(BTW: just so it's clear I apply this same rule to everyone - this is the same advice I would give to any discriminated group, whether women in tech, people of color, Jewish or Asian or whatever group that find themselves on the victim side of discrimination)

Stop thinking of yourself as a victim. Be razor clear about what your value is.

Almost everyone will have some circumstance where some trait of theirs counts against them; that is life, but these are merely speed bumps along the way, not show stoppers.

If you can't overcome bias at one specific company or in one specific country, move or do whatever YOU can do to solve it. Crying aint gonna fix it.


Isn't the point that you don't get a chance to demonstrate your value because you are overlooked earlier on in the process where some hiring manager sees how old you are and decides you aren't "a culture fit" for the company. Even if you being hired will make all us 20 somethings 20x more productive, you won't even get the chance?

Ageism, like most *isms, isn't something that you can simply throw individual willpower at. Anti-sexism isn't wanting for women with willpower; it's a problem that lies beyond individual power, and requires collective power to combat.

> Crying aint gonna fix it

I think most people know that. But some people still deny or ignore the existence of these forms of discrimination, so speaking up raises awareness.


An interesting thing happened to me in college. As I progressed through it, I found I learned more complex stuff with less effort. A big part of it was I was able to see what was important to learn and not waste time on irrelevant things.

As I get older in tech, I have a similar experience. Of course, one can go too far and think everything new is irrelevant :-)


Young software developers scope out projects (the time required to develop them) very differently from older engineers. It seems like the industry has forgotten the role of the QA engineer! With experience, you know that you need time for architecture, risk planning and putting things in production. I see younger engineers quoting 1-2 months for just about every project, earmarking just enough time to put together some frameworks and write basic code. Cloud technologies and new frameworks definitely do make building projects easier. But scaling up a product is still not very easy, and casual usage of several new frameworks comes back to bite very often.

I just hope people stop being so hurried about seeing the first cut of their products. That itself would fix some issues around this topic.


This won't change. They've convinced themselves that it is best to move fast always.


I think it's more about not appreciating experience enough as opposed to discriminating against age. I'm 31 with 13 years of professional experience. I've worked as an employee, a contractor and as a single-person start-up. Last year I interviewed for ~10 jobs in SV and I was rejected for all of them. None of the jobs valued my experience enough to consider my asking salary worth it. I make ~$300K USD contracting and was willing to take a pay cut to ~$240K to work at a big company and for access to big problems. I found out, via a friend, that one of the jobs I missed out on was due to my asking salary. They ended up hiring a guy with much less experience for ~$120K for the position. A year later the project failed due to lack of experience, costing many millions of dollars. I wanted them to succeed and I know that I could have done it, but I'm not going back to $120K - at $300K customers only let you work on important stuff, no busy work. This story gets repeated over and over. There is a culture problem that doesn't value experience. It's not my problem because I'll go to where my experience is most valued; it's SV's problem because it results in failed projects and wasted money.


This is not specific to SV, or the US for that matter. 10x developers don't get 10x salaries if they work for bigger companies that have sort of rigid pay structures for given positions. You have to work on your own ("in your own company") to get that in 99% of cases; the rest are tiny companies with heaps of cash (how many of those are there?).

If you were to have the same salary (apart from bonuses) as, say, some senior executive/CEO, well, that won't fly, no matter your added value or semi-magical skills. It doesn't matter how rational that might be.



"I have taken big cuts in salary three or four times in my career. I’m talking 10-20 thousand dollars a year." - HOW can you do that? How does one justify that kind of long-term damage to ever being in a position that you don't have to work?

I understand the idea of it; to move into a technology or business with more room for growth, but if you don't have the time in your career to benefit from that growth, how can you do it?

If you're in your late 40s or early 50s, with a mortgage and/or children, the point where you can choose not to have to work anymore may be dangerously close. One badly-timed layoff, one forced pay cut, and you're stuck.

Especially if you're in that age range; the last generation that ever had some hope for long-term employment with an employer, the generation that saw pension plans converted to 401Ks and didn't understand just how much you personally needed to take over funding your retirement yourself. You're dangerously close to having one badly timed layoff or large pay cut snatch the choice of not having to work away from you.


I'm always curious whether Zuckerberg stands behind that quote or if he looks back on it as part of a youthful hubris he now regrets. I know if I look back on my early 20s I remember a lot of similar arrogance that I hope I've shed at least some of today.


I worked at IBM Research, where one of the top experts in malware research was a 60 y/o guy with a big copper bell hanging on a rope, and he dove into assembly-level code like it was water.

Smart companies want smart solutions to tough problems.

Assuming that the solution will come from a certain audience, selected by age/race/gender, is a recipe for failure.

Smart companies are after smart people. Who cares about anything else.


about the value of experience, I had it explained to me like this:

  If you see a toddler running after a ball that rolls under a coffee table, bending over to go under the table and 
  to pick up the ball... 
  You know what's going to happen next. That's experience.
  There are just different balls and coffee tables.


I'm a 42 year old technologist who just got laid off last week, and I very much dread age discrimination. Although I'm told that I look about 8 or 10 years younger than I am, it still scares me. And, funny enough, I'm about as energetic - or very close to it - as I was when I was 22. Beyond all the great notes that the blog post's author wrote, I think the part about "exude energy...it's crucial to be spirited" really hit home with me. Especially now, with my current situation, I appreciate this blog post!


"You know what they do with engineers when they turn forty? They take them out and shoot them." - Primer (2004)

One of the biggest killers of sedentary professions is heart disease, which is the number one killer in the US. People who work at Microsoft have told me they give soda, sugared "juices" with artificial flavors, coffees and teas to their employees for free. These all are sources of heart disease, but the young engineers drink it up like a lost caravan in the desert. It's likely not that different in any of the other areas as well.

The cost for ailments such as heart disease is known to be one of the most expensive in the US. It requires extensive testing, support, medications and visits to the doctors and hospitals. Considering the extremely high health care costs the US enforces, this places a huge dent on insurance premiums companies pay as well as accommodation, etc.

These companies want them out before they have to pay more premiums on health-care and other factors related to health, age, seniority, etc...


Get a job in an industry that uses tech, but isn't itself pure tech (healthcare IT for example). Then your years of domain experience matter.


This is some of the best advice in this thread. There are definitely domains within technology where experience is a key ingredient, but the best insurance against age bias is a deep understanding of technology coupled with a deep domain you have experience in.


Umh... http://www.kettlerusa.com/?fullSite=&cartId=&division=kettle...

The design. The URL. Clicked "add to cart" - default JS alert popup. Then some HTML popup showed up saying something about my IP being banned because of a BOTNET? What? Now, refreshing the page, I get a timeout.

This is literally the worst eshop I've seen in the past few years. I wouldn't hire a man who made this atrocity and sure as hell wouldn't like working with him.


Yes! The most important thing I learned in academia is to never stop learning and to always push yourself to never get stale. Learning doesn't stop at high school, college or grad school.


Existence proof. Can avoid ageism. But likely only via competence, creativity, and rep.

I've been lucky, I guess. I would add to continuous learning: continuous invention. New stuff in the world, not just new to you. In my case, an almost entirely unplanned drunkard's walk of a career that I don't recommend anyone emulate, but it sure has been a wild ride.

I'm 69 last month. Started 50 years ago inside discrete component technology mainframes. Done hardware logic design, firewire and Medialink FPGA, datacomms Hard Real Time embedded firmware, synchronous and asynchronous comms protocol design, real time networking design in the Music world (before anybody thought it possible, we showed em!:), created cool 4G visual programming languages, lately Audio DSP inside gaming consoles.

Moral: keep inventing; it keeps you young, more so than just learning new stuff you won't be using until it is obsolete. A wide T LI profile doesn't hurt either. But that is not an afterthought, it's a side effect of your lifetime of energy and obsessions.

Vaguely thinking of retiring when I'm 72 or so. Want to make more music, DSP takes too much time.


I'll probably be ~36 by the time I complete a PhD in CS. Should I be OK in terms of finding jobs at hip companies upon graduation (big 4, etc)? My areas of interest are ML and algorithms, but I'm completely OK with normal software development positions.

Reason I'm older is I decided I had a passion for it in my late 20s / wanted to do research, but had to go back to school to take classes before enrolling into a PhD. Hope my age + PhD wouldn't hinder me for software development jobs?


My experience has been that if you've got the chops, you've got no worries at all. Also, fashion is fickle, but right now you'll have people beating down your door to hire you if you're skilled with ML and algorithms.

I am in my 40s.


Thanks for your very encouraging words. You really brighten my day. I've always had an interest in theoretics/foundations and believe a strong mathematical backbone leads to better understanding of these fields. I simply couldn't get that working a normal job, so I know I made the right choice to pursue a PhD. Many, many people recommended against it (especially my friends who are pulling big salaries in industry), but for me it was always about the intellectual pursuit.


If you don't mind sharing, what school are you going to, and what is the area of your research (the more details the better)?


In my experience, age is measured from when you finished your education and not from when you were born.


education is a lifelong pursuit.


I'm obviously referring to the last degree that the person received and not to learning in general.


Related to fitness: I never went to the gym. However, I worked on a fitness-related site for some years, and as a lead there I wanted to be a reflection of this site I cared so much about. So I did what I like: walking, and running up the stairs.

Whenever there are stairs, I run up them. Again, I am not super fit, but I don't lose my breath; everyone else kind of does. This gives me great pleasure.

I am an older developer, and it is getting hard, I will not lie. But I am on top of the latest technologies, I enjoy my work, and I am a pain in the butt.


Perhaps I'm a simple creature, but running up stairs is one of the greatest pleasures in life ;-)


54 yo here. Although I have a bachelor's in EE, I have been developing software since the beginning of my career. I am not necessarily nostalgic but I sometimes ponder about the real time assembly I wrote for a Telex switch, or the CHILL code I wrote for a telephony switch. All of it pretty much gone, not being produced anymore. Or the Windows 2 GUI written in C, the multi tier apps in VB6 and Powerbuilder. They are gone too.

However, having participated in these projects gives me a lot of perspective, and I notice that nowadays I tend to write less and be more thoughtful. I see others here with similar experiences.

Now as for getting old, there is one thing I recommend: get old but never allow yourself to look decrepit. Be always clean, well dressed. Don't complain about your back hurting, or show off the medicines you take. No one likes to be around sickness or weakness, so pretend to be healthy and strong (or try to be, even better). I aim for this reaction - "that guy looks good for his age". This usually helps with ageism, at least in my experience.


FWIW, I'm still on the earlier side of my career (though certainly not just starting out) and the direction of my learning has been away from fads and towards older, more established technologies. Erlang, Unix, etc. have all been around for many years, and while they do receive active development are still pretty well established and based off of sound engineering principles.

I also try to get beyond the hype. REST/HATEOAS is cool? What about the things it replaced? What are the edge cases? What was the original design intent?

I also read from/talk with older engineers (many of whom I have the privilege to work with) to understand what things were like during previous fads.

Ultimately, the problem is about staying marketable while giving fair due to our life outside of work (I prefer the term "unpaid responsibilities" to "leisure time"). I don't claim to have solved this problem, and I still have concerns about what the future holds, but my instinct is that staying on top of fads is a trap that I want to avoid.


I'm closing in on 54. Having an absolute blast in tech now because I bring 4 decades of coding, life, and a number of different jobs to what I do today. And I devote every morning, first thing, to reading and learning something new - Python, Azure databases, IPC in C#, C and SBCs, ZigBee. There is so much out there at a price point that makes learning painless and fun.

In life I stay healthy as a vegetarian and practice yoga. Look after the body and the mind will largely follow suit; feed the mind with challenges daily and you will notice that you get better over time at a rate the JavaScript kiddies can't comprehend.

Maybe this ageism is jealousy from the kiddies because you think leagues ahead of them, and also jealousy from old managers who can't do anything productive now that their body of knowledge is no longer useful? Food for thought, for someone. I won't be wasting any time thinking about it.


Well, if the author is almost 60, it probably won't matter which career he's in, he would likely suffer from some sort of ageism. It doesn't matter if you're a programmer, or in finance, or working at Costco, if you're getting close to 60, most people will be skeptical as to whether you can really work as well as a 25 year old.


Yes, there is ageism, but there is also some remnant of meritocracy in IT development and operations (though both the evolution of devops and enterprise agile + CI/CD tool culture will eventually kill that, imo).

The only way to remain in the IT game at the age of 45+ is to learn constantly and use your aggregate experience to determine what is good, bad and necessary. For most of the 20+ year IT veterans sitting in set-piece environments they enabled or abetted, technological advent and invention is the enemy.

All the musings on how great a person you really are outside of tech are band-aids on reality. If you make your living in technology: be better than the other guy or be useful to them. That's all there is. Otherwise your days are numbered.

We can ramble on about salad days and personal achievement but the younger guys snicker and say 'listen to this fossil' and do their thing. As you would have in their shoes.


There's a reason why ageism exists in the industry and it isn't because of some undeserved stereotype about old dogs not being able to learn new tricks.

1) Most business-critical projects have long lifetimes. There's a reason why banks are still running COBOL and mainframes, and why Java's continued promise of backward compatibility with every new release is so valuable to companies.

2) Maintaining legacy systems is a bitch. Nobody likes maintaining legacy systems.

3) Therefore no employer wants to hire somebody who writes code which almost immediately turns into legacy code, either because it's not tested, or not written using modern language features designed to make the language safer, etc.

4) Learning to write code in a modern fashion requires continued education.

5) Employers will not budget or pay for this continued education in a no-compete-clauses-are-illegal environment, where smart employees will take the training and run to another employer willing to pay for the benefit, while the original employer is the one that put in the legwork of investing in that employee.

6) Employees therefore need to spend significant time educating themselves on their own time. This is great for the minority who are computer geeks who treat it as a hobby and it's terrible for everyone else.

7) Most people will not spend personal time educating themselves, because they prefer to invest that time in friends and family. This is all the more true, not less true, after one's children are grown.

8) Therefore they slowly become unemployable as their skill set turns obsolete.

9) Therefore employers have a hard time finding older people who do have that combination of a modern skill set and decades of general industry experience. And it's for the same reason it was difficult to hire any programmers at all in the 90's, because the competent labor pool (then in general and now in the older age group) is basically limited to computer geeks.

10) Continued interviews with older people who did not bother to keep their skill set modern and honed create a stereotype that old people aren't "smart".

It's not a problem that some Chief Diversity Officer at some Big Four company can solve, because they're either going to literally fight human nature (people desiring quality time with their families) or they're going to adopt an affirmative action policy that'll only make the problem worse, as prefer-false-negative hiring policies set up to protect codebases from incompetence get overturned for what's essentially a political reason, breeding resentment.


This is an argument built on one stereotype/assumption after another.

1) Many business critical projects have relatively short lifetimes. The use of hyper-legacy (avoiding the use of the word 'ancient' here) systems has more to do with the stability of well-known domains like accounting than any other factor. Other systems are business critical but meant to solve a short-term need, eventually being replaced by a better or more well-informed implementation.

2) Don't presume to know what everyone else likes to do.

3) Your implication is that "old dogs" are going to write code which is immediately legacy: they don't write tests (which is by no means a guarantee that code won't become 'legacy'); they don't use modern language features; indeed, that code without the use of modern language features is itself legacy.

4) I won't argue with this one, but with the same assumption you make in #3.

5) No-compete clauses are only illegal in a few jurisdictions like CA (in the US anyway). "Smart" employees tend to stay with employers that have an ethos of investing in their employees. In any case, I used to teach corporate training classes, and I can assure you that there is a healthy market of employers who do this.

6) Yes, people of all ages need to spend a significant amount of time investing in themselves, including 20-somethings. This is not a new thing, and indeed, "old dogs" are better at sniffing out where that time is best spent.

The remaining points of your argument are conclusions built on these faulty premises.

The problem of ageism in tech is multifaceted to be sure, and we all know developers who fit into the stereotypes you promote here. But to say that they are the primary cause is part and parcel of that same ageism, and ignores:
* the role of capital in providing vast sums of money to inexperienced youth;
* the hubris of that same youth;
* the lack of diverse and inclusive cultural values within the US as a whole and Silicon Valley in particular;
* and far from the least, the social and cultural aspects of media in influencing stereotypes (Forbes' annual 30-under-30, anyone?).


> Therefore no employer wants to hire somebody who writes code which almost immediately turns into legacy code

But what is "legacy" or not is not to do with technology, but fashion. Let me give you an example: everyone rants on here about what awful languages PHP and JS are. Is COBOL really a worse language than either of those? If so why? What algorithms or data structures can't be implemented in it? What tooling doesn't exist?

The only reason COBOL is considered a legacy language is that it's unfashionable, and a large part of that - I'm not even kidding - is that the clothes COBOL programmers used to wear have fallen out of fashion. There's actually no reason you couldn't use it for any application you might want to write today, and it would probably be more productive to do so than with some modern languages...


COBOL has its limitations for sure, but I am an oddball because I love it. I took it for two semesters as part of my master's degree and am keeping an eye out for COBOL jobs by me. I would love to work in COBOL day in and day out.

The hardest thing about COBOL is the mainframe it has to run on. Without that access, I haven't done much coding outside of class.


"Learning to write code in a modern fashion requires continued education" - what does that even mean? It's like telling someone they need to learn to drive in a modern car. The car is modern, the driving isn't.


As someone who transitioned to tech from finance, I can tell you that this trend (preferring young workers to old) is not limited to tech. The fact is companies in general are going to prefer younger people for labor. Younger people are cheaper, tend to have fewer obligations outside of work, and are willing to put up with more on the job.

As someone getting older and more experienced, you can take this one of two ways. (i) You can try to "learn new tech" and "stay up to date" in an effort to compete with these younger workers or (ii) you can actually listen to the market. And what is the market telling you? Yes, younger workers are more valuable for the aforementioned reasons. But it's also telling you that by the time you are 40 or 50, you should be implementing your own ideas, not someone else's.


Oh cool, he made a webshop with an RPG backend. I last did that about 15 years ago, it looked just as awful, and just like him, I was so proud of my achievement at the time, that I couldn't see what an abomination it really was. It's a pity, that customer deserved better.

I totally believe age discrimination is real, and the Corgibytes author hits the nail on the head with:

> Only as Good as Your Last Two Years of Accomplishments

> Kent Beck has suggested that, with consistent use of pair programming, the capabilities of programmers don’t differ much after two years of experience.

Experience in our field rots away at an amazing rate. I don't think it's as short as 2 years, but if you're still regularly using techniques you mastered 10+ years ago, you're probably falling behind. I don't think lawyers, doctors or stock brokers have this problem.


I doubt that a lawyer, doctor, or (especially) investor could survive more than a year without constantly investing time in learning about their field.


Doctors now have to do 'continuing education' and many of them got pretty upset about that and / or were grandfathered out of the requirements.


Yeah, but what you've learned doesn't effectively rot away at the same rate. Old diseases don't go away, people don't develop new organs as they discard old ones.


Elon Musk is 45. Does that make him old? Doesn't seem to slow him down.

edit: just saying limits seem artificial, that's all


How much hands-on tech involvement does Musk actually have in his enterprises? I imagine his focus is on business, leaving the tech for the tech professionals that are hired by his businesses.


A LOT. The guy matched years of study with months of reading textbooks. His design inputs are almost always spot on, in the long run.


> How much hands-on tech involvement does Musk actually have in his enterprises?

From multiple interviews, Elon has stated that he spends the majority of his time on engineering and design problems, and very little time on business, PR.


PR BS and founder worship. If he wasn't spending time on PR he wouldn't be doing interviews.


> If he wasnt spending time on PR he wouldn't be doing interviews

There's a difference between spending minimal time and no time at all on business, PR aspects.

> founder worship

I only stated the facts. Where is the worship?


If by facts you mean "the picture that Musk's PR machine wants you to believe as facts", then yes. Unless you spend lots of time with him personally and can state that first-hand. Somehow, I don't believe in superhumans. Elon is undoubtedly smart and successful, but he is not Batman (or Superman, or whatever).


By facts, I mean what Elon Musk has stated himself.

The interview with Sam Altman in How to build the future is one instance where Elon states this. In the same interview Sam also states that when Elon took him for a tour of SpaceX, Elon talked about the engineering aspects in detail and had a surprisingly good understanding of how the components worked.

Another instance is in the recode interview I think.

And how often do you see Elon going around doing PR stuff? AFAIK not much.


So, in an interview, Elon talked about some engineering aspects with another entrepreneur, Sam, and Sam, who never did any rocketry, concluded that it showed a surprisingly good understanding of how the components worked. How the hell does Sam Altman know anything about rocketry?

That's exactly what I'm talking about. Elon certainly understands some general concepts, maybe even a decent amount of engineering (for a businessman). He is obviously an excellent businessman and, I'd say, an even better personal brand promoter. Everything beyond that is just PR, which is obviously excellent, since intelligent people fall for it.


Much like laws, ageism doesn't apply to billionaires.


I don't know. But in his case, how does it matter? He's at the top. No one can be "ageist" toward him. Clients or vendors maybe, but I doubt that at that level a company would decide not to hire Elon Musk's company or his minions because of Musk's age.


Capital and Labor are different beasts


Getting older in tech, most of the stuff I learned is obsolete: http://www.computerworld.com.au/article/184668/readers_throw...

I'm a legacy software, retrocomputing expert now I guess?

Most of the old commercial software has released free and open source versions of itself or someone wrote a FOSS clone or whatever to do the same things.

I am 48 years old now; in two years I'll be 50 and can apply for AARP and get better health insurance through them.

I ended up on disability, but been trying to learn new things and keep up with trends and patterns in the industry.


> between 2008 and 2010, I was training Java developers at Circuit City on Groovy and Grails. These folk were mostly late 20s and early 30s, and they were just fine sticking with good-old, write-everything-yourself, don’t-bother-with-frameworks, Java.

You're assuming Groovy and Grails are better to code in than Java and something like Spring. Grails began as a thin wrapper around Spring. Its business purpose was to chisel market share away from vanilla Spring so its backing company (G2One) would get bought by SpringSource, which eventually happened in late 2008.


While I believe there is ageism in tech: so what? I'm 56, and I've been programming professionally for 30+ years. Code is still code. There's not much new under the sun at a core level.

Consider this, many thousands of lives depend on my ability to write good, maintainable code everyday. I do Alarms, Telematics and 911 Systems.

So ageism: yeah, it exists. But for every 50 year old coder, there's 150 under 30 in the business. It's just the way it is. But I know, if I were on the hiring end of the game, I'd drop my dime on the old fart who's got a ton of REAL WORLD experience over some freshly minted grad every day of the week.


Out of the software developers I know, only a few percent are 40+, so my sample set isn't very reliable. But from what I've seen, the average (or at least median) skill set is better among younger people, and the best ones I've met are in their 30s.

My assumption is that there was a point where software/CS started booming and people started getting degrees in that field. Among the 40+ devs I know there's a higher percentage of career changers, and with a few exceptions those have been generally less skilled.


It's good advice to keep up to date and in shape, and it always has been, but unfortunately even that is often not sufficient.

Lately potential employers have been expressing surprise that I never went into management. I enjoy developing so was never interested---until I encountered a few truly incompetent bosses in the last few years and rethought my position. So I've read all the classics, such as MMM, Peopleware, etc... but found it is too late to be hired as a manager when you've never managed anyone.


One of the strongest team members at my workplace is an older guy: super smart, asks more questions than anyone else, thinks of the stuff the younger guys don't; basically, I would always want him on my team because he gets stuff done. I think he's probably in his mid 50s. (To be clear, I don't actually think mid 50s is old, but society sometimes suggests it is.)


32yr old here. This is interesting to me, coming from times of 28kbps internet where the dude with a huge beard was a tech god.

Reading this and reading the posts, I'm aware of what appears to be a trade-off between youth and age, in that the young can be cutting edge while the old tend to have wisdom, a smarter approach, and worldliness.

Seems shortsighted to think that young is better.


It's OK to get older in the tech industry as a whole, just not in spaces like Silly Valley and/or places that are trying to be such and draw that crowd.

The best place in tech for older folks is a govt. job, where age skews older and you may even be a minority amongst your fellow laid-back, no-drama, hard-working Indian co-workers. The pay is more than good too!


I read a study in the 90's comparing more experienced and less experienced multimedia programmers given identical tasks. In short, it concluded that the more experienced ones took longer to do the tasks, but did them to a higher level of quality.

That has always rung true to me, and I see that happening today.


I think the industry you choose matters. I have done work in the Medical and Transportation industries and the devs there seem to be older. I tend to gravitate towards development where the code simply has to work correctly. This means proven technologies are used as opposed to the new and shiny.


I get inspired by the Japanese shokunin work ethic; it is well described in the movie Jiro Dreams of Sushi. An explanation can be seen here: https://www.youtube.com/watch?v=Q78xvcnmIMw


"Uncle, What do you think."

It's tech, this age thing.

I was forced out of tech by the invention/adoption of the cell phone; age is worshiped in my jobs today, like Indians calling seniors "uncle" to show respect.

Age is my shield, my platform, my integrity and my perceived knowledge.

It's tech


I've found out how to sidestep ageism - work for yourself. Build it in whatever 'old' language you want, bring value, and profit.

Now I get to sit on the porch and yell at the youngin's "Get off my lawn!"


> After years of scoffing at talk of prejudice in the information technology field...

It seems kind of ridiculous that white men won't believe in hiring prejudice/discrimination until it directly affects them.


This cuts both ways. As a young founder and manager, I must be extremely conscious of hiding my age. Without knowing the number, my team is fine. Once they know all hell breaks loose, at least for a while.


Fantastic article. I am lucky enough to work with Don and he is amazing. Probably one of the most simultaneously brave and curious people I've ever met.


> I’m only as good as what I’ve accomplished in the last two years

This implies no more salary increases after two years?


No, not at all, just that interviewers are totally uninterested in anything older than about that date. I've been in the industry for 16 years, the last 9 on my own as a consultant and have been kicking the tires on some full-time jobs in the last couple of months and it's definitely eye-opening: they really only want to talk about what I've just worked on and trying to use examples from anywhere else in those 16 years is met with suspicion. Some of the hardest problems I solved were when I was limited to ASP and Access. The fact I didn't build the thing in Mongo and Clojure doesn't mean I can't use those technologies. Some of it is the impedance between "technical" recruiters who are really just looking for you to say certain keywords and some of it is the age gap between older devs and the developers who are typically interviewing them.

I have a resume that's good enough to get me in the door at a few places you've heard of this time around but it was also turned down flat without more than a recruiter call at other places where my current experience exactly matched the tech they were asking for. I can't help wondering if they saw "16 years of experience" and either thought "too old" or "too expensive". I have considered doing a little A/B testing on that.


It just struck me that Zuckerberg is starting to get too old for himself.


And in a few years he will be touting the value of people right around his own age. It's human nature and one of the most valuable things about getting older (while continuing to learn) is to be able to see that kind of thing and value insight from anyone, not just the people who look exactly like you.

Harder to learn if you build up an empire of people who look exactly like you.


Life is ageist.


Great post; you should focus more on consulting jobs.


My advice to all is to keep an eye on the ball and understand where you are and where you expect to be every five years and have a plan. If you plan on being just an expert or individual contributor when you are 40 or 50, then expect to have to compete with those in their 20s. Not too easy when a 25 year old has fewer responsibilities at home (like teen-aged children) and can spend their spare time learning the latest fringe technology to make it more mainstream. Also not too easy to fit in culturally unless you have the developmental maturity of a 25 year old. (See my note about The Stakes at the bottom.) Adults with no responsibilities can drop acid and go to Burning Man without consequences. Responsible adults cannot without the risks they take impacting others. (in response to a poster's comment) That's what separates the men from the boys, to coin a phrase. No offense to ladies or others.

What I am not hearing in all of these discussions is talk of developing leadership fundamentals. They apply not just to one's job/career but to the individual and all aspects of their life.

There are 5 levels of leadership:

I. Individual Contributor: Self-leadership. Responsible for producing work and getting along with others.

II. Expert/Manager

Expert: Best at what they do. Work on more complex projects. Display a special talent. Design a plan for new products. Further develop their craft. Innovate on projects. Demonstrate readiness to tackle more challenges.

Manager: Managers are tactical, focused on the short term. Lead individual contributors and experts. Develop staff. Focus on improving the weaknesses that keep them from being more than an individual contributor or expert. Navigate organizational structures. Maximize the talent of the team. Think strategically about how the team contributes to organization goals.

III. Leader of Leaders: Leaders are strategic, focused on the long term. Focus shifts to training level IIs on their management weaknesses. Training and developing (mentoring) experts and managers. The role is critical to the success of an organization: poor managers have a huge and damaging impact because they leave high turnover and disengagement as well as low morale and productivity in their wake. Refined communication skills up and down the organization, acting as a reliable conduit for information to flow both ways. Develop business acumen. Develop organizational strategy. Develop new leadership opportunities.

IV. Leader of Functions/Divisions: Maximize the contributions of all groups within the function/division. Strategize the development of the function for the future of the whole organization. Build a competitive strategy. Ensure long-term growth. Mentor and engage direct reports. Build key relationships outside the organization. Deepen their intimate knowledge of other functions. Attune to industry and market-shaping factors (sector acumen).

V. Leader of the Organization (CEO): Manages all functional leaders. Sets the vision and strategy. Ensures future success. Builds a team of differing strengths. Empowers functional/division leaders. Creates a motivating culture. Shares the vision of the future. Positions the organization to be at the front of trends.

As you move up, 3 things change:
- The scope of your view
- The stakes/impact of your decisions
- The proportion of management and leadership

What skills do you need to maximize your potential? What skills do you need to develop for the next level?

Leveling up is growing up. If you don't like or want to be a manager or leader, figure out why not, starting with understanding your emotions and managing your stress and anxiety. This is usually what stifles one's development.


[flagged]


We detached this subthread from https://news.ycombinator.com/item?id=13137878 and marked it off-topic.


Oh, give it a rest with this shit. Literally every thread where Javascript is mentioned, even in passing like this one, someone feels the need to pop up with the 'Javascript is terrible' meme. I get it, lots of people don't like Javascript, but this comment adds absolutely nothing to the discussion and just drives people away from HN since the same negativity and tired one-liner gets churned out over and over again.

If you absolutely must tell the world that you don't like JS, at least be interesting or actually list your reasons so there's something to discuss.


No.

JavaScript, the core language, is OK. Sure, there are warts, like the behavior of 'this', the lack of a good module system, and weird type coercions.

Most of the bad rap JavaScript gets is because it lives in the browser, and the DOM, with all the cross browser issues that come with that. It's not all JavaScript's fault. In fact, most of it is definitely not JavaScript's fault.


At least as far as I'm concerned, the biggest wart of all is variables that are global by default. That just makes no sense at all...


But this was fixed with strict mode; this is like criticising Java for not having generics, even though generics were added ages ago.

You can still assign/use global variables in the browser, but like the parent says, this is the browser's fault - having to accept old and badly-written code for compatibility reasons.


My biggest problem with JavaScript is that I never can seem to wrap my head around the multiple null/undefined type of values and how to correctly test for them, as well as type introspection. Or, more appropriately, my problem with JavaScript is that it makes that confusing, and I use it just infrequently enough that I forget what I learned last time regarding that. Of course, that's not entirely JavaScript's fault, it just doesn't make it very easy.


I think the majority of complaints about JavaScript are from people who are forced to use it infrequently, and don't quite remember how to use it.


I don't really use null at all. I either just check falsiness, or if there's some chance of the variable being a 0 I check for typeof == "undefined".


[flagged]


The guidelines ask us not to complain about downvotes in the first place, but we certainly can't go off like this.

https://news.ycombinator.com/newsguidelines.html


Perhaps you're being too sensitive. The OP is complaining that older engineers have trouble getting a job. I'm suggesting that rusty engineers have trouble, and age may not be a factor. That somebody didn't want to hear it and downvoted me as a result wasn't particularly constructive.


The problem is, if you point out a problem and are negative about it, you'll more likely be downvoted for it, unless you give a solution to the problem and find a way to work it out.

For example, pair up a greybeard with a mentor who will help him/her learn the new technologies; there ought to be hackathons for greybeards or something.


[flagged]


In the absence of other information, I'd chalk that up more to the client. This is a shopping site for the US arm of a German manufacturer. Cutting-edge design was probably not of high importance to them. Also, I don't see any claims by this person that they were a designer.

What your criticism reads like is a pretty good example of how things like ageism, sexism, or racism sometimes play out. Take something, find some fault with it, exaggerate its significance, and ascribe the source of that fault to an immutable characteristic of the person involved.


> In the absence of other information, I'd chalk that up more to the client.

That is my first thought as well.

Spending time optimizing it for performance may not be worthwhile for the client. Likewise, hosting it in a way that can scale to several times the traffic may not be worthwhile.

This is something you learn with experience. If you deliver a site that's super-fast, deployed on the latest serverless architecture with the newest UI framework released this week, but you didn't have time to put in half the catalog, the client is not going to be happy at all.

There are also clients that insist on certain specific things that may not fit with your artistic vision (specific graphics, colors, logo, text, layouts, icons, etc). You have to balance how hard you push back with how much you want to risk pissing off and/or losing the client.


There is no truth to what Zuckerberg said - the guy has said many idiotic things in his life, and this one ranks near the top. You're picking the author's website as an example, but it's a single datapoint - not representative of anything. Many younger people code much worse things, and many older people code much better things.

I'm slowly reaching 30 (agh), and I refuse to work in any team that doesn't have at least some members over 40. In all the teams I've worked in, having people on the team (not necessarily in management) who have been around for more than a few decades has always been extremely valuable. My best managers have been a man in his 50s and a woman in her 60s. My worst managers and colleagues have consistently been too young, too inexperienced, and too cocky.


It's not what he said so much as that he said it out loud to a room full of people and no one batted an eyelid.

Imagine the shitstorm if he had said "whites are just smarter" - the lack thereof demonstrates that the ageism problem is very real (probably the only form of discrimination left).


If I were to meet Zuckerberg I'd quiz him on that quote and ask him: did you A/B test that? If he did, boy would he be in for a surprise ;<).


> but it's a single datapoint

Thanks. But isn't your experience with very old managers also a single datapoint, or at best, a dozen datapoints?


I have a few hundred data points of coworkers in my career at this point (maybe even low thousands), and the number of "out of touch old guys" I've encountered pales in comparison to the number of "arrogant clueless young guys" I've had to bear. (Interestingly enough, I've had many good female colleagues, a few great ones, and no terrible ones.)


>I've had many good female colleagues, a few great ones, and no terrible ones

Pretty much the same here. It's a disappointing confirmation of the idea that, to survive, a woman has to be twice as good.


I didn't downvote you, but your claims (and Zuckerberg's) about the intelligence and aptitude of us old folks (I'm not even 40 yet) are just absurd.

As regards UI, usability and the like, your comment is highly subjective and entirely unsubstantiated.

And your comment in your third edit is just as absurd as your notions about old folks: the site's author(s) likely never intended the site to be part of, e.g., a HN surge. A good number of the application backends I've written were never meant to be exposed to that kind of surge, and I didn't design these backends to handle that kind of surge because it wasn't necessary. That you think it is demonstrates more about your lack of understanding of engineering fundamentals than it does about old folks' ability to learn and adapt.


> but your claims (and Zuckerberg's) about the intelligence and aptitude of us old folks (I'm not even 40 yet) are just absurd.

I never made this claim. OP's post makes this claim.


I don't think we should start identifying the look-and-feel of websites with programming skill.

One thing I've noticed whilst becoming older as a programmer is what pleasure I take in certain sorts of what people call "rookie" errors.


> is what pleasure I take in certain sort of what people call "rookie" errors.

Can you provide an example?


Such examples all have troll power so I might need a dispensation.


Dispensation offered.


Ok! Well some that come to mind are (a) short variable names; (b) not bothering to encapsulate state or lock things down into modules; and (c) tables for layout.

I don't do these things because they're 'rookie mistakes', of course, just like I don't order whipped cream for dessert but might like it as an extra. I enjoy the feeling of lightness that comes with naively doing what's easy, as opposed to the weight of preconceptions that in practice impede the work.


My thoughts:

"Current tech trends in terms of UI" are generally antithetical to usability. Speed is an important aspect of usability, and web sites and mobile apps are as slow and unresponsive as they've ever been.

Like security, performance is a holistic concern; it cuts across layers of abstraction (your "stack"). Because even good software is so layered these days, a younger programmer is less likely to have visited enough areas of the stack to form good mental models about performance.

On looking at the site you point out, I think it could use some CSS changes, but that's usually in the domain of design and not engineering.


> Speed is an important aspect of usability.

Agreed. But I couldn't bring myself to get past the ugly UI and on to performance testing, bundling/minification, async calls, etc.


Well the site (kettlerusa.com) has crashed now...


[flagged]


Please stop complaining about downvotes—this violates the guidelines—and please treat the community with civility by not ascribing to their actions motivations of your own construction.

https://news.ycombinator.com/newsguidelines.html


You size the hosting for a site for whatever traffic you expect, not Google scale - doing anything else is a waste of money/resources.


Please post your dynamic site that can survive getting to the top of HN.


Does it escape you that you're on HackerNews? It's not the pinnacle of design by any stretch of the imagination, yet here we are...


Function over form.


What is the objective measure of "ugliness"? I found the site to be well-organized and intuitive. In less than ten seconds I understood what kind of products they were selling, how to buy them, and how to find out more about the company. As a _user_ the site was _effective_. Oh, and free shipping for orders over $40! Knowing this up front (because my attention was drawn to it) could very well influence my buying decision.


It's a shopping site for a manufacturing concern, how should it look? It seems functional, I can find products on it, what else does it need?


Looks fine to me. Ikea's site looks similar (to my eyes at least), btw. And nothing like the 90's whitehouse site.

There are a lot of niche sites that get the job done without following all the latest "trends". More power to them.

Function over form FTW.


The site looks fine. I can easily pay $0-$10 for a super nice HTML5 template that looks great because it was made by a professional designer, then tweak it a little to make it look somewhat unique, in a tenth of the time. Eye candy sells, but it is cheap. Most people would be better served concentrating on back-end abilities; it's generally a much more valuable skill set.


> I understand I will be in your shoes in 4 decades

So you're 17?


Google's landing page.

Your argument is invalid.



