Five Pervasive Myths About Older Software Developers (lessonsoffailure.com)
81 points by gacba on Feb 23, 2010 | 66 comments



What's hilarious about this myth is that I learned to program in NeXTStep using Objective-C well over a decade ago and am enjoying writing iPhone apps. But I hear younger programmers bitch about how hard it is to write code in Objective-C, manage memory, etc.


But I hear younger programmers bitch about how hard it is to write code in Objective-C, manage memory, etc.

That's because we like to create applications, not program computers. Memory management is programming for the sake of programming; it doesn't do anything user-visible. The computer can do it automatically, so why pass it off to the human?

(Also, C does have automatic memory management. Ever "free" anything you allocate on the stack? I thought not.)
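
A minimal sketch of that stack-versus-heap point, in C++ purely for illustration (the function names are made up):

    #include <cstdlib>

    // Stack allocation: reclaimed automatically when the frame pops.
    void stack_version() {
        int scratch[64];              // lives in this function's stack frame
        scratch[0] = 42;
    }                                 // no free() needed here

    // Heap allocation: a human has to remember the free().
    void heap_version() {
        int *scratch = (int *)std::malloc(64 * sizeof(int));
        if (scratch) {
            scratch[0] = 42;
            std::free(scratch);       // forget this line and you leak
        }
    }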


There are a lot of things that you depend on that aren't "user-visible" and require what you call "programming for the sake of programming." In fact, I'd go so far as to say it's good that some of us love that stuff, otherwise there would be no one to write and/or fix the fancy language features, libraries, and embedded systems young whippersnappers like you depend on.

By the way, I'm actually fairly young myself (23), but I've always loved the down and dirty, close-to-the-metal details of programming. And I also have the soul of a 50 year old (a la: "I need a lawn so I can yell at kids to stay off it." http://xkcd.com/479/).


Uh, Q.E.D.?


"Old people like unnecessarily performing the same exact task a number of times, and enjoy spending time with obscure debugging tools when they forget to do it right in one place?"

OK, sure, I agree.


Comments like this show immaturity. Just because a platform handles memory management doesn't mean you can create a million objects. If you program in C++ or C you come to appreciate these aspects, and that helps you write better code. Satellites designed in the 1970s are still travelling to the far edges of the solar system even though their processing power and memory are less than your cell phone's. Designing software without considering these aspects would be a disaster; witness the satellites designed today that crash while orbiting other planets.


I agree.

> Just because a platform handles memory management doesn't mean you can create a million objects.

Yes, you still have to watch out; it's just that your example isn't extreme enough. A million objects doesn't have to be very much. For example, a Haskell program compiled with GHC can churn through a gigabyte of memory in a second (see http://blog.interlinked.org/tutorials/haskell_laziness.html), and that isn't a bad thing, as long as the memory is reclaimed immediately and you never actually use the whole gigabyte at once.


OK, but I feel like I'm not reading the same comments you (and the voters) are.

Why is allocating your own memory in Objective-C "better" than letting, say, GHC, do it for you? I may be a young whippersnapper standing on your lawn, but my programs run as fast or faster and are more reliable. And it takes less time to write them.


I'm a young whippersnapper standing on the lawn too, but there's something to be said for knowing the underlying workings of code before using it.

Here's the disconnect I'm seeing: Allocating your own memory in Objective-C is not necessarily better. But all other things equal, a programmer that understands how GHC manages memory is better than one that does not. You're arguing the utility of this knowledge, as it pertains to your usual use case, is minimal. This is not the same as saying that the utility of this knowledge is minimal.


But all other things equal, a programmer that understands how GHC manages memory is better than one that does not.

Sure, but I never said this. The original post, now many levels up, said "kids bitch about how hard it is to manage memory". It is hard. That's why people bitch about it, and don't want to do it.

I say that it's conceptually very simple. But in practice, it's very difficult to write that code yourself in all the right places. (It doesn't sound difficult, but most C programs either leak memory or use memory they didn't allocate, so clearly it is difficult.) There is a reason why C++ has things like the Boost pointer library (and even auto_ptr). It's because memory management is for computers to do, not for humans to do.
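
To make that concrete, here is roughly what those smart pointers buy you, sketched with the old std::auto_ptr mentioned above (Widget is a made-up type):

    #include <memory>

    struct Widget { int value; };

    // With a raw pointer, every return path and exception needs its own
    // delete; with a scoped owner, the destructor takes care of it.
    void use_widget() {
        std::auto_ptr<Widget> w(new Widget);  // auto_ptr owns the allocation
        w->value = 42;                        // ... use the object ...
    }                                         // deleted automatically here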

I think people have trouble realizing that a human can know both "high-level" and "low-level" things, but that a programmer needs to stick to one level or the other -- don't write low-level concepts (memory allocators) in your high-level application program. The same person can write both parts, but the programmer writing the high-level app shouldn't be worried about the low-level details ("of course it works"). When you mix the levels of abstraction, you write bad code.

(I wrote a toy language once. I found that writing a garbage collector was much easier to get right than manually managing my own memory. When writing the collector, the details of memory allocation were all I needed to think about. When writing the high-level application code on top, all I had to think about was my high-level application. That's the whole point of abstraction, and it's what C-level languages fail to provide.)
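
For anyone who hasn't written one, the core of such a toy collector really is small. A rough mark-and-sweep sketch in C++ (made-up types, not the toy language described above):

    #include <cstddef>
    #include <vector>

    // Every object knows what it points to; the collector does the rest.
    struct Obj {
        bool marked;
        std::vector<Obj*> children;
        Obj() : marked(false) {}
    };

    std::vector<Obj*> heap;   // every object ever allocated
    std::vector<Obj*> roots;  // objects the program can still reach

    void mark(Obj *o) {
        if (o == 0 || o->marked) return;
        o->marked = true;
        for (std::size_t i = 0; i < o->children.size(); ++i)
            mark(o->children[i]);                  // mark: trace from the roots
    }

    void collect() {
        for (std::size_t i = 0; i < roots.size(); ++i)
            mark(roots[i]);
        std::vector<Obj*> live;
        for (std::size_t i = 0; i < heap.size(); ++i) {  // sweep: free the unmarked
            if (heap[i]->marked) { heap[i]->marked = false; live.push_back(heap[i]); }
            else delete heap[i];
        }
        heap.swap(live);
    }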


I don't think people are arguing against GC being good. It's just that Objective-C did not have GC until very recently, and that's the way it is.

Having used lots of languages and frameworks I think that Cocoa/UIKit is very elegant despite the manual memory management.

It might even be true that most of the big Cocoa applications will continue to use manual memory management.


I'm a young guy and I find iPhone memory management very easy. The built-in reference counting system for objects makes it fairly nice to work with compared to some ad hoc system dealing with malloc'ed memory here and there. You might get the occasional memory error, but usually it means your design was sloppy somewhere [usually involving arrays, the bad part of Objective-C's design] and would probably have eaten up memory in a garbage-collected system anyway.
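
The retain/release idea itself is tiny; a rough sketch of the mechanism in C++ (just the concept, not Apple's actual API):

    // Manual reference counting in miniature: whoever retains must release.
    struct RefCounted {
        RefCounted() : refcount(1) {}            // the creator owns one reference
        void retain()  { ++refcount; }           // claim shared ownership
        void release() { if (--refcount == 0) delete this; }  // give it back
        virtual ~RefCounted() {}
    private:
        int refcount;
    };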


I can't help but wonder who these "younger programmers" that don't know memory management are. To graduate from my school, you must know it very well. In fact, a CS major can't graduate without passing a class that requires you to write a malloc implementation yourself. A good number of CS majors go on to take Operating Systems, whose core project (writing a kernel from scratch) requires that you interface with the virtual memory capabilities of the x86 processor (involving frame allocations, setting up the page table, etc.).
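
For anyone who skipped that coursework, the smallest possible version of such a project looks roughly like this (a toy bump allocator in C++; a real assignment adds free lists, coalescing, and thread safety):

    #include <cstddef>

    // The simplest possible "malloc": hand out slices of a fixed arena.
    static unsigned char arena[1 << 20];       // 1 MB of backing storage
    static std::size_t   next_free = 0;        // offset of the next free byte

    void *toy_malloc(std::size_t size) {
        size = (size + 7) & ~std::size_t(7);   // round up to 8-byte alignment
        if (next_free + size > sizeof arena)
            return 0;                          // out of memory
        void *p = arena + next_free;
        next_free += size;
        return p;
    }

    void toy_free(void *) {}                   // a bump allocator never reuses memory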

Maybe the ageism thing is more of a lack of understanding on both sides than a directed bias against only the older people?

I'm not trying to say that young people won't end up being better programmers in 10 to 20 years; if that weren't the case, there'd be no point in pursuing it as a career. However, to assume that comparative ignorance at 20 is the same thing as global ignorance is unfair.


Not every programmer majored in CS. In fact, these days unless you majored in CS or EE, you probably won't get any exposure to lower level programming unless you learn it on your own. For people who get into programming via other domains (math, stats, design, etc) there's generally just not a very compelling reason to learn these things because you're going to be doing 99% of your programming in a higher level language like Matlab, SPSS, Python, etc, where memory management is largely abstracted away.


My argument isn't for the relevance of memory management or that everyone is aware of it, just that a lot of junior people are aware of it, so it's unfair to characterize them all as "bitching" about the hard stuff.


I don't know what makes you think that all CS programs are like this. My operating systems course did not require me to write a kernel. In fact, there was only one class that required you to work in a language with manual memory management (C++). Granted, my department, at least at the time, had a relatively strong functional bent: before you graduated, you would have worked in Scheme and Haskell. Most people ended up taking other classes that forced them to use C or more C++, but that wasn't required. I'm a Vassar College alum, for the record.


It's always a red flag for me when an engineering organization of statistically significant size (which does exclude YC start-ups, so don't view this as a judgment) has no older engineers. It isn't just the age discrimination angle. It says something about technological identity and mission: they aren't solving a problem that has a barrier to entry (or they aren't going to be able to get past that barrier).

As someone pointed out in the "when an engineer goes to Google, a start-up dies" thread, there's great value in having people with vast technical knowledge and experience: they'll be able to bring new concepts, not merely new technologies, to the table.

Being a founder is a different matter (the risk factor is much more significant once there's a family), but I don't see a reason to not hire older engineers (all else being equal). Disclaimer: I'm <30 myself, but benefited tremendously (earlier in my career) from contact with older engineers.


I'm not sure how to feel about this. In 1995 I was hacking scripting languages and MySQL... and that's what I'm doing in 2010. Hardly anything of substance has changed.

The one thing that's different is my attitude.

Sometimes I feel that I know too much about programming to be employable as a programmer any more. So much of our industry is about foolish waste. Half of all projects fail. Most of the other ones are rotten ideas that make life worse for everyone. Few managers (or markets) really have the patience to build good software, or the maturity to listen to advice about what might be better.

When I was in my 20s, I was dumb enough to commit crazy overtime hours to making projects work, even if they were ultimately doomed and stupid. And this served me well in the job market.

In this environment, experience may not be an asset. It seems to me that the ideal programmer, in an economic sense, is someone who obediently writes unmaintainable code quickly and is unaware of the future pain they are creating for themselves and their customers, or is unaware that all their work is likely to be forgotten in a few months.


> In 1995 I was hacking scripting languages and MySQL... and that's what I'm doing in 2010

I think that's the problem (I'm also not a technology bigot: you can do s/scripting languages/J2EE/ and s/MySQL/Oracle/). Doing simple things may pay well at some point, but there's no technical career growth path.

There's also no security: there's no experience/education/intelligence-based barrier to entry. In the end, there's nothing to differentiate a 40 year old CRUD-screen developer from an 18 year old one. You can't say the same thing about a search relevance expert (for the education/domain-knowledge angle) or a kernel hacker.


For what it's worth, personally, I'm doing a lot more than simple CRUD. I'm lucky enough to have worked at larger and larger scale with every job. And the latest contract is actually kind of nifty. I picked it specifically because it has a sort of lasting impact, and the code will be open source and of use to other projects.

That said, 90% of the jobs out there do seem as futile as I described.

And is anybody making their living doing CRUD from scratch any more? I mean, not counting corporate Java programmers? (rimshot)


> For what it's worth, personally, I'm doing a lot more than simple CRUD. I'm lucky enough to have worked at larger and larger scale with every job.

Then wouldn't this invalidate your comment about doing nothing different between 1995 and 2010?

> And is anybody making their living doing CRUD from scratch any more? I mean, not counting corporate Java programmers? (rimshot)

Plenty do, I know lots taking high pay contracts doing nothing but CRUD. Many start-ups exist that do nothing beyond CRUD in PHP.

I have nothing against scripting languages (I tried to make that clear by including J2EE/Oracle in the same category as LAMP). When you have to do CRUD, at least for webapps, scripting languages are many times less painful than Java and C++. The "Hello World" chapter in the Hibernate book is sixty pages; compare that with the perldoc for DBIx::Class (http://search.cpan.org/dist/DBIx-Class/lib/DBIx/Class.pm). Not to mention Hibernate is actually considered "lightweight" compared to EJB and other enterprisey technologies (which luckily I've never been exposed to).


"... I think that's the problem (I'm also not a technology bigot: you can do s/scripting languages/J2EE/ and s/MySQL/Oracle/). Doing simple things may pay well at some point, but there's no technical career growth path. ..."

Intelligent comment. Does it break down when you do startups? In all the products I've worked on you start off doing simple CRUD but pretty quickly move into doing other deeper things.


I think what he means is if you just stay with doing simple CRUD. I think the vast majority of programmers do CRUD or slap GUIs on other people's code as a way to start (I know there are exceptions, but they are exceptions).

What differentiates those that grow from those that don't is whether they go deeper or just keep doing more CRUD and more GUIs on other people's deeper code.


"... What differentiates those that grow from those that don't is whether they go deeper ..."

Good point.


Depends on how you think of CRUD.

CRUD is actually just what were called 'expert systems' in the 80s. Once you realize that, you can see that it is actually the most interesting type of code.

The real problem with programmers is that they don't specialize in a specific type of application and become domain experts in it. Choose a field, such as accounting, or HR, or finance, and become an expert in it throughout your career, so that you can develop the most featureful CRUD application of that type in the fastest time (or base a startup around it).

Knowing Java, PHP, SQL, etc. is like an accountant claiming he just knows bookkeeping. A really valuable accountant is someone who might be a specialist in, for example, taxes for medium-sized manufacturing firms in California.


"CRUD is actually just what were called 'expert systems' in the 80s."

No. CRUD was a programming style that arose from relational database technology. CRUD = (Create, Read, Update, Delete) corresponds to the SQL INSERT, SELECT, UPDATE, DELETE statements. CRUD was popularized by Oracle; for example, Oracle's program generators IIRC used the acronym CRUD as part of their specifications.

"Expert systems" in contrast, were a specific outgrowth of AI technology in the 70's and '80's and were defined loosely as systems that mimicked human expertise.

There is no necessary conceptual overlap in the two terms.


That's a simplistic understanding that ignores the actual technology. Expert systems were nothing more than logic programming dressed up to seem all "AI". Prolog is backward chaining, while the CLIPS expert system shell is forward chaining. Today they are referred to as business rules engines, which again can be forward or backward chaining. SQL databases are also in fact logic programming: basically Prolog without recursion and iteration, with PL/SQL and T-SQL added to make them Turing complete. With triggers they are forward chaining too. The main difference between CRUD and expert systems is essentially that CRUD is multi-user and ever-changing, and so is actually a better version of the classical expert system.


I'd disagree; the modern-day equivalent of expert systems would be the ersatz "Semantic Web" (the idea of intelligent Internet applications, including, but not limited to, RDF/OWL/reasoners and "business logic" engines such as JBoss Drools).


My comment above explains why SQL is pretty much the same as logic programming and business rule engines. The semantic web is just a way of saying that everyone should be using a "universal database schema", and once you think of it in those terms it's a pretty silly idea: there isn't a remote chance that will happen.


There is a time and a place for one-off throwaway code, and also for extremely fault-tolerant provably correct code. Most stuff falls somewhere in between.

If you are experienced you can determine what is necessary and make good recommendations accordingly. If you are good at what you do, you should be able to get a job working with other people who are good enough at what they do to listen to you.

Maybe you'll be involved with a few projects that fail for various reasons, but I think it's possible to navigate the morass of failure and come out with a good track record at the end. Your production code may not last a long time, but if it served its purpose at the time then you should be proud.


Many people start in coding, but realize they are bad at it and look for opportunities to move into other areas (e.g. management). When I worked for large companies, many of my managers would list programming among their early roles. You could tell, or they would admit, that they were no good at it, didn't enjoy it, and got out as soon as they could.

The ones that are left are more likely to be good, simply because the poor ones with better opportunities have left.

Edit: "better opportunities" from their point of view, of course.


For the older crowd on HN: How much of ageism is real, and how much is in your head?

I only ask because all the places I've worked at have had a lot of older programmers, typically ranging from their mid 30s up to their mid 50s. Then of course there's the slew of not-quite-programmers-anymore who still make tech decisions (managers).

Then there's the very frequently uttered "Well he doesn't really have enough experience but we need to fill this position ASAP" and "He worked for Corp XYZ for 5 years?! Get him now!"

I'm not saying it's tough to be a younger programmer because it's not, but strictly IME (internet companies) older devs are a rare and valued commodity.

Now that I think about it, could it be because most of the senior devs don't have much web-related experience, due to it being relatively new?


I've yet to notice anything and I'm pretty old, but I also look much younger than I am, so I'm probably not a good data point. I think attitude plays a huge role here. I don't act like I know it all or have nothing to learn, because I'm relatively new to development (officially), having done QA before that. If you're curious and willing to learn, it's unlikely that you'll be written off as inflexible and closed-minded. (AKA, "age is a state of mind.")

Regarding the article, I wish it had some actual empirical evidence to back its claims.


It's both. Ironically, the more progressive places I know of to work that do pretty cutting edge stuff are more likely to hire older programmers with demonstrable experience.


If the older programmers are discriminated against, why is it that almost any job posting I see has minimum 10 years experience as a requirement?


Because you must be between 28 and 32.


The Management Reality Distortion Field is not required to be consistent.


Age-related decline is, according to my observations, caused by declining interest in the nuts and bolts of programming. When people lose interest, they start relying on high-level abstractions even when those abstractions are inaccurate or inadequate. Or they rely on concepts they've learned in the past, without learning what's new about the new technology. It looks exactly like intellectual laziness, but "laziness" is not the right word, because it is an unavoidable consequence of boredom.

Losing interest in technology is like tires losing their grip on the road. You keep going in the same direction, no matter which way you point the wheels.

Often age-related decline is masked by working in a static technical environment that a person has already mastered. It's like sliding uncontrollably down a straight road. Then, when the person needs a new job, they find that they haven't learned anything new in five years, are hopelessly out of date, and can't muster enough interest to learn anything new. The road finally curved, and they're in the ditch.

Personally, I've come to value my ability to be interested in the technical details of computer systems. It's embarrassing sometimes, but it's the key to my livelihood. (I've gone from being ashamed of it to occasionally wishing I had more of it.) I know I can't really be satisfied without an aspiration, or at least a connection, to grander things (fame, the mysteries of the universe, or boatloads of cash, depending on who you ask) but now I treat my geekiness as an asset instead of a distraction.


Experience is of course a double edged sword. It helps good judgement in some cases but does the exact opposite in others as the underlying dynamics change.

It's interesting that some patterns do not seem to change very quickly though. I'm reading "This Time Is Different: Eight Centuries of Financial Folly" right now. It's a rather dry empirical study of debt crises. The conclusion is that these crises have certain patterns that recur again and again, but each time there are people pointing at new factors, convincing themselves that this time is different and loads of debt are OK.

So it would appear that a minimum of eight centuries on the job experience would be appropriate for any banker.

But even if, as in this case, experience would lead to the right conclusions, I would argue that sometimes it's folly that actually creates value and innovation, even if 3/4 of it is later destroyed in a crisis. The dot com bubble is a prime example of that.

The real myth is that older developers work based on experience. Sometimes experience is just a convenient excuse for laziness. Younger developers find other excuses for that.

I have 20 years of programming experience but I'm still struggling with the urge to jump on each and every new fad that's out there, at least when it comes to things like programming languages and paradigms. What I can do way better than any recent college grad is explain in a very reasoned, professional and well-informed way why it is absolutely critical to use that fancy new language or paradigm now ;-)


I'm pushing 50, and doing well. While I can see getting pigeonholed into the embedded system niche, I feel that I can still learn new stuff.

My father-in-law was hacking C until he retired at 67.

Keep reading, learn something really new every year. Best continuing education I've found is access to the ACM journals online.


Above 40, good programmers probably are rare, but that doesn't have much to do with age; it's more of a generational difference. Forty and below is roughly the gamer generation, starting with those who grew up with the first popular home computers (VIC-20/C64/ZX Spectrum).

Someone who was able to write C64 assembler games/demos 25 years ago will still be a great programmer today, even if he hasn't touched a computer since then. The basics haven't changed that much; nothing that cannot be learned within a few weeks.

Older programmers are probably less likely to jump on every new hype. For them it's just the nth way to do the same thing, only without the libraries and tools they've already collected or written themselves. That could be a disadvantage when looking for a new job.


I agree, but there is more to it. A 50 year old programmer in 1995 likely finished their formal education in the late sixties. What are the odds that their education was actually in a software development related field? As an industry and source of employment, software development grew a lot faster than sources of relevant education. In the recent past a lot of the ageism may have been due to the older generation of developers just being the guy in the office who could program. I think the ageism will wane a bit as more and more of the older programmers have the same foundation as the kids just coming out of school.


The article is not very accurate. Age and experience don't necessarily correlate. There are a lot of programmers in their mid-twenties with more than 10 years of experience. On the other hand, a lot of the 40+ workers have worked with only one kind of technology the whole time, or were promoted into management positions 10 years ago and know shit about what's going on today. Don't get me wrong, older developers with the same passion for computers as today's kids are extremely valuable. But they are also very rare.


Exactly: There are people who have repeated the same year of experience thirty times in a row ...


I'm young, and I have to say that I learned the most from, and often most enjoyed working with, the older programmers. Some of them were the best developers I've known, able to work magic with the keyboard.

All I can say is that I'd have no problem hiring an older developer if I was in a position to, and I have trouble understanding ageism. As always, you do have to be careful to pick the motivated ones, but this isn't a problem that varies significantly across age groups.


It's not really age discrimination per se but a lack of adequate compensation. It's known that good developers are 10x more productive than average ones. Older developers are usually pretty good, since they have survived this game for so long and have all the experience that comes with it. But a developer with 15 or 20 years of experience is rarely making more than 2x or 3x what someone who just started makes. Older developers just quit after this realization.


That, or they get pushed into management by the lure of higher $ multipliers. By the time they realize that management sucks and the $$$ was a siren song, it's too late. It takes great courage to attempt to go back the other direction.


I'm in my mid 20s and this is actually a concern of mine. I love designing and building software systems and I would be happy to build them into my 40s. But people may wonder why I'm forty and still "just a programmer". I hope to have a stable business by then so I won't have to deal with this problem.


But people may wonder why I'm forty and still "just a programmer".

I'm 54 and "just a programmer". A few thoughts:

- I've done all the other jobs. Now I'm doing what I love best.

- There has never been a better time to be a programmer. I can't imagine doing anything else right now.

- I'm on the critical path. Of all the things I could be doing, programming is where I'm needed most. It's much easier to find people to do all those other things than to build the software. (Not a judgement, just an observation based upon many instances of supporting data.)

- I don't care what other people think.

- If I did care what other people thought, I'd just tell them that I don't remember the last time my boss made as much as me. That usually shuts them up.


I'm 31 and this is refreshing to hear, however my concern is not so much what other people think as it is job security. My perspective is that by the time I'm 40 I want to have a solid nest egg, and by the time I'm 50 I want to have my fuck-you money, and only then do I only feel secure being a programmer/creative for the rest of my life. Otherwise I feel like I better bone up on management skills to hedge my bets against age discrimination in my 50s and 60s.


I agree with edw519.

I have been programming for nearly 44 years. I have done a little bit of management during that time, and I can assure you that spending much time as a manager will erode your programming skills.

In many contexts, managers are more expendable than programmers. In fact, these days, apparently more than ever, highly talented programmers are hard to find at any level. Managers are a dime a dozen, and can get whacked just as easily as anyone.


I'm 41 and programming is more fun for me than ever. Don't worry about people wondering about you; it's worth it. Just make sure you have enough money to give your kids a good education.


I'm going to call this out as an important general principle:

Don't worry about people wondering about you.


Or, as Feynman said, "What do you care what other people think?"

http://www.amazon.com/What-Care-Other-People-Think/dp/B000S3...


Not to scare you, but we've turned people away (not me personally, but my company) who have stayed in the same position at the same company for the past 20 years without advancing to a more senior role. That doesn't mean they'd have to move into management, but more that their company didn't feel their input was relevant and they didn't have the drive to find another place where it was. My current co-worker is in his 50s and still doing what he loves, but he's been exposed to a ton of different areas and has the resume to prove it (embedded, games, DB, web, you name it). If you're going to go down this road (I hope to as well), make sure that your experiences are broad, the skills you picked up measurable, and the work performed varied. If not, companies like the one I'm part of will lump you into the "not motivated" category. I've seen the ageism and it's very real.


That doesn't feel like ageism; it sounds more like the valuation of confidence. A confident person isn't going to stick around if you don't keep giving him or her promotions.

And you know what? Not everyone values confidence. I know I don't, mostly because I am very confident myself. To me, having an employee willing to take huge risks is pretty worthless, because I can do that myself. I want someone smarter than I am. Besides, people who are confident don't stick around very long.


I'm in the same boat with you. I love to ship a software product (doesn't matter if it's web/desktop/mobile app). I would be happy if I could churn out 1 solid app every 2-3 years with one or two long-term apps.

I did have that nagging feeling about being old and "still a programmer" or some sort, like I was going to get stuck in my career path... but then again, some people are meant to be like that, so I decided not to worry about what will happen in 10 years and to move forward.

Besides, Linus is technically still a programmer :).


Countering "myths" with your own generalizations seems like a bad idea. The part at the end - "Young is not necessarily bad. Old is not necessarily good." is probably the best takeaway.


So far young has been bad in my experience (especially the entitled generation), and the old have been great mentors for me. Usually the bad old guys are the sys-admins who turned into IT directors, or the C programmers who scored a dream job (a high-paying position with low responsibilities and demands).


Your brain naturally wants to make associations like that, but resist. You might find great mentors who are young, too.


I can only hope for that.

Judging from yesterday's buzz about how hard it is to find a good developer, one can only hope...


Yes, "the entitled generation"... What about the "had no balls to raise their kids right" generation? Why are we hiring them into management positions if they're pushovers?


I'm ancient, so don't think I'm standing up for youth - and you do bring up a valid point. Older programmers at least have the "market proof" of being employed for umpteen years.


To some, career advancement means getting out of coding and into management. To me, that would be like switching from driving the race car to managing the race team. I'd much rather be driving, thank you very much...


When I program "because I love it", I want to be programming on projects that I actually love. That typically means "not the ones at work". Getting out of programming as a career doesn't mean I'm not going to stop programming. On the contrary, it means I'll probably end up doing more of the type of programming that I love because I'm not burned out on programming by the end of the day.


This sounds more like "programmers versus cheese-head managers", not "old programmers versus new programmers".



