If You Want a Job Tomorrow, Cultivate Your Career Today (polyglotprogramming.com)
35 points by citizenparker on Nov 6, 2009 | 19 comments



The article seems to take the ageism in the industry for granted. Excellent article, otherwise.


Ultimately, hiring decisions in most organizations are made by people who got out of tech and into management early in their careers and believe that that is the "successful" career path. When they look at an older engineer, they see someone "unsuccessful" (and either don't care about or, more likely, don't even understand his or her technical accomplishments).

This isn't going to change any time soon, so it's wise to prepare strategies in advance.


You might be right, but I would like to see more discussion on the topic. Is the ageism a result of actions by non-programmers, or are programmers contributing to it too?

A few days ago, a guy complained about having to write code during the job-search process. Most comments I read here on HN were not sympathetic to him. He was advised not to send out as many resumes, to show more passion, etc. "Show some code" is a totally different request from "Show me the code for this exact problem I made up" - yet many programmers here were commenting as if they had never had to do a job search.

My point is that before blaming others, it might be good for the community to introspect and see if the cause is truly from outside.


I like what he says about learning a new language every year, and I sincerely want to do that. However, if you're in a Java shop and are learning Ruby/Python on your own, how do you keep from forgetting it? If you don't use it, you lose it, right?


No matter what kind of shop you're in, you own your own machine. Use those languages for local stuff, your stuff, shell scripts and such that aren't part of the company product. Any good developer uses the command line a lot; it's the most efficient way to do lots of shit.
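For instance, here's the kind of throwaway command-line script I mean, in Python (the chore itself is made up, just for illustration): list the ten largest files under a directory.

    #!/usr/bin/env python
    # Made-up local chore: list the ten largest files under a directory.
    # Nothing here touches the company product; it's just practice.
    import os
    import sys

    root = sys.argv[1] if len(sys.argv) > 1 else "."
    sizes = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((os.path.getsize(path), path))
            except OSError:
                pass  # skip files that vanished or aren't readable

    for size, path in sorted(sizes, reverse=True)[:10]:
        print("%10d  %s" % (size, path))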


Then find ways to use it.

At work, my project is purely C++. But if I have to write a tool for data analysis or to test a subsystem, I'll use C# so I can improve my .NET skills. Before, I used Python for that, but C# became more appealing.

Same thing at home. Web projects (of which there are few!) use PHP; everything else is C/Assembly (I do a lot of microcontroller stuff) or C# front ends using Visual Studio Express.


The important thing is to learn the underlying concepts and structures that programming languages indirectly express.

If you understand the differences between the OO concepts in Java, Ruby and Python, then you'll find it easy enough to pick up the surface layer (the syntax) of the languages.


This is true, but a large part of "knowing" a language is knowing its standard library: where to find the function you need, and which of several similar functions is the exact one you want. The syntax you can learn in a day, but the libraries take time.
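To give one small Python illustration (my example, not the parent's): the syntax below is trivial, but knowing that these helpers already exist, instead of hand-rolling them, is the part that takes time.

    import collections
    import itertools

    words = ["spam", "eggs", "spam", "ham"]

    # Counting occurrences: collections.Counter vs. a manual dict loop.
    counts = collections.Counter(words)

    # Run lengths of equal neighbours: itertools.groupby vs. manual bookkeeping.
    runs = [(key, len(list(group))) for key, group in itertools.groupby(sorted(words))]

    print(counts)  # Counter({'spam': 2, 'eggs': 1, 'ham': 1})
    print(runs)    # [('eggs', 1), ('ham', 1), ('spam', 2)]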


He gives two examples in the same article: "Try to use it at work" and "Get involved in an open-source project".

If you're in a Java shop and learning Python, have a look at Jython and see where you can use it for testing, automation, etc., even if only for your own local scripts.
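For example, a minimal sketch of what such a local script could look like (java.util.ArrayList just stands in for whatever classes your shop's codebase actually exposes on the classpath):

    # smoke_test.py -- run with: jython smoke_test.py
    # Jython imports Java classes with ordinary Python syntax.
    from java.util import ArrayList

    names = ArrayList()
    for n in ["alice", "bob", "carol"]:
        names.add(n)

    # Plain Python iteration works directly on the Java collection.
    for n in names:
        print n

    assert names.size() == 3
    print "ok"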


I think learning a language that may not be applicable at work has two key benefits:

1. I learn it "enough" to know what it's good at, what it's bad at, and to have enough hand-written code to pick it back up pretty quickly (particularly if I set up a test-driven learning environment).

2. Even when I forget the language's particulars, it changes the way I see other programming languages. For instance, while I don't use Ruby in my day job yet, I still think of things I can do with method_missing and how I might approximate that same power and flexibility in my work where appropriate. In short, learning languages helps me program "into a language" rather than "in a language", to borrow Steve McConnell's terminology.
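To make that concrete, here's a rough Python sketch of my own (the RecordProxy class is made up): __getattr__ gives you a catch-all hook that's roughly in the spirit of Ruby's method_missing.

    class RecordProxy(object):
        """Made-up example: dynamic attribute dispatch via __getattr__."""
        def __init__(self, data):
            self._data = data

        def __getattr__(self, name):
            # Called only when normal attribute lookup fails, so it acts
            # as a handler for "missing" attributes.
            if name.startswith("get_"):
                key = name[len("get_"):]
                return lambda: self._data.get(key)
            raise AttributeError(name)

    record = RecordProxy({"title": "Polyglot Programming"})
    print(record.get_title())  # dispatched dynamically: prints "Polyglot Programming"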


One option would be to get involved, in your free time, in an open-source project that uses the new language. That might also help build your reputation and add resume-quality experience.


The list of things to do seems shortsighted to me. I can't imagine guys like Donald Knuth or Linus Torvalds caring much about agile methods or about learning a new language every year. I think it's more a list of things that happened to work for the author.

Other than that I agree with the main idea of the article...


Guys like Knuth and Torvalds are not primarily concerned with gaining employment, because their resumes are already pretty much at peak impressiveness. There are not many people in their class. The alternative to spending all your time on the project that you're famous for (if, for example, you are not famous...) is to spend time doing some of these things.


I've got a follow-on question related to this, though:

If you do have all the skills he listed, how do you make the leap into doing real top-level creative work?

For example, one of the top links on Hacker News now is a profile of Brad Fitzpatrick, and it's pretty much accepted that he's done some industry-changing work (LiveJournal, memcached, Mogile, OpenID, Pubsubhubbub, etc.). But if you were familiar with his work c. 2002, it wasn't all that impressive. Yeah, he was a good programmer, but he just ran a website with some modest success. Several of us have done the same.

I've heard the same applies to other programming luminaries, e.g. John Carmack.

Somewhere along the line, some programmers start really distinguishing themselves while others remain merely "good". And I don't think it has to do with ploughing all your efforts into one project. People like Brad Fitzpatrick, Jamie Zawinski, Paul Buchheit, or Rob Pike are known for multiple contributions. Is it just the cumulative effects of time, or is there something specific they do with their time that propels them from good to great?


I don't know how to be successful on that level, but I have a hunch that paying more than the slightest attention to things like "job", "resume", etc. severely lowers your chances. Also, one thing I noticed while reading Coders at Work recently: while some people are brilliant, others seem merely good and happened to make the right decisions. Those decisions always seemed to be the risky ones, but we don't see which similar decisions they went the other way on, nor the hordes of similar programmers who were cautious or unlucky.

And maybe there aren't such hordes. I dunno.


I think people like Knuth, Torvalds, Carmack, etc. are the models for career development. Why would you assume that a strategy that didn't work for any of the biggest guys around would be better for you than theirs?


Because in all likelihood, most people don't have what they have. Those guys are outliers, not the norm.


I think that answer's a cop-out. Knuth, maybe. But Carmack and Torvalds had pretty unremarkable starts as programmers.


Good article, but also guard against excessive egotism. People who think they're big shots because they started an open-source project, have a blog, and have some followers on Twitter are not always that great.



