
This is only somewhat related to becoming a better programmer but I'd rather post here than create my own thread for this.

I just graduated from college and I start work in 4 months as a software dev at a company that you've probably heard of and use on a regular basis. What can I do in these 4 months to maximize my chance of success at this company?

I thought I might go through SICP, but I'm not sure whether that's more of a theoretical exercise or whether it will directly improve my work. I can't work on any open source projects because of the learning curve required to get familiar with the codebase, plus I won't be able to keep working on them once I start my regular job.

Does anyone have any other ideas?




Assuming it's not a Windows shop, if you've never learned much about unix, work your way through this: http://www.compsci.hunter.cuny.edu/~sweiss/course_materials/.... Or study xv6 for four months. Or work through the coursera compilers class. Basically, picking a foundational area of CS and spending four months on it will pay off for the rest of your career.

My career has been a constant process of letting go of being afraid of how the "low-level" and "wizardly" stuff works, and finding out it's not as scary as I feared.
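
To make that concrete, here's a minimal sketch of the kind of exercise that Unix material opens with (my own illustration, not taken from the linked course): fork a child process, exec a program in it, and wait for it to finish. Everything used here is standard POSIX.

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = fork();              /* duplicate the current process */
        if (pid < 0) {
            perror("fork");
            return EXIT_FAILURE;
        }
        if (pid == 0) {
            /* child: replace this process image with "ls -l" */
            execlp("ls", "ls", "-l", (char *)NULL);
            perror("execlp");            /* only reached if exec fails */
            _exit(EXIT_FAILURE);
        }
        /* parent: block until the child finishes, then report its status */
        int status;
        if (waitpid(pid, &status, 0) < 0) {
            perror("waitpid");
            return EXIT_FAILURE;
        }
        if (WIFEXITED(status))
            printf("child exited with status %d\n", WEXITSTATUS(status));
        return EXIT_SUCCESS;
    }

That fork/exec/wait pattern is essentially what a shell does every time you run a command, which is a good example of how un-scary the low-level stuff turns out to be once you look at it.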


Do you know what you'll be working on, or what tech stack you'll be using? I would probably try to get familiar with the specific languages, libraries, and tools you'll be using on the job - maybe build a small project in them and/or go through some tutorials. Also, when I say tools, I don't just mean libraries, but also things like version control (git) and continuous integration (Jenkins), if your company uses those - in my undergraduate program we never did much with either, and it's good to be aware of them.

It can't hurt to reach out to the company/team and ask what tech stack they use and what you can do to get prepared - being on the other side of this, I was thrilled when someone I hired recently asked this very question.


The biggest challenges you're going to have aren't going to come out of CLRS. Pick a large open-source project and practice getting up to speed with the architecture and organization, because that's what you'll be doing at your job (sans the open-source probably).


> I can't work on any open source projects because of the learning curve required to get familiar with the codebase

That's totally not true. You'll definitely be expected to contribute within the first 4 months of starting work, so why are open source projects any different?

The processes are probably similar as well. You need someone to help you onboard, explain the big picture, and guide you through how things are connected. Then you take a bug, dig into the code, and figure out a solution. This is probably what you'll do at a real job, so you might as well get used to it. The hardest part is finding someone patient enough to guide you along and answer your questions.


I would recommend trying to learn whatever tools they use, so that when you start you can focus your mind on the projects and less on the environment. Once you start, you will be less overwhelmed, and you'll consciously feel good about how much pain you saved yourself. Just shoot your future team an email and ask them what they think would be useful to know.


> What can I do in these 4 months to maximize my chance of success at this company?

This presupposes that it's worthwhile to be successful at that company. Most companies aren't worth it. In fact, there's a correlation between a company being well-known and a company having horrible code.

It depends what you mean by "successful" though. I'm defining success as "increasing one's skill." But that's completely orthogonal to most people's definition of success, which is "climbing the corporate ladder."

If you just want a secure job, focusing on your software skills isn't the way to go. Focus on your connections at the company and how you present yourself. Your connections will mostly determine whether you last very long at the company. (This assumes your skill is above a certain minimum standard of quality.) Companies aren't meritocracies.


Is there really a correlation between being well known and having horrible code? Do you have anything to support this idea? It seems a bit arbitrary.


I shouldn't have used the word "horrible" when describing a codebase, because "horrible codebase" has very different meanings for different people. The code of well-known companies is typically great from a business standpoint, otherwise they wouldn't have become well-known. By this definition, Mt. Gox's codebase was fine right up until they imploded, because the only reason they were large enough to matter is that rolling out features was their highest priority. But you can see how, to some, this is a distasteful definition of "good codebase."

First mover advantage is one of the most powerful forces in the startup ecosystem. The only way to exploit first mover advantage is to be the first to market, which sometimes means skipping many of what programmers consider best practices.

The general principle is that it's hard to grow large while holding on to a bunch of principles, because principles slow you down. Idealism tends to be the opposite of pragmatism, and being pragmatic seems to be one of the most important qualities for success in business.

There are exceptions to everything, of course, and I'm not claiming that all well-known companies have codebases that most programmers would consider bad. I've heard Google's codebases are pretty good, for example. Viaweb grew large, and I know how obsessive pg is about code quality.

All I'm claiming is that unless the culture of the company has valued code quality from day one, the codebase of a well-known company will generally not be very good. I.e., there's a correlation between being a well-known (read: large) company and having a codebase with more hacks than most would be comfortable with. That seems to be the default.

Maybe it has nothing to do with the size of a company or how well-known it is. Maybe it's simply that most codebases aren't very good. But there seems to be a grain of truth to the idea that business forces push codebases in a negative direction by default. Additionally, most codebases you see on GitHub seem actually pretty good; better than what you'd see at most companies.


Thanks, you make some interesting points, particularly about being a first-mover. When talking about "startups we've heard of" you may well be right, I could certainly see how that could work. But in general, the companies we've heard of are the ones who survived long enough to become well known, and in that case, I suspect that their code is very maintainable, which implies many aspects of "good" code. I'd especially expect the companies known for their software to have good code, unlike companies that do other things, but happen to have developers to support their business.


> But in general, the companies we've heard of are the ones who survived long enough to become well known

True!

> and in that case, I suspect that their code is very maintainable

Unfortunately, there's no relation between a codebase's maintainability and its business viability. If there were, you'd see a correlation between beautiful codebases and codebases that make money. In my experience there is no such correlation.

Businesses generally maintain their codebases by hiring people to work on them rather than by adopting good practices from the start (so that they don't have to hire more people to work on them). That's why startups have another advantage: they can move much faster than big companies, because big companies generally have to deal with ten metaphorical tons of code bloat, which slows them down almost as much as their bureaucratic nature does.


Build shit.





