Being a software developer in the modern world (web, mobile) has little to do with computer science.

These skills can be taught but aren't in most modern universities.

Mostly, the argument is that university is not vocational training. At the same time, most people implicitly feel the need to go to university to get a job.

These developer boot camps create a new path. It's a good thing.




"Being a software developer in the modern world... has little to do with computer science."

Is a categorical falsehood that has cost me a lot of time and money.

There is not one top tier technology company software team that you can get on without having a strong grasp of at least the basic fundamentals of CS (namely data structures, algorithms, and some college level math.)

You will ALWAYS get passed on if you focus only on domain-specific knowledge (i.e. web development or mobile) to the complete detriment of theory.

Why? Because eventually said tech/language/toolchain/development target or platform will become obsolete, and good companies want to know your skillset won't die with it.

I've written about this before

http://www.quora.com/I-am-quite-bad-at-algorithms-but-good-w...

Anyone reading this who is interested in being a software engineer for the long haul please DO NOT discount the importance of a strong CS foundation as I did early in my career.

Edit: If you downvote, please explain. I have strong evidence to back these claims in the form of missed job opportunities at multiple name-brand tech companies. You can stick your head in the mud if you like, but it won't make you any more correct.


> There is not one top tier technology company software team that you can get on without having a strong grasp of at least the basic fundamentals of CS (namely data structures, algorithms, and some college level math.)

This just in: Getting on a top tier team requires being top tier.

This doesn't mean you can't teach for the rest of the industry, nor does it mean you can't go into top tier work later. It's just that the path to top tier changes: instead of college, it's having solid experience and code out there. That can come from blog posts, open source, or simply working on hard problems at whatever gig you can get.


What's interesting to me is that every time this argument comes up, the argument of the party stating that fundamental CS skills are not required generally revolves around the work, and the argument of the opposing party generally revolves around the interview.

>> not one team... you can get on

>> passed on

Is the problem that you can't join the team without fundamental skills, or that you can't do the work?


>Is the problem that you can't join the team without fundamental skills, or that you can't do the work?

In some of the easier cases, a person who is only a bootcamp graduate will be able to do the work. In other cases, they absolutely won't. The latter will be those that require more knowledge of at least some CS theory (parts of it), algorithms, data structures, OS fundamentals (and particularly the interactions between the programs you are building and the OS, i.e. system calls, I/O to/from peripherals, data representation format and conversion issues, etc.), troubleshooting (and I don't count looking up issues on StackOverflow as troubleshooting), etc.

https://mobile.twitter.com/vasudevram/status/656512523840147...

https://mobile.twitter.com/garybernhardt/status/656512550570...

For example, I've come across both devs and sysadmins who didn't know that a binary (Unix term) / EXE (Windows term) compiled for one hardware/OS platform will not run on another hardware/OS platform (with maybe some exceptions), due to the different hardware instruction set and OS system calls.
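A quick way to see that the target platform really is baked into the file (a sketch of my own, not from the thread; the ELF/PE header fields it reads are standard):

    import struct

    # A compiled executable starts with a header that names its target platform.
    # Peeking at those bytes shows why a binary built for one hardware/OS
    # combination won't simply run on another: the binary format and the
    # instruction set are recorded in the file itself.
    def describe_executable(path):
        with open(path, "rb") as f:
            head = f.read(20)
        if len(head) >= 20 and head[:4] == b"\x7fELF":      # Unix/Linux ELF magic
            bits = "64-bit" if head[4] == 2 else "32-bit"
            little = head[5] == 1
            endian = "little-endian" if little else "big-endian"
            machine = struct.unpack("<H" if little else ">H", head[18:20])[0]
            arch = {0x03: "x86", 0x28: "ARM", 0x3E: "x86-64", 0xB7: "AArch64"}.get(machine, hex(machine))
            return f"ELF, {bits}, {endian}, instruction set: {arch}"
        if head[:2] == b"MZ":                               # Windows PE/EXE magic
            return "Windows PE executable (different loader, different system calls)"
        return "not a recognised native executable"

    print(describe_executable("/bin/ls"))   # e.g. "ELF, 64-bit, little-endian, instruction set: x86-64"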


Very interesting observation.

I would argue you can still do the work without it, but it will be at a completely different level if you have a strong CS foundation, because you'll understand at a fundamental level how to build software that is performant, among other things.
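To make that concrete with a toy example of my own (not from the comment): both versions below "do the work", but knowing that membership in a list is a linear scan while a set lookup is amortised constant time is exactly the kind of fundamental that separates them.

    import time

    ids = list(range(50_000))                 # pretend these are user IDs
    lookups = list(range(49_500, 50_500))     # 1,000 lookups, half of them misses

    # Naive version: `x in list` is a linear scan, so this is O(len(ids) * len(lookups)).
    start = time.perf_counter()
    hits_list = sum(1 for x in lookups if x in ids)
    print(f"list: {time.perf_counter() - start:.3f}s")

    # Same logic with a set: average O(1) membership tests, so O(len(ids) + len(lookups)) overall.
    id_set = set(ids)
    start = time.perf_counter()
    hits_set = sum(1 for x in lookups if x in id_set)
    print(f"set:  {time.perf_counter() - start:.3f}s")

    assert hits_list == hits_set              # identical answers, very different cost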

Many software teams, by using that as the interview process, have decided to make it a floor for the aptitude of people on their team.


"(namely data structures, algorithms, and some college level math)"

If you're smart and perceptive, then you can learn enough of these things, for the average job, by osmosis or by day-to-day exposure over the course of a career.


There are lots of successful profitable companies making lots of money who are not "top tier tech companies". They still have software built for them by developers without CS knowledge.


We are talking about people who did not learn the fundamentals in college, for whatever reason. For those people, I would advise getting a job first, then worrying about getting strong in fundamental areas. Experience and employee referrals >>>>>> fundamentals where HR search tools are concerned.

You can learn fundamentals (and see them applied to real problems) while being paid on the job.


If what you say is true, then how come the dev bootcamp hiring statistics say otherwise?

Look at any of the statistics. These bootcamps graduate people making 80k-100k. That is a good salary for someone who didn't graduate with a CS degree from a top 5 tech school.


Excellent point. If you want to waste 4-6 years of your life on theory, go to college. Bootcamp it or teach yourself if you want to spend 6-18 months learning and then jump into a junior position to get real experience (and get paid for it).

EDIT: See my post further down for clarification on this idea. I shot this comment from the hip a bit too fast.


I don't want to rip on you, because I think you're a pretty sharp dude from your posting here, but if you think theory is a waste, I really question what sorts of Actually Hard Problems you've had to tackle. :/


Thanks for the kind words!

I'm a high school dropout and have started, run, and sold a startup, managed a division for a well-regarded consulting firm, and worked on data taking for a detector at the LHC. Perhaps the problem is that I minimize the amount of effort that goes into being autodidactic (determine the problem, determine what information and skills are needed to solve it, acquire said information/skills, execute, repeat).

Let me correct/clarify my above post: theory isn't a waste, per se. But is it best to spend a significant amount of time front-loading knowledge one may not need? And to pay a substantial premium for that experience? That's my problem with the college experience for tech professionals.

I'm not a huge fan of bootcamps, but I do believe there is much to be gained by Google's efforts here. There is value we've lost in the old apprenticeship system, and I hope to see it revived over time (I owe my skill and career progression to the luck of finding quality mentors along my way).

There are hard problems I want to solve, and I want to have the resources (mostly time) to solve them. College would not teach me those skills, but trying to solve those problems is a lesson in itself.


YMM, of course, V, but I've used almost everything I ran into in my collegiate CS-related courses in some form or another. And I find that even the stuff I haven't directly used has contributed to my ability to solve problems--having the prerequisites to assimilate pretty random stuff into my corpus of knowledge has served me extremely well.

Much more importantly to me, however, I was exposed to things outside of "that computer stuff" that have made me better at being a manager, a leader, and (as hokey as it sounds) a human being. I literally can't put a price tag on it, but I would be vastly poorer as a person without a liberal education that I think gets aggressively discounted when attempting to view college in purely transactional terms.

It is a much greater criticism of college, to me, that a student has to go looking for this stuff; the existence of bachelor's degrees that are functionally trade-school material is troubling to me. I deal with a lot of technical people who are pretty aggressively ignorant of things that don't involve keyboards or oscilloscopes, and I think that is the greater failure of the American collegiate system.

Don't get me wrong--I'm self-taught too, I'd been writing code for a decade-ish before I went to college, but this stuff changed my life profoundly, and I think minimizing it out of hand is downright tragic. Maybe it would be different if there were a better way to get people to acquire this a few years on (calling to mind the idea that people should have a few years of experience before getting an MBA), but there isn't one right now.

As far as paying a premium goes--I graduated with about $20K of debt, mostly because it was effectively free money (my total interest payments before I paid them off were less than $2K), and I was getting paid through school (did Google Summer of Code twice, ran my own web dev shop, etc.). Bad choices can be made with regard to college, but that's a criticism of overly expensive (private) colleges.


Who gives a shit?

The tech industry currently doesn't need any more people to work on the Actually Hard Problems.

The tech industry currently pays people six-figure salaries to build CRUD apps. There is such a huge shortage of tech people that the industry can't even get the "easy", low-hanging-fruit work done.

It doesn't matter how many PhDs work at Tesla designing space cars if the world has a chronic shortage of car mechanics. And right now the car mechanics get six-figure salaries.


> Who gives a shit?

Could you make your point without that?


> The tech industry currently doesn't need any more people to work on the Actually Hard Problems.

I dunno, more people keep offering to pay me to do them than I have time to take up...


There is not a shortage of tech people. There is only a shortage of good tech people willing to work for peanuts.


Which is funny, as those exact Computer Sciencey type questions are what Google likes to throw at candidates.
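For readers wondering what those questions tend to look like, a hypothetical but typical example (not claiming it is an actual Google question) is the classic two-sum exercise, where the expected answer swaps a quadratic double loop for a single pass with a hash map:

    def two_sum(nums, target):
        """Return indices of two numbers that add up to `target`, or None.

        The naive answer is the O(n^2) double loop; the expected answer
        remembers each value's index in a dict so the whole thing is one
        O(n) pass -- exactly the data-structures instinct interviews probe.
        """
        seen = {}                      # value -> index where we saw it
        for i, x in enumerate(nums):
            if target - x in seen:
                return seen[target - x], i
            seen[x] = i
        return None

    print(two_sum([2, 7, 11, 15], 9))   # (0, 1)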



