Apprenticeships - Employers Must Get Past Degree Snobbery (codemanship.co.uk)
101 points by csjohnst on Aug 18, 2011 | 85 comments



Posing Apprenticeships as a significant alternative to higher education is delusional. First, because a true apprenticeship requires a contractual relationship between the apprentice and the master. That sort of contract is extremely problematic for modern business entities: it does not allow flexibility in staffing levels, it requires a significant investment in training (several years), and, because indentured service is generally frowned upon and body warrants are hard to obtain these days, there is little viable recourse should the apprentice terminate the apprenticeship early.

In addition, apprenticeship is difficult because it easily runs afoul of equal opportunity expectations and requirements (in the US). The difficulty in differentiating between individuals and in determining each person's unique skill set before they are on the job is the reason all soldiers go through the same basic training and then are assigned to their specialties (of course one could argue that military skills are often determined beforehand - but that is an argument for prior training (college), not against it).

Indeed, the significant latitude of the military is an indication of what is needed to create any semblance of a workable apprenticeship program which provides equal opportunity on a large scale - an organization where meritocracy is highly advantageous both to the organization and the individuals who lead it, extreme prejudice in the enforcement of contracts (e.g. execution for desertion), and very one-sided contracts (e.g. imprisonment for AWOL).

Modern higher education has grown because it offers such a powerful solution to many of the problems created by apprenticeship, particularly lack of equal opportunity, exploitation of apprentices, diversion of resources to training and away from profit-making activities, and long-term commitments to particular individuals who may not be suited for the profession.


"Posing Apprenticeships as a significant alternative to higher education is delusional."

This is absolutely absurd and just shows pure ignorance of the economic realities beyond your grasp. Germany thrives on apprenticeships. You can't find a job without having an apprenticeship under your belt in Germany. Students spend between 50% and 70% of their time at a company, while the rest is spent on traditional education. Apprenticeships are a vital part of Germany's economy. Dismissing them based on your pet theories is bordering on asinine.

http://en.wikipedia.org/wiki/Apprenticeship#Germany


The German apprenticeship model doesn't scale well to less affluent and less homogenous countries (i.e. pretty much the rest of the world) - and it is my impression that German Hauptschule students don't apprentice in professions such as computer programming. Then again, if one accepts the idea that it is o.k. to channel children into career paths at the age of nine or ten and generally reserve the best schools for the upper social classes, then the German model might be as successful as you suggest.


Germany is successful despite the Hauptschulen, not because of them. The apprenticeships are one of the things that allow Germany to get away with its awful schooling system.


> You can't find a job without having an apprenticeship under your belt in Germany.

You can. E.g. you can go study at university.

Also there are vocational training programs that are done purely at a Berufsschule. Though they are seen as worse than an apprenticeship.


This isn't just a hypothetical approach. In the Software Craftsmanship community, consultancies have run apprenticeship programs for years with very good results. Scaling to large businesses might not be easy, but they certainly aren't the only game in town.

I'm proud of my CS background and I wouldn't trade it, but it's interesting to work next to terrifyingly bright guys in their 20s with a deeper grasp of design and modern technology than most CS grads.


Good points regarding the shortfalls of a proper apprenticeship in this instance (particularly in the US).

However, I strongly disagree that modern higher education does much to combat exploitation of new grads, diversion of resources to training, or long-term commitments. Higher education often fails quite admirably on these points as well.


The difficulty in differentiating between individuals and in determining each person's unique skill set before they are on the job is the reason all soldiers go through the same basic training and then are assigned to their specialties

At least in the US, that's definitely not how it works. Enlisted folk take the ASVAB before even joining. Based on those scores, they select what job they want to do before even signing on the dotted line. Recruiters are infamous for lying to enlistees about what jobs are actually like, or about whether enlistees will be able to change rates after/during bootcamp. In my experience (as an officer), most enlistees feel cheated by the system, but must "suck it up" because it's the military and they've signed away their life.

Officers join either via ROTC/Service Academy or OCS. With OCS, all officers are assigned their specialties before signing on the dotted line. (Imagine if a doctor had to join without being guaranteed a spot as a doctor!) In ROTC or a Service Academy, officers have 4 years of training before they select which community to join. About 1/2 the officer corps joins this way, so about 5% of the overall military.


Higher education in countries like the US, where it usually involves taking on loans, already turns into a form of indentured service - service on the loan, that is, a loan that can't be gotten rid of even through bankruptcy.

Could an apprenticeship be structured like an education loan, but one that the company reduces the balance on when you stay with them?


To your point about military specialties, the range of military occupation specialties that an enlistment candidate is qualified for is determined by the enlistee's ASVAB score at the time of enlistment, not their performance during training. It is true that during training, enlistees undergo further testing (DLAB, etc...) to identify candidates for specialized and rare positions like linguists and such. And some lucky, high-scoring enlistees are given their choice of jobs prior to enlisting as an incentive for signing on the dotted line. But the bulk of enlistees are already predestined for an assignment, based on the needs of the service (based on body count, not test scores), the day they step off the bus for basic training.


Your third paragraph reminded me of grad school!


Here's my rough criteria when selecting applicants for technology roles:

1) What have you done lately that is like what I want you to do?

2) What is your attitude like? (past references very important here)

3) Have you taken some test or certification (or can we give you one) that demonstrates skills in areas we might be concerned about?

From there, perhaps, you can start learning - that is, it might be worthwhile to talk about a position. But all of that factual and procedural knowledge will be put to the test when you are inserted into our actual environment, where your social skills are going to have as much to do with your value as your technical skills.

None of that involves a college degree (unless the job duties and environment mimic the college experience), and it all fits nicely into some kind of apprenticeship program. Yes, there can be a lot wrong with apprentice programs, but "apprentice program" is a very, very broad term. The trick is going to be in the setup and execution of the program.

I freely admit that we apprenticeship supporters wave our hands around a lot while yelling "apprenticeships! apprenticeships!" without providing necessary detail. But I really feel that under this rubric is where the eventual solution will lie. We need to bring education down to be as close to the actual work environment as possible. We need more rapid feedback loops in education and more specific, tailored in situ instruction. Apprenticeships do this.

Note that there is another topic -- the importance of a classic liberal education -- which I am a huge supporter of. But I think we have mixed up two concepts: things that directly translate into money for me and my family and things that make me a better overall person. Both may or may not be important to a particular person, but by mixing up the terms and lumping them all under "college education" we have confused the education argument to a terrible and unnecessary degree. This confusion is what is at the heart of the seemingly-unsolvable education discussion.


What a college degree does is provide a more equal opportunity for people to obtain a minimum qualification. Let's face it, Ben Franklin's apprenticeship as a printer for his brother more or less typifies the way in which people have obtained apprenticeships - through close personal connections between the master and the apprentice. This is unsurprising when one considers the degree of upfront investment by the master entailed in taking on an apprentice. Those who can and are willing to take a street waif under their wing are few.

I will add that calling mentorships and internships "apprenticeships" does not make them such. And an apprenticeship requires a formal written commitment not only to teach the apprentice how to do their job, but to teach them how to be a master.


One of the best decisions I ever made was to hire a high school dropout into a de facto programming apprenticeship.

The guy dropped out for family reasons, went to work in factories, and eventually got a job in IT. He was really bright and hard-working, but stuck in a dead-end help desk job. One of our other devs knew him and suggested we give him some programming projects on contract - like $10/hr. So we did. He did a good job, so we offered him an entry-level job, working closely with the dev who referred him.

So he moved across the country on the promise of a $15/hr 6 month apprenticeship, and in that time, learned everything you'd hope about building a Java-based web app that handled a lot of traffic, and after 6 months he was full-time at full entry level salary. After a couple years, he became one of our leads. After we got acquired, he went on to the acquiring company, then joined another startup as their first employee.

But, I would not have taken the risk unless a) he had a sponsor, b) he had some distinguishing credential.

In his case, yes, he dropped out of h.s., but he was also taking AP Calculus at the time. To me that was a strong enough sign that he had the horsepower to make it worth taking a risk on him.


Jordan Hubbard, one of the co-founders of the FreeBSD project, is a high school drop-out. He is now the Director of Engineering of Unix Technologies at Apple.


Good for you for giving someone a chance.


I freely admit that we apprenticeship supporters wave our hands around a lot while yelling "apprenticeships! apprenticeships!" without providing necessary detail.

A lot of those details could be worked out along the way. I'd be more excited to see the supporters provide actual apprentice programs than details on how everyone should operate such a program.


I disagree. Knowing the details of an apprenticeship program upfront is critical to attract and retain both the "apprentices" and the "masters" necessary to make a program work. It would be unfair to both groups to have divergent programs and skill levels being passed off as apprenticeships.

There are plenty of successful models that a program could be based on. In the US, electricians require 5 years of work with a journeyman + classroom instruction. Professional Engineers require a degree in their discipline, 2 examinations, and 4 years of relevant experience, usually under a licensed engineer, before they can get their PE license (it's not called an apprenticeship, but an EIT - Engineer-in-Training. You are expected to learn from a more experienced engineer who is responsible for overseeing your work and providing a recommendation before you are licensed).


Agreed. All we have is a vague title, and that's by design.

The last thing we need is folks going to their dictionaries and starting to create or use formal definitions for all of this. That kind of rigidity is what got us here in the first place.

Much better to adopt a general definition consisting of "in situ" and "rapid feedback" and then try a million experiments to see what gains traction. Theory is easy. Application is difficult. We need much more application and much less theory.


Many employers use a university degree as a proxy to judge whether or not you're employable, not whether or not you are qualified. As the article rightly points out, it's easier than ever to get a college degree, both because there are more universities offering them than ever before, and because the requirements to get those degrees are lower than ever before. Obtaining a degree merely shows whether or not you have the minimal foresight and work ethic required to be admitted to a university, and the even more minimal determination and resilience to actually get the degree. Thus, many employers think "If you can't muster a degree, in this lax environment, you're probably not going to be a good employee."

That said, if you can prove you have that determination, work ethic, and competence to do the job, many employers wouldn't give two shits if you never got an otherwise meaningless degree. The problem is, there aren't a lot of other great proxies to demonstrate those qualities, especially proxies that would save your resume from getting tossed immediately.

As to the "degree snobbery" thought, it makes a lot of sense for some employers to use universities as their recruiting system. Think about law firms that only hire Harvard law grads. Snobbery at its worst, right?

Well, say Harvard Law gets 10000 applicants each year. This group is already self-selected to a certain extent, because most people who don't have a shot at getting in don't even apply. Next, Harvard only selects the most elite candidates (those with perfect scores only have about a 50% chance of getting in). So, anybody who makes it out of Harvard, even if they're at the bottom of the class, has already been screened extensively. If an employer just picked totally at random from this Harvard pool, he's got an excellent shot at picking a great employee, because the barrel he's choosing from has already been screened for him. This can hugely reduce the amount of effort an employer needs to put into making a hire, and so, even if lots of qualified candidates are overlooked, such a system might still make great sense.


I think the idea of setting time aside in your youthful life where you invest in yourself and expand your horizons is a powerful one. It makes the world a more interesting place and pushes human achievement. To what extent a university helps with this might be debatable and will depend on the person and the university, but I wonder whether an average person would earmark 4 years of their life for learning and improvement if left completely to their own devices.


Patio11 said it best (re: "do I have to go to college?") http://news.ycombinator.com/item?id=1182552

edit: just re-read your comment, and I realise you're not necessarily taking sides and merely speculating if the average person can get by without the investment into college. I agree so strongly I would upvote you twice.


If only US society actually did provide four years of education and living expenses, then patio11 would be correct. Sure, such offers exist but only if you're poor. Or are the valedictorian. Average Joe gets no free ride for four years.


He meant a social subsidy, as opposed to monetary. Basically, you aren't too strait-jacketed as a student, the world is your oyster, you can try out lots of different things, etc.

There was no mention of a 'free ride'...


That's a plausible interpretation of patio11's comment. I, however, think it's a bit of a stretch given the language of the comment. Perhaps it was his intent to say what you interpreted, but I'm inclined to disagree.


Well, computer science is not only about programming. Programming is a big part of it, and people without a degree might be good programmers, but have they really gone through all the math and theory by themselves, or just skipped it altogether? For example, most people without a degree might not know what Big O notation is or how it works, or why certain data structures are better than others.

So it depends a lot on the work someone is going to do. Relatively simple programming tasks? No degree required. Working on something that scales to millions of users or has to run with exceptional performance? A CS degree would at least tell me the candidate has learned about the theory required for this.

Generally, it's easier for a good CS grad to get really good at programming than to make a programmer comfortable with all the theory.

Try to get into any of the TOP software companies in the world without a degree and I wish you good luck (not impossible though).

I say this as someone who quit halfway into his BSc to start a company btw ;)


My experience interviewing says that a great many CS graduates still have no idea about Big O, or even really understand hashtables. That's from years of interviewing quite qualified candidates coming into AMZN BTW.

They might have passed the class and test where that was covered, surely it WAS covered, but they didn't actually retain it in a meaningful way.

I'm a dropout and worked at AMZN for a few years BTW. I don't think I would have had any problem working at Google or Amazon if I had wanted to either.

Granted, I probably couldn't have worked at those companies as my FIRST job, but that isn't the argument. The argument is who is better off after four years, someone who attended university to get a CS undergrad, or someone who worked in the trenches.

I'd say if there were more good apprenticeship positions available the latter would almost always be better.


I agree on the last point. I think this sort of stuff can be learned in an apprentice-type way, but not many positions currently do a good job of it. A great, well-motivated way to learn big-O-style analysis, for example, is to work on a system where a bottleneck gets improved from an O(n^2) to an O(n log n) algorithm, while working alongside someone who explains to you what that means, how you determined which algorithm was which, and why this is useful analysis to know how to do.
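
To make that concrete, here's a minimal Java sketch (my own illustration, not something from the comment above) of the kind of before/after an apprentice might see - a quadratic duplicate check replaced by a sort-then-scan version:

    import java.util.Arrays;

    class DuplicateCheck {
        // O(n^2): compare every pair of elements.
        static boolean hasDuplicateQuadratic(int[] xs) {
            for (int i = 0; i < xs.length; i++) {
                for (int j = i + 1; j < xs.length; j++) {
                    if (xs[i] == xs[j]) return true;
                }
            }
            return false;
        }

        // O(n log n): sort a copy, then any duplicates must be adjacent.
        static boolean hasDuplicateSorted(int[] xs) {
            int[] copy = xs.clone();
            Arrays.sort(copy);                            // dominates the cost: O(n log n)
            for (int i = 1; i < copy.length; i++) {
                if (copy[i] == copy[i - 1]) return true;  // single O(n) scan
            }
            return false;
        }

        public static void main(String[] args) {
            int[] data = {3, 1, 4, 1, 5};
            System.out.println(hasDuplicateQuadratic(data)); // true
            System.out.println(hasDuplicateSorted(data));    // true
        }
    }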

A lot of non-degree programming jobs seem to lack both the challenge and the mentoring to make that happen, though, so you end up with people who learn more about how to bang scripts together out of snippets pulled off the web. That's actually also a pretty useful skill, especially if you become a very fast and skilled applier of band-aids, but it's not quite the same as a CS apprenticeship.


Good point, and actually one I overlook too often in my own experience.

I went to CMU for two years before dropping out, so I did get a decent grounding in data structures, algorithms, etc. And I absolutely agree that that grounding has been important and is hard to get in the trenches.

Maybe we need to be arguing towards one or two years of school and then apprenticeships?

My housemate teaches a one year CS fundamentals course here in Rwanda. It is essentially the only real CS training available in the country, though there are plenty of people with degrees from the universities. But he does a pretty good job of covering the basics. I'm sure he'd confess that two years would be better than one, but I'm not sure he'd argue for more.


I agree. The co-op program I'm in allows me to switch between school and work every four months. While I've found I learn more practical knowledge on the job, I've also noticed school forces useful concepts on me that I wouldn't have learned otherwise. This includes assembly-level programming and the theory and math that you've mentioned.

I think the decision to enroll in university or to abstain is entirely dependent upon the degree.


I think your point is generally true of computer science graduates, but there are a great many wishy-washy vocational IT degrees for which employer and employee would both be better served with an apprenticeship.


Which courses teach how to make something that scales to millions of users, or run with exceptional performance?


None directly, but you learn how to rate the performance of an algorithm and basic concepts that at least give you a starting point. Of course that doesn't mean every CS grad knows how that works, or that no one without a degree knows how to do that, but a degree tells me at least that the guy should have heard about the relevant basic concepts and gone through the theory... in theory


I'm of the opinion that a university degree is less about learning the subject matter, and more about teaching you how to learn. So a programmer who has not completed a degree may be a fantastic programmer, but one who has done a degree may have a few extra skills on top of simple programming skills.

I.e. communication skills, analytical thinking, knowing where to look for solutions, how to ask the questions that improve your knowledge, and the proven ability to see a project through, etc...


When I completed a CS degree in '88 I remember thinking that what it was really doing was lining you up to possibly go on to do postgraduate research - which I did eventually do. If you aren't going to be doing something that is vaguely like research then I'm struggling to see the relevance of CS degrees for most development jobs.

Universities are really rather splendid places for research and absolutely awful at vocational training!


I couldn't agree more. Universities shouldn't be for everyone - they should be for people who are in it for the learning, or for highly skilled research, not degree factories for entry level positions.

What breaks my heart in academia in the UK is seeing courses being dumbed down so that students are "happy" in their courses (i.e. not failing) so that the university gets a good response in the National Student Survey. The other thing I see is an increasing sense of entitlement - "we pay your wages so you should pass us". Going to uni is like going to the gym. You don't get fit by simply having a gym subscription - you have to work at it. Same goes for a university education. Higher fees are only going to make that sense of entitlement worse, which means more dumbing down... and the cycle continues.


You don't learn the same way you did fifteen years ago. You don't go to a library. Instead, you google. And try. And fail, and try more, and google more.

Maybe you still run into one fundamental problem or two where you need to read something hard. But otherwise, the "how to learn" thing is more obsolete than the other things claimed.


So the solution to the increased cost of education is apprenticeship? Is this part of the current trend of bashing higher education to make sure that we become obedient but efficient drones?

There are a lot of things broken in higher education, but saying that hiring a CS/SE graduate for a developer position is like hiring a theoretical physicist to repair a car is disingenuous at best.

Guess what, graduate students who write a compiler as part of their course, who are working on a VM for Matlab, who are improving IDE auto-complete based on all sorts of algorithms, who are devising a new distributed merge algorithm and evaluating its performance through hardcore network simulation, well, they know how to program! As a bonus, they know how to apply the scientific method and be rigorous when they report a result or an improvement. They have been exposed to all sorts of things whose existence an undergrad doesn't even suspect.

Sure, some grad students aren't good. Sure, people who don't go to college/grad studies can end up being way better and knowing more than grad students, but don't discredit a degree because you believe that it's too theoretical. Just ask about the homework, the projects, and the thesis the grad student worked on.


"Is this part of the current trend at bashing higher education to make sure that we become obedient but efficient drones?"

No, the goal is to remove vocational training from higher education and return it to its supposed goal.

The graduate student that you describe is rare. A master's degree in CS typically means that the person took more random computer courses and knows no more about software development. What you fail to realize is that nearly anyone who wants a master's in CS can shop around and find a school that is willing to accept them.


"What you fail to realize" <- ?

First, I believe there is a cultural difference between Canada and the US. In Canada, a Master's degree is typically not a professional degree and you usually cannot buy an M.Sc. Typically, half of the credits come from courses and the other half (often even more) comes from your thesis. I don't know about the situation in the UK, the author's country of origin.

Second, I TAed and taught programming, algorithms, and software architecture courses for undergrads, Master's and Ph.D. students so I'm well aware of the advantages and limitations of higher education. I saw students in an advanced architecture course who did not know what a thread was or who had never written a single SQL query. Well, they learned it in my course.

The graduate students that I described aren't rare. At least 75% of the Master's and Ph.D. students in the SE lab and PL lab at my university match that description. Maybe they would have a hard time building a web application in a day, but I think they have demonstrated that they can learn pretty complicated things and that they will learn how to solve your particular technical problems.

I felt the article was really about "degree snobbery", meaning that the author promoted snubbing people with degrees. I understand the frustration of people without a degree who need to prove every time that they don't need one. But I don't believe that having a systematic negative bias against candidates with a degree is wise either. Honestly, does it make sense, as the author says, that someone with a bachelor's in C.S. doesn't know how to implement a binary search (see [1] for a possible explanation)?

Regarding the vocational training that needs to be removed from higher education, I believe a compromise is needed. I agree that you cannot efficiently learn development process and all the latest languages and frameworks at school. But you need to learn some good programming skills and software engineering practices; otherwise, it is a lot harder to understand and play with more complex concepts, and it is also more difficult to make a significant contribution if you do a Master's or Ph.D. later.

[1] http://www.skorks.com/2010/10/99-out-of-100-programmers-cant...


A graduate student is a bit different from a student just getting a bachelor's degree, which is where this article is focusing.


Well, here are a few quotes from the article that seem to target students who have more than a bachelor's degree:

"So much so that, for some subjects, employers now require at least a Master's degree."

"I think CS and SE lecturers could learn a thing or two from me about writing good software."

"Many of the CS graduates I know suck as software developers, too."

"Hiring a computer science graduate for a programming role is, to me at least, like hiring a theoretical physicist to fix your car. "

"Just as there are countless CS graduates who have a theoretical grasp of compiler design but have never design a language or implemented a compiler."


Not sure if you're American, but "graduates" typically means "one who graduates from a Bachelor's program", at least in this context.

Most of your selected quotes are (probably) referring to programmers who only have a Bachelor's in CS.


Imagine you offered a job position of "software apprentice" and say you got a thousand responses. How would you identify the ones that would be worth gambling on? There would be little to nothing to differentiate candidates.

By and large you'll have candidates who didn't do well at school (because those who do well at school tend to go to university), have never done anything to show that they're capable of long term commitment, and have never done anything significantly intellectually challenging.

CS & SE courses at top universities which get to have the pick of the best students still have a first year drop-out rate of 10-20%.

Any company offering software apprenticeships can expect to suffer huge drop-out rates with minimal upside.


How is that different from any other job application? The job will clearly go to the boss's nephew, who is totally great at computer stuff because he plays so much WoW. (Though it is a well known fact that the best middle management comes from Eve Online).

Seriously, you will generally want to look for people who are intelligent and good at abstract thinking. In my biased opinion, you should be shooting for the best humanities majors (in disciplines like English, music, philosophy, and math) as the targets of such an apprenticeship. Not high-school graduates.

Until it catches on, your competition for employing them will be Starbucks and grad school which means they will be a steal compared to programmers trained at Stanford or MIT (TM). A liberal arts major would be pleased to be making 30-40k a year out of school. Also, you can always fire them if they don't work out.

If it then catches on, you might be able to expand to people who are interested in making programming a career, then you might be able to catch some of the

Currently, CS courses at top universities have very little to do with the business of actually programming, they are more based on mathematical theory. SE courses tend to look more like 'software engineering management.'

In each discipline, you learn important skills... but they tend to have very little to do with the craft of sitting down at a computer and making your thoughts into working code. (I have a Master of Science in computer science; I learned many things that made me a better coder, but the degree didn't teach me to code - work experience did.)


> Currently, CS courses at top universities have very little to do with the business of actually programming, they are more based on mathematical theory. SE courses tend to look more like 'software engineering management.'

Where does this meme come from? If you get a CS degree from CMU, you are going to program your ass off. You are going to write at least one of a compiler, an HTTP proxy, or an OS kernel from scratch (and if you choose to, you will write all three) as well as countless smaller applications (your own heap allocator, your own shell, etc). I didn't go to CMU and those are the projects I know about.

At UMD, the curriculum will have you spending most of your time, again, programming your ass off. I don't think there's a single class where you'll spend more time on "mathematical theory" than on programming.


I went to CMU and can confirm this. I coded a shell, filesystem and kernel just in that one OS class. To anyone who thinks this doesn't prepare you for something just because the focus was on the OS abstractions, algorithms and concurrency instead of "good coding design", you are mistaken.

Universities teach math-like computer science instead of OOP/Code style/Agile/etc because it is much harder. You can learn OOP and the rest on the job. College is meant to broaden you as a person, to teach you to think -- not teach you to do a specific skill set. Not all of them can do it, and not all students want to learn this way, but the dream is still pretty great.

Interestingly this is the opposite of what apprenticeships are for. They teach you a skill set from master to student. Perhaps these two systems have less in common than we originally thought. Should the comparison be between apprenticeships and vocational schools?


There are, of course, exceptions, but programming in industry is typically a design job. Your focus is on designing an infrastructure for the application that will allow it to be functional and maintainable, and to survive massive refocusing of scope without detriment to the prior two points. Things like compilers, HTTP proxies, and OS kernels are plugged into the infrastructure, not written as part of it.

Your concern, as an industry programmer, is not whether merge sort or bubble sort is the right algorithm for your problem; you will just hand that off to the built-in sort function. Your concern is whether or not MVC is the right design pattern for your application. Again, there are exceptions, but that is fairly typical of the average programming job.

So while you might pick up some design skills during your study in a CS degree, it is not the focus of the program. That is where the meme comes from.


I'm not sure if that's a legit complaint. Can stuff like that be taught adequately in a course or is it something you pick up by "just doing" it?

A lot of classes where you program your ass off force you to make design decisions, some explicitly as part of the assignment and some implicitly. Isn't that valuable? What would a "design based" curriculum look like?


I don't necessarily disagree with you, just trying to explain where it came from. Software engineering is the "design based" curriculum.


When you're hiring someone who's completed an undergraduate degree in CS, you have lots more information.

At the CV level:

1) A university has filtered out bad candidates for you; which university/degree you got is a good quality filter

2) An undergraduate thesis which has been chosen by the individual, both the topic and the complexity are good indicators of the candidate (did they build a website or did they implement a new GC algorithm)

3) Internships (did they spend the summer working at McDs or Google)

4) Open source / spare time projects

At the interview level:

1) An ability to write code - you can directly test whether they're capable of taking a problem and turning it into a solution (aka fizzbuzz; see the sketch after this list). This is a very distinct skill and no one has come up with a good way of predicting ability at it despite numerous attempts.

2) An understanding of abstract data structures - again, it's very hard to tell whether someone understands concepts like call stacks and hashtables without testing those things directly.
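
For anyone who hasn't seen the fizzbuzz screen mentioned in point 1, it's roughly the following (a minimal Java sketch; the exact wording of the exercise varies):

    public class FizzBuzz {
        public static void main(String[] args) {
            for (int i = 1; i <= 100; i++) {
                if (i % 15 == 0)      System.out.println("FizzBuzz"); // divisible by both 3 and 5
                else if (i % 3 == 0)  System.out.println("Fizz");
                else if (i % 5 == 0)  System.out.println("Buzz");
                else                  System.out.println(i);
            }
        }
    }

The point of the exercise isn't its difficulty; it's that a surprising fraction of applicants can't turn even this small a problem statement into working code.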

That's a hell of a lot more information than you'll have with 18-year-olds or even with humanities majors (incidentally, I assume your classifying math as a humanity was in error).

Of those firms I know of that hire non-CS grads to do programming, most require some experience with programming (which tends to cover most maths, physics, and engineering students). Very, very few companies try to hire completely inexperienced programmers.


Also, you can always fire them if they don't work out.

I'm guessing you're in a right-to-work state; in the UK and EU, firing someone for anything other than gross negligence is very, very hard. Hiring someone who "just doesn't work out" is a huge risk for an organization that can't afford to shuffle them off into a "VP of Paperclips" position.


"in the UK and EU firing someone for anything other than gross negligence is very, very hard"

It certainly isn't difficult to fire people in the UK providing they haven't been working for you for too long (it's either six months or a year). Other countries in the EU can be a nightmare though.


Typically it's laid off or terminated without cause, in which case the person gets unemployment and the business takes a hit on their unemployment insurance, but it's typically no hassle as long as it wasn't for a reason that's explicitly against the law, like discrimination.

"Terminated with cause" avoids the hit in unemployment insurance to the business, the person doesn't get to collect, but they can sue you for wrongful term if it really should have been the first category -- here's where you need to document gross negligence.


Do they not have contract-to-hire in the EU?


Degrees are usually a signal about how well the person is expected to perform. A degree from a top school with a decent GPA means that not only was this person selected from among many to attend, but they also managed to go through it okay. That is why employers ask for a degree in anything for certain jobs. They may not care about art history, but a degree is a good signal that "hey, this person is smart."

For CS it's a little more practical but a degree is still fundamentally a signal. As is an active github account, blog, etc...


We have to stop pretending degrees are worthless. The way I see it, going to school expedites your learning process by exposing you to professors and your peers; it helps you learn best practices which, although generic, save you a lot of time by not making the same mistakes others (your professors and your peers) have made. Also, it gives you the focus and urgency to finish your learning on time.

As an analogy, consider the knowledge accumulated while attending school as open source software: even if your knowledge, or the OSS, is too generic to be fully usable for the task at hand, it almost always gives you a big head start in getting your job done, because it has avoided the trivial and non-trivial pitfalls through years of maturity.


It is true that degrees are not inherently worthless. They can indicate a lot of prowess in an individual. The problem is that the process of attaining a degree has become too standardized and universal. Now, it is easier for someone to game the system, thereby acquiring a degree without any of the intended benefits of the process.

The meritocracy and strict requirements of more individualized approaches in open source is a good alternative (see my other replies).

Bearing these concepts in mind, I agree that it is important to maintain that a degree is not a black mark on a CV. It remains just as important to remember that a degree is also not the shining star on a CV that it once was.


You're conflating the degree and the relevant knowledge gained in pursuit of the degree. They are not one and the same, and I believe that is the point most people are making. At least, that's what I believe.

The skepticism directed at college education is a result of cost/benefit analysis. I would gladly take a 2-year associate's program that yielded a degree in computer science, much like nurses have a 2-year nursing degree. The four-year degree I'm in pursuit of now (in my 30s)? I'll likely drop out after getting the discrete math, algorithms and other math-intensive courses I likely would not study on my own.


"I'll likely drop out after getting the discrete math, algorithms and other math-intensive courses I likely would not study on my own."

I think that statement right there is why degrees are still valuable. To earn your degree you will have to study subjects that an independent learner would not likely study on their own.

So far in my career, the amount of math I was required to take to earn my CS degree has been very valuable. I have to use it a lot. (Graphics, GIS).


No, that statement is an example of why the information is valuable, not the degree. There are still a whole lot of stupid hoops people have to jump through to get a degree.


Agree absolutely.

I'd long ago decided that even with the £3,000-per-year fees (which Labour said they wouldn't introduce in their election manifesto, got an overall majority, then did anyway - remember that, Aaron Porter and others slamming the LibDems at the moment), the investment in a degree before a career in software just didn't seem to add up. Realistically it's a £25k debt against a 3-year delay in starting a career - you may well earn less for the first few years, but by the time you've paid off that £25k, is the degree really going to be a differentiating factor?

With now £27k just on tuition, plus living expenses for three years, what's the point? Honestly, I learnt more that I use professionally at A level than in my degree, let alone what I've learnt professionally. Sad to say this but I would actively recommend against an 18 year old with an interest in working in IT studying at university, the way things are at the moment.


Agree with this. With £27k debt for tuition alone, plus other debts for living costs, in a field that changes as fast as Computer Science, you'd be paying off this debt long after much of what you'd learned had become obsolete. Far better to go straight into work and study part-time (if employers will take you without a degree, that is, which they might if they realise that 18-yr-olds can be smart and cheap). It'll be interesting to see what 18-yr-olds do in response to these fees. Sadly I fear many may not be clued up at that age, and will study Computer Science and then regret it later.


Computer Science doesn't change quickly at all. Slightly faster than maths does, perhaps. Only the fashionable language changes quickly.

I think that many people these days don't understand the difference between CS and "making websites with RoR".


I agree with that. But a typical course contains bits that don't change, e.g. algorithm theory, and bits that do, e.g. programming skills learned from implementing algorithms, which back in the day would have been in Pascal or C, nowadays are in Java or Python, and in future maybe some other language. So taking such a course would have 2 aims - get a good grounding in theory, and get some buzzwords on your resume/CV too. People who don't understand the difference between CS and making RoR websites sadly include many hiring managers, right? ;)


I graduated with a CS degree in 1988 - very little of what I learned is obsolete. Of course it was rather theory and maths based... (with a hell of a lot of development, mostly in C on Unix boxes of various breeds).


I think it says something about the kinds of employers that look at people who have spent upwards of 100k to obtain a piece of paper that implies some arbitrary level of understanding in a given subject. The same level of understanding that I could get on my own for probably a few hundred bucks.

Don't get me wrong, I'm not against higher education in theory. We've gotten to a point, though, where it's not about the education anymore, it's about the diploma. We enforce this idea in our high schools that you can't be successful without one and as such send our kids in droves to college. We've artificially made the demand so high that these institutions can charge whatever they want and the kids are still going to attend and put themselves (or their parents) further and further in debt.


> says something about the kinds of employers that look at people who have spent upwards of 100k to obtain a piece of paper

great, you're an employer. for fun, let's say you're google, except you just ignore the 'education' line on resumes

> The same level of understanding that I could get on my own for probably a few hundred bucks

oops, now, you have 10k resumes of people claiming they have knowledge of computer science. but you only need to hire 10 people. what do you do?


Surely I could filter those resumes just like many employers currently do anyway. First of all, we'd likely be talking about some type of entry-level position, as education plays less and less of a role when looking for experienced engineers. So I start with some type of test before I'll even look at your resume. In fact, if I'm an employer as popular as Google, I may have more than one level of tests to get through, based on the job, before I give any resumes a look. It's not hard to test for knowledge in Computer Science and I'm sure companies like Google already do to some extent anyhow.

Once you've done the initial filtering it's not too terribly hard to spot some potentially good candidates. Especially in the field of software development where it's easier than ever to build products and make them available for others to see and use.

I'm also not saying to ignore the education lines... I'm just suggesting that I don't like education as the main criterion for filtering out candidates.


> Let's face it, the best experts aren't the ones who knows all the answers, but the ones who know where to look for the answers.

In many ways I feel like this is exactly what a degree in Computer Science or a related field gives you. Becoming an educated member of society isn't about learning the "correct" answers it is about learning to ask the right questions. It is about learning some questions outlive their answers. Universities enable students to glimpse the horizon of human understanding. A glimpse of the infinite unknown.

I understand people feel burned by: their experiences at university, hiring practices of corporations, poorly performing well credentialed hires, and the cost of education in general. However, let's not toss the baby out with the bath water here.

@brudgers said it best:

> Modern higher education has grown because it offers such a powerful solution to many of the problems created by apprenticeship, particularly lack of equal opportunity, exploitation of apprentices, diversion of resources to training and away from profit-making activities, and long-term commitments to particular individuals who may not be suited for the profession.[1]

[1] http://news.ycombinator.com/item?id=2899059


I could not agree more, and although this applies to a broad range of industries it is extremely pervasive in software development (Related, I have heard a few computer science folks highlight that computer science as a major is not targeted at creating software developers - http://news.ycombinator.com/item?id=1884255). True enough.

I stand by my sentiment on that post, that open source projects are a great 'apprenticeship' opportunity for those interested in computer science-like fields (software/web development and the like). That said, I have participated and watched at http://opensource.com and http://teachingopensource.com and come to realize just how difficult it is to get open source into the educational system.

Knowing that difficulty, I might categorize experiences in the following best to worst order for new hires:

1. Active open source contributor
2. Active open source contributor w/ non-CS degree
3. Active open source contributor w/ CS degree

Two additional notes:

1. These are not meant to be absolutes, there are certainly individuals who fall into the above-mentioned category 3 that far surpass a category 1 candidate in a particular skill. I am merely suggesting that at a high level, the likely skill-set available to a category 1 candidate is often more desirable than the likely skill-set of a category 2 or 3 candidate. A lot more could be said here, but it is not the point of this post.
2. While this most obviously applies to software development, it also has a natural home among technical document authors, marketing, customer relations, QA, and many other aspects of business that exist and flourish in open source communities.


I've run into this exact issue.

However, when you've got managing parties who have degrees, they seek out justification for their expenditure through self-seeking activities, consciously or subconsciously.

Additionally when grad students are willing to work, and have been trained in the shadow of those who are managing, all the gears in the system work as expected.

The issue comes into play when you've got kids like myself. We are ITCHING to learn, yet fail to see the benefit in 4 years of formal schooling. And most of all:

We know the fastest way to the bleeding edge of research is NOT through the "tried-and-true" channels. It is most efficiently attained through jumping right into the fray and working with said researchers.

THIS is what will create that innovation that we seek. The degree system simply is the easy answer and a way to continue with the entrenched system, by providing slave labor.


I'm skeptical that this is going to happen on its own any time soon. While the legal frameworks are in place to prevent discrimination on sex/race/age/etc., I think we should put similar mechanisms in place for formal education. Especially when so many degrees are worthless as a measure of skill, so they've become irrelevant to the job at hand just like a person's race.

Make it so that employers can't ask for education, just like they can't ask for age, nor make a degree a job requirement. Of course when an applicant comes in for an interview, their race and relative age quickly becomes apparent, so it's not really a matter of information hiding as removing a more-and-more irrelevant filter. There's also nothing stopping an applicant from explicitly exposing their age/education/etc. on a resume or during the interview, and I'd still want to mention an MIT education if I had one. At the very least you would want to talk about school projects since you may not have any other experience, but that's up to the applicant. The question is "What things have you made? How did you do it?", not "Did you take a data structures course at an accredited university?" I'm not even sure it would create that much extra burden on HR departments since I hear they're already swamped with applicants matching degree requirements.

On the other hand, a free market approach may be to just leave it alone and let the tech companies that require CS degrees, or black people only, suffer to the companies that care about skill alone. I'm pretty okay with that too as a practical outcome. The question there becomes really philosophical and whether you want a big government to slim down in an inconsequential way or continue its historical path of trying to enforce certain moral directives on supposedly less enlightened people.

Downvoter(s): would appreciate a discussion on which idea(s) is/are most offensive to you. There's the additional filter of "this person made it through a 4 year program and may therefore be determined/have long-term goals/etc.", but really I don't find that a very compelling or useful filter for many jobs.


The problem with this is that universities get away with screening processes that would be problematic at a company. For example, screening applicants based on SAT scores is a legal gray area for a lot of companies, but the college you went to is a proxy for your SAT score.

I'm convinced that more than half of the value of hiring someone from a top-tier college is who they admit, rather than what they teach.

Frightening true story: Someone I know at a government contractor startup was hiring a Fortran programmer. As part of the interview he was giving a simple Fortran written test. The company lawyers found out and had him stop. Apparently tests that haven't been vetted for cultural/racial biases are a potential source of liability for government contractors.


So very true. At my company (a very large defense contractor) we have to ask the same questions of every applicant. We can't ask follow-up questions or deviate from the pre-defined question list. It was deemed legally unfair to ask different applicants different questions. That makes it harder and harder to distinguish the good from the bad.


I suspect there are countless CS graduates who can describe Binary Search theoretically but couldn't hand-roll a binary search implementation to save their lives

If this is the case, and I doubt that it is very often, then your CS program has failed you miserably.


I believe Knuth says in his section on binary search in The Art of Computer Programming that it took some ridiculous number of years from when binary search was first described to an implementation free of bugs.

I find it highly likely that the majority of programmers would produce a buggy version of binary search on their first attempt. Why? History indicates programmers often make small mistakes even when writing simple algorithms. A survey of 26 papers on variations of binary search found that 6 of the papers had serious errors in the published algorithms.[1] 4 of the errors were "design" errors; that is, the algorithm they designed had a fault. 2 were implementation errors (1 in assembly, 1 in COBOL). All of the errors were published in a peer-reviewed publication. Therefore, even peer review does not always spot errors in a "simple" binary search algorithm. Why would you expect recent graduates to do any better?

[1] http://comjnl.oxfordjournals.org/content/26/2/154.abstract


That's fascinating that fewer were implementation errors than design errors, though "the distribution of errors among families of algorithms is not uniform" accounts for that, I guess. Though the paper is from 1983...

I scanned through the paper and found no explicit mention of the typical coding error caused by using M := (L+H)/2 instead of M := L + ((H - L) / 2) (though the paper interchanges both). So I suspect a re-analysis would find more coding errors than design errors in languages without arbitrary-sized integer auto-conversions. My reasoning for that conclusion is based on: http://googleresearch.blogspot.com/2006/06/extra-extra-read-...
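
To make the midpoint issue concrete, here's a minimal Java sketch (my own illustration, not taken from the paper or the blog post) of a binary search using the overflow-safe midpoint: in fixed-width integer languages, L + H can overflow for very large arrays, while L + (H - L) / 2 cannot.

    class BinarySearchSketch {
        // Returns the index of key in the sorted array, or -1 if it is absent.
        static int binarySearch(int[] sorted, int key) {
            int lo = 0;
            int hi = sorted.length - 1;
            while (lo <= hi) {
                // int mid = (lo + hi) / 2;   // classic bug: lo + hi can overflow int
                int mid = lo + (hi - lo) / 2; // overflow-safe midpoint
                if (sorted[mid] < key)      lo = mid + 1; // key, if present, is to the right
                else if (sorted[mid] > key) hi = mid - 1; // key, if present, is to the left
                else                        return mid;   // found it
            }
            return -1; // not found
        }

        public static void main(String[] args) {
            int[] xs = {2, 3, 5, 7, 11, 13};
            System.out.println(binarySearch(xs, 7));  // prints 3
            System.out.println(binarySearch(xs, 4));  // prints -1
        }
    }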

Of course, given the stories of all the CS-degree-wielding applicants who flat-out can't do FizzBuzz, and the amusing incorrect designs/implementations of commenters who scoff at the notion of not being able to do FizzBuzz and try to prove they can but fail, I was already inclined to believe there's a lot of incompetence to go around. It's not even necessarily a problem with the CS programs, but there is the danger of learning to recite an algorithm's steps word for word without knowing what it means or what it's useful for, and then failing with a live use case.


A "Fachinformatiker" (for some reason translated as specialist by google translate) you can become through an apprenticeship in Germany for some time (there where predecessors to it with different names)

see http://translate.google.de/translate?hl=de&sl=de&tl=...

or in German http://de.wikipedia.org/wiki/Fachinformatiker

so it is nothing new at all.


As a fellow German, but not a Fachinformatiker, I suspect colleagues without a degree still have a hard time working their way up into more advanced management roles or just into the higher income brackets. I wish that were not the case though.

What I find intriguing: although I'm surrounded by "programmers", there's almost no CS! We have physicists, mechanical engineers, mathematicians, historians, and even English majors.


Most (or at least a lot of) startups I've seen look at portfolio and experience more than whether you have a degree, anyway. I'd rather hire an 18-year-old who's got lots of great stuff on Github, internships, and other experience.

Than someone who's just finished their Computer Science degree from a random uni, with no experience. Almost all of those I know going to university are just planning to do their course and apply for jobs afterwards, expecting their degree to mean an instant job.


I have found firsthand that the lack of a college degree has increasingly become a scarlet letter in our society. It's not quite as bad for coders as it is in the rest of tech (the fact we have open source projects to prove we know what we're doing helps a lot), but it's still disproportionately difficult to get a job that pays a living wage unless you have the degree to "prove" you know what you're doing.


"Discarding the bachelor’s degree as a job qualification would not be difficult. The solution is to substitute certification tests, which would provide evidence that the applicant has acquired the skills the employer needs."

http://www.nytimes.com/2008/12/28/opinion/28murray.html


I'm a self-taught programmer from Germany and since leaving school I have been working as a freelancer.

Some time ago I decided that freelancing is a little too stressful for me (you know, customer acquisition, taxes, business overhead), so I talked to some friends of mine who work as "normal" office developers. They all said something like "yeah, any company should be happy to have you working for them" and so I started to send out applications.

Guess what: The HR guys won't even look at my CV without any sort of official paper. Be it a CS diploma or a finished apprenticeship (yes, in Germany we have programmer apprenticeships) - without that not even an interview.

So what I'm thinking about now is doing the 3-year apprenticeship. Only to get a paper that says "yeah, that guy knows how to spell Java and knows what ISO norm 23542 is" (the programmer apprenticeship isn't much more than that in Germany). Oh happy times - I will spend the next 3 years "learning" nonsense I already know or don't want to know. And my classmates will be ~17 years old.


'Education', K-12, college, and later, is pushed and shoved by several large influences.

As in this thread, two of the influences are (1) employers want some 'criteria', maybe even 'credentials', that will simplify selecting promising employees and (2) students seeking jobs want to keep down the costs in time, money, and effort to meet such 'criteria' or to get such 'credentials'.

Broadly there should be some 'market forces' that provide answers: (1) If the education employers want is expensive, then employers will have to pay employees enough to pay for the education; (2) if the education costs more than it is worth to the employers, then students and employers will make do with less such education; (3) if students don't come for the education, then some educational institutions will have to make some changes to offer the students more value for the costs.

Such 'market force' influences are easy to see, but there are some other large influences less easy to see:

In the US, the 900 pound gorilla is the interest of Congress and the US DoD in technology for US national security.

Such was not always the case: Indeed, during the rapid rise of engineering in the decades before WWII, schools of engineering actually concentrated on teaching engineering for students seeking careers in engineering! Amazing! Radical! Astounding!

Then we had WWII and radar, sonar, and the bomb, right, the atomic bomb, and Ike and other influentials concluded that "Never again will US academics be permitted to operate independently of the US military" or some such. Congress went along, that is, pulled out the US national checkbook, and started signing.

Then the top three dozen US research universities got an offer they couldn't refuse: Take the US Federal money for research in math, physical science, and engineering, with a tilt toward what might be useful for US national security, or cease to be a leading research university. They took the money!

Now, for such a university, for the money, the most important activity on campus is research, research, and research as in "How do I get a grant?", especially, now, from the NSF, DoE (that is, energy, not education!), or DARPA. Since then Congress has also provided money to the NIH and its grants -- you see, there are a lot of old people in Congress eager to see progress on some of the serious diseases of old age and ..., well you can see the connection!

Now for this 'research', there are some examples and, now, a 'model': The most influential example, for both science and US national security, is physics, especially as for the bomb, right, the atomic bomb and now, too, the hydrogen bomb. So, 'research' has a really severe case of 'physics envy', especially theoretical and mathematical physics envy. So, good 'research' is supposed to 'mathematize' a field.

So, at top three dozen US research universities, math, physical science, and engineering pursue research, research, and research with physics envy.

Yes, there was that report, yes, the David Report, that said that some of math was, well, not so applicable so soon and should, then, maybe get less money. That's why now at some of the research universities the math department has trouble keeping the lights on but is still working on the analytic-algebraic topology of the locally Euclidean metrization of infinitely differentiable Riemannian manifolds (extra credit for the source!). Math is still the most respected field, but for some decades the more applicable topics in math -- e.g., probability, stochastic processes, optimization, statistics, control theory, signal processing, numerical analysis, computational fluid dynamics, computational complexity, math for finance, math for theoretical physics -- have been done outside the 'pure' math departments.

Now it turns out that somehow some huge fraction of good students are really eager to get their bachelor's degrees from such research universities. So, the universities can be very 'selective' so that employers can get some of the 'criteria' they want just by a student being admitted to such a university!

But the students are being taught mostly just by professors interested mostly just in narrow, leading edge research to get grants. So, the education is not really about, say, engineering for engineering students who want jobs in engineering! Instead, in, say, computer science, the education might be closer to background for research in 'the fundamentals of computing', not that we really know what they are but some people would like to!

How to work through the security model of SQL Server for SQL Server installation, administration, and management? Professors can't get grants for mud wrestling with the messed up SQL Server security model or its just awful documentation and terrible problems with installation, management, and administration and, thus, mostly DON'T! So if someone actually wants to work with an actual, real installation of SQL Server for a real, important, practical database application, in the real world, outside of academics, as part of a career that can pay enough to buy a house and support a family, then the professors in a research university are not a very direct source of such information!

Still good students like to go to research universities.

Then there is the money, that is, what the universities charge. So, tuition has gone up, way up, over the last few decades, up faster than even health care! Why? Well, there's a dirty little secret! What has gone up is the published, 'list price'! But there are also 'discounts', especially for good students, called 'scholarships'! So, if the student's father is wealthy and the student only so-so as a student, then go ahead and charge list price! While this student pays list price, they can also do well in their studies in beer and bed! But for a really good student, especially with poor parents, there are scholarships.

In the US, there are also many colleges and universities that don't try to be top research universities and concentrate on teaching.

And there are many community colleges where 'job training' is the goal and not a dirty word and where there are courses in auto repair, auto body repair, framing carpentry, finish carpentry, masonry, cosmetology, plumbing, electricity, HVAC, and, yes, computer programming and network management. Tuition is low; the courses are fully intended to be practical; the teachers are not researchers and are often practitioners.

For computer science, there is a secret: The more advanced parts of computer science have been heavily 'mathematized'. So, net, the best background for such material actually is not even in the computer science department but in the math department, especially a course in, say, abstract algebra.

Next, for a career, e.g., in computing, there is now a big, dark, ugly secret: 'Jobs' are no longer such a good idea! E.g., the Stanford AI course got interest from 50,000 students in 175 countries! So, generally, if you take some material in college and then look for a job, here in the US both you and your employer will be competing with thousands of eager people in 175 foreign countries!

Meanwhile here in the US the people buying houses and getting their children through college will, may I have the envelope, please? Yes, here it is: Often they will own their own Main Street business where they have a geographical barrier to entry and, thus, no competition more than, say, 100 miles away. In particular they will have no competition from anyone in any of those 175 foreign countries.

So, what future for computing in the US? Broadly, gotta do something new, out there, on the leading edge in at least some respect, something entrepreneurial, something you can't get hired for because the guy hiring doesn't understand that new thing yet. So, have to be an entrepreneur. For that, something in advanced computing might help. Then, just getting 'skills' with, say, Linux, C++, Java, Python, MySQL, SQL Server, Flash, HTML5, etc. is 'routine', maybe at some point necessary but not sufficient.

Just what should such new stuff be and just how to get help from a research university? No one really knows! Welcome to the challenge of the future!



