What skills do self-taught programmers commonly lack? (quora.com)
181 points by thenicepostr on April 5, 2011 | 144 comments



It's interesting that this question implied the negative -- not what's the difference between, not even what are the strengths and weaknesses of, but:

> What pieces of the whole are missing?

The implication being that without a formal CS education, it is an impossibility to obtain a whole. As if whole matters; how does one obtain whole, anyway?

This is a very loaded question, and the perspective betrays a serious bias, in my opinion. Making a generalization out of humanity's varied life experiences is futile in itself, but alas, the application of said generalization here is far more interesting to me.


I'm inclined to disagree. The converse -- what skills does a programmer who has never left academia lack? -- strikes me as perfectly valid. And surely there must be some benefit to devoting four (or more) years of your productive life to the study of programming?

But I say this as a self-taught programmer, so perhaps it's just that I'm envious of those who've had the benefit of a formal CS education.


Speaking as someone who was self-taught, spent four years getting a formal CS education, and then has 16 years of post-college professional experience, I don't think you missed much. I think the only really practical thing I learned in school was big-O notation, and you can probably pick up the essentials of that in about one day of moderate studying. I can't recall ever learning a data structure more complicated than an AVL tree in school, and I think that was in an extra directed-study class. (If you've not heard of an AVL tree, that's not because they're terribly tricky -- it's because they're now nearly obsolete.)

Mind you, a lot of the stuff I learned studying CS at college was interesting. It just hasn't been particularly relevant to my career. My math classes (I had a double major in CS and math) have actually been much more professionally useful.


I agree with you about the math classes. As I mentioned elsewhere in the thread, I ended up doing a minor in CS (which comes out to about 6 courses). I focused these courses on the more theoretical/mathy options, and that has definitely served me well.


I feel the same. Inferiority complexes for the self-taught?


I, for one, feel incredibly inadequate for lacking the stuff that you'd get on a CS course which I think actually matters - stuff like algos, operating systems architecture, etc.

I think lack of confidence is a huge factor and it's one that's plagued me for a long time, though I don't want to project my issues with it onto all self-taught programmers.

<oversharing>

I had a somewhat unique situation when it came to choosing my degree course - having been brought up in a new-age spiritual belief system, I was pressured into asking a woman in Indonesia what I ought to pursue (yeah seriously :-S), and ended up doing civil engineering despite having no interest in it. I had grown up coding from age 9, and that along with maths were my passions.

I had the added pleasure of having a vast family crisis occur during the degree which caused me to fail a year and end up with, though a good grade (a british 2:1), nowhere near the performance I could have achieved had the shit not been going on.

So I've got a degree in the wrong subject, and have had to work from scratch into some sort of development career, and now I'm in an internal job working on CRUD apps.

</oversharing>

The really frustrating thing is that many (though thankfully not all) potentially good employers seem convinced that you must have a computer science degree or otherwise you're not worth considering (and yeah, I'm looking at you, Google and Microsoft). I know I can learn the things I'm missing and contribute to OSS, etc., to become a viable coder at a good place, but if you are basically turned away at the door because you lack the right degree, it just doesn't matter.

Obviously for me personally, my degree was plagued with problems which weren't my fault, another way in which I can potentially be deselected from the hiring pool.

To end on a positive note - I am working on fixing these things by filling in the gaps and contributing to OSS - a good solution for people who feel similarly inadequate I feel.


And look at startups. Many startups are a lot more flexible in this regard.


Look to someone like Aaron Patterson. He studied CS at university but he did not really understand programming. Out of school, he took a job where he wrote bad Perl.

After many years of experience and a lot of dedication—he expended much effort reading CS books on different topics—he came to be a solid practitioner and now is a leader in the Ruby community.


It's called "imposter syndrome", and you are not alone. It's a powerful motivator, but don't let it make you suffer.


Yeah, it's comforting to have a label for it. I had a bit of a catharsis with imposter syndrome in this blog post I wrote for my esteemed employer:

http://blog.hipmunk.com/hipmunk-for-iphone-post-mortem


[deleted]


I didn't know you were self-taught — I wasn't projecting, but literally confessing my agreement toward the bias you described. As to the parent, envy is typically felt from a position of disadvantage or inferiority.


I missed the connection between your reply and the second paragraph of your parent on first read. I responded to your combination of comments with that oversight in mind -- connecting "inferiority" to "inclined to disagree" to me -- so I owe you an apology. Forgive me.


No worries, thanks for talking it out with me.


None here. I'm pretty proud of how much I suck.


The ability to get shit done.


I don't think you can come up with a stereotype for that.

I've seen just as many people with real-world experience who have trouble "getting shit done" -- because getting shit done is hard for every human being on this planet.

That said, many people coming from academia do suffer from disappointment with the state of the industry, where boring projects are the ones that generate immediate revenue.


And the ability to see the bigger picture (I know it sounds enterprisey, but that's how I see it). I can still remember our 1st year Physics professor telling us, a class of nerdy CS kids, how familiarity and a deeper understanding of the field you're writing your programs for is way more important than pointers, data structures etc.


They lack the sense of urgency brought about by the working world.

Also known as making deadlines.


As a self-taught programmer myself, I confess this is exactly how I would frame the question. I'm constantly worried there's something crucial to which I'm oblivious.


So then the logical answer is that self taught programmers lack the confidence to realize that they are just as good as formally educated programmers. :)


And I wonder if they're often better. There is value in having the motivation to learn something before being formally trained in it. I was not too impressed with some of my peers in CS, especially those who could not figure out how to open a file in C in their senior year.

I was self-taught, then formally trained, and since then self-taught. I would have to say that both have value and that you are missing something without the formal education. "Confidence" is a good reason, but exposure is important, too. I am sometimes surprised when the college math pops up in my work, spanning anywhere from programming drivers for PCI cards to web interfaces. Work with a large enough dataset, and maybe you will need a different sort from the language-included "quicksort" algorithms.

I thought I could program my way out of a paper sack before getting the CS degree. I ended up learning a few things, some trivial, some useful and connecting. You could probably pick it all up in a couple decades, but four years is a nice jumpstart.

At the same time, the learning of approaches and techniques does not end there. For example, I sometimes wish my university had made mention of metaobject protocols (MOP) instead of presenting OOP only in the C++, Java, and Ada way.

I think there is a difference between people who can accomplish tasks and people who can accomplish those tasks in a way that they can be easily extended and maintained in the future; in one respect, we can call that experience, and in another, we can call that an exposure to enough approaches/patterns that it is easier to select the best for the job. Any jumpstart is a huge help and can aid in removing the bad habits learned from solo experimentation.


The dangerous programmers are those who don't know what they don't know. You have moved on. Once you know that there are things that you don't know you can always pick up some key texts and fill in the gaps. MIT have some great CS videos online, you might want to check them out.


I'm in the same boat and also worry about it to a certain extent. However, 7 years of getting paid to write software has mostly given me more confidence. Also, I know a lot of really smart people (a lot smarter than I) who studied Computer Science and couldn't program their way out of a paper bag.


When I found out many "programmers" can't solve fizz buzz, I felt a lot better.

Have you read anything that really knocked you on your ass in terms of practical usefulness?


I'm not 100% sure that fizz buzz is really the indicator that people make it out to be ... I mean, it doesn't really test software design skills, or the ability to design a complex system with loosely coupled objects.


Fizz buzz is very simple. It's easy to communicate, implement and test, there's nothing fiddly about it. It's the entry level indicator and requires no prior knowledge, other than the ability to turn a simple, stable specification into software.

If you can't do fizz buzz in your chosen language, then it's highly unlikely you can do anything more complicated.

It doesn't test your ability to do any kind of high-level design. It just tests whether you can program at all.

The ability to do high-level design is only important in one of these two situations:

1) The position you are hiring for only involves doing high-level design and not doing any programming.

or

2) The position does involve programming, and the person you are hiring is a competent enough programmer to do fizz buzz.


The capability to complete fizz buzz is necessary but not sufficient.


Computer science and programming are such vast fields I think everyone must be oblivious to things that others find crucial.


So do something about it. Between the Internet and possibly some nearby university (where you could sneak in and find out exactly what books all their classes require of the CS students), you'll have access to 99% of the same relevant things you'll need to "come up to par" with the so-called college trained programmer. Wikipedia beckons. Amazon (both as bookstore and cloud hosting provider) beckons. Your terminal beckons. Etc.


As a mostly self taught programmer, I disagree. I think this is an extremely relevant question. There are fields of CS that are generally not encountered when learning programming, in spite of being relevant and often useful. Formal verification and proofs, the lambda calculus, language theory, and other "hard CS" subjects which have real world applications (well, in some fields, anyways) are also relatively rare to run across when self-teaching.

That said, the list in the article is not an especially good one. It might as well have said "The Basics", as far as I'm reading it.


There are many good books and even videos of courses for those things you mention.

You can also sit in on courses of interest if you've got a good university nearby; I haven't seen teachers object to outsiders participating in their course.

It is true that effort is required without immediate gratification (so it is hard to learn such subjects on your own), but it is feasible, because if the "hard CS" fields you mention don't pique your interest, I doubt they would've had an impact on you if you had attended a CS college. Just go and learn, keeping in mind that it may take a while and you need to keep doing it.


There are indeed. However, without courses to direct you to them, many self-taught programmers don't realize that they should be reading them, or watching the lectures.

The problem isn't one of ability to learn. It's just that it's hard to figure out what you should be looking at.

I am a partially self taught programmer. I did take several courses in university to get a minor in computer science. These clued me in to the holes in my education. Otherwise, I might have figured it out eventually, but it would have been harder.


I'm also saddened to see such a misguided stereotype on the front page here. It's damaging in the same way as the old question, "When did you stop beating your wife?"

To my mind, this question, and the answers given, would only begin to make sense if it were phrased "What skills do people who are not interested in Computer Science commonly lack?"

And pointers? Really? That would make sense if the question was "What skills do people who have never used a language with pointers commonly lack?"


In fairness, once pointers clicked for me, a heaping pile of computing made sense. I banged my head on x86 assembly for a few weeks when I was 14, went back to C and got pointers, then reapproached assembly later and was delighted at how easily it fell into place.

It's my personal belief that a firm grasp of C and any processor's assembly -- protected-mode Intel is a can of worms; there are easier architectures like ARM -- is crucial to success as a programmer. Crucial. Enough so that I weigh a CS program based on its application of it, and so far I've been mostly disappointed. Hence I remain an autodidact.


I wasn't disagreeing with the importance of pointers at all. And I agree strongly that a firm grasp of C and assembly is extremely valuable. My disbelief was with the notion that self-taught programmers wouldn't understand pointers. I was saying that the only way you could get away without knowing pointers would be if you'd avoided programming in asm/C/C++/etc.


Which, let's face it, is pretty common nowadays with all the scripting languages.

In all honesty, for a while I felt like I was lacking knowledge compared to my fellow coworkers, but after a while in the job there's one truth I've come to believe: you are either a natural programmer or not, and that's where the difference will be.

I never took real CS courses; I have a degree in Software Engineering, but it was basically just how to write administration software in .NET, design databases and write proper SQL. I never learned C, ASM, algorithms, data structures or stuff like that at school. But I still have a better understanding of those things than most CS graduates I know...

School will not make you a good programmer, if you are good it will help, but you could have probably achieved the same level of knowledge and skill without it.


Hmm. I did a quick poll of the programmers I know personally & would consider successful (for a reasonable value of success). One has a firm grasp of assembly (which makes sense because he uses it a lot), a handful have a firm grasp of C (most claim to have a "little" C but from experience with them they couldn't build anything w/o a book for guidance :P).

So I don't agree with the hypothesis.


Read SICP, do all exercises. No, don't simply read the book, you have to do the exercises.

Read Concrete Mathematics (Graham, Knuth, and Patashnik); do all the exercises you can, though this time that's not a requirement.

Forget about Java related design patterns.

Now you know more about programming and CS than 99% of the population. Problem solved.


You can't even guarantee that people with a CS degree know any of that stuff. In my experience, many of the people I've worked with who had CS degrees had forgotten that stuff long ago, if they ever knew it.

The people answering on Quora are people who are interested in CS or else they wouldn't be on Quora answering the question. That interest is the reason they know that stuff, being taught it in college just made it easier for them.


This is a most frustrating and poignant question.

I am a self-taught programmer. That being said, I'm not lacking in any desire or ability to teach myself. In fact I am quite active about it, and not just in programming. Sitting on my bookshelf is a current set of AoCP and its companion, Concrete Mathematics. I keep journals of my thoughts, progress, and notes about exercises. I converse with knowledgeable people on Usenet and IRC. I build things to know how they work.

On the other hand, I can understand why such a question is important. Someone new to programming might simply not know what questions to ask or where to go once they've finished the tutorials. Or perhaps they just reach the limits of their knowledge and hit a road block.

However, there are lots of people who load this question with a lot of FUD about self-taught programmers. I just want to let them all in on a little secret: hardly anyone cares about your degree. It's an investment in an institution and doesn't grant you any knowledge that you couldn't gain on your own with a little work and self-discipline. That being said, such an investment does have its advantages. Automatic knowledge and the intuition to apply it is not one of them.


Very true! I have met many university trained people who have not kept their skills up to date and couldn't keep up with the dedicated autodidact.


That's an important thing to note. Becoming a master programmer takes more than 4 years. If you don't learn outside of school you are doomed.


I'd say that programming falls into two categories. I don't quite know how to describe them, so here are two examples:

1) Your average database/object driven application. There is user input. This translates pretty directly to output. There may be some interesting algorithms in between, but it's pretty much UI programming.

2) Solving difficult problems. A logistics routing application. The bulk of the code is based on graph theory, reducing the problem perhaps into smaller segments that can be solved in different ways. The UI is just the surface.

There is a need for both types of programmers. The first type are increasingly graphic designers who decided to learn PHP or some similar situation. There are tons of great applications out there that really aren't that complex (except when scaled to extreme proportions - ie Twitter). Wordpress, Gmail, even Microsoft Office isn't terribly complex (at least, the 90% of it that everyone uses).

The second group seems to be a bit more selective. Applications like Dropbox, which need to route data in the most effective way, or Adsense, in which profit needs to be optimized in choosing what ad to display as a page loads. These are complex problems, with complex solutions. That's where O(n) analysis is important. That's where an understanding of graph theory and established algorithms is critical.

While these two groups aren't completely exclusive, I'd say many self taught programmers find themselves in the first group.


I think you've actually got no clue how complex and hard the average database/object driven application can become. Nor do you seem to have any idea what is actually entailed in a large CRUD application.

You claim that GMail isn't complex. Jeezus, a tiny % of programmers are capable of recreating that well. Or Microsoft Office! Go get the source code to OpenOffice and say that again. Think about the problem a bit harder and you'll realise it's a freaking compiler that's far more complex than most programming languages.

Far, far, far, far more complex than Dropbox.

I bet the hard bit of Dropbox was the seamless UI integration with the OSes, not the routing of data. The data routing was probably solved in an afternoon. That was probably the fun bit.

You seem to have no clue about what is actually hard in programming. Hard problems, pffft.

It's as if I went to projecteuler.net and looked at some code and went 'oh, look, implementing algorithms is easy. Any A-level maths student could do that...'

Graphic designers generally aren't capable of programming or maintaining a large CRUD app, and when they try what usually happens is that some professional has to come and gradually rewrite the whole thing. Without breaking anything. That's hard.

If they are capable, then they usually call themselves a programmer, and rightly so. And if they wanted to solve the 'hard' problems, they could. They'd just have to learn about a different area of computing. And I'm mightily jealous of them for having mastered two hard subjects instead of just one.

So next time you want to declare that 90% of programmers could be replaced by people with no programming skills, go look at some source code and then shut up.


Sorry. Those examples are admittedly more complex than I made them out to be. I mentioned scaling as the more difficult problem. There are some master architects in all those projects who deal with hard problems, but a large code base is not the same as a complex problem. Large pieces of software are many small and simple problems held together by a complex glue. The architects deal with the glue, but category one programmers can do most of the other stuff without too much difficulty.


No, there generally aren't master architects. I'm surprised to hear you talking about master architects, in traditional programming circles the term is usually met with scorn. There's a perception that people who call themselves architects or talk about architecture tend to suck at programming. That they like talking about code, but in general are not actually capable of doing it.

Hearing someone say they're a software architect or that a company has a software architect is usually a big red flag to run like hell.

Also a large code base is identical to a complex problem. Check the definition of complexity. A complex problem can be broken down into simple components.

Finally the best glue is not complex, it's simple. Complex glue tends to result in brittle magic, a poor abstraction, like traditional asp.net compared to something far more elegant and smaller like ruby on rails.


Then I stand corrected. Thanks for the insight.


I don't think "average database/object driven applications" constitute the majority of programming, and I strongly suspect that Microsoft Office is not as trivial as you say :-) See Joel's article on the "five worlds of programming": http://www.joelonsoftware.com/articles/FiveWorlds.html Each "world" has intricacies that people from other worlds aren't aware of, and there are likely more worlds than just these five.


Actually, the second category should be differentiated further:

2.1) Solving information problems. What you describe.

2.2) Solving physical/mathematical problems. Think signal processing. Think physical simulation. The bulk of the code is based on some mathematical theorem or linear algebra that is just too complex to be calculated by hand. The UI is just the surface.

The 2.2 group seems a bit more selective. Applications like audio workstations, simulation grids or weather forecast centers need this. Universities use this for research.

While these two groups aren't completely exclusive, I'd say many CS graduates lack the engineering/mathematics/physics prowess needed for the problem spaces encountered in 2.2.


My hypothesis is that a CS education would be most useful after you have already spent 5-10 years in industry. Unfortunately for most people life just isn't structured in a way to allow for full time education later in life.


I've believed for a long time now that most people would be better off spending a few years in "the real world" before they were admitted to a university. Right now, for many if not most Americans at least, college is an unsupervised extension of high school. It's treated that way not just by students but also by their parents and administrators. The requisite maturity just isn't there.

Interestingly, historically, there's usually been a war or famine or other hardship that has forced people in their late teens to grow up quickly. Is our society really better on a macro level for the ease that teenagers have now?

Additionally, schooling after a few years as an apprentice is extremely helpful in that you have a familiarity and context about the field and can instantly filter the busy work from the useful information. One may also be in better financial condition by that point, making loans less necessary and ultimately saving the student a lot of money.

Personally I plan to encourage my children to do something else for a few years before they enter post-secondary education (if they plan to do so at all).


[dead]


One of the things I've learned is that the assumptions I make nearly always come back and bite me. I think that's what's happening to you here.

The assumption you appear to be making is that a self taught programmer is one who tries but fails to gain acceptance to a CS course, so then has to teach themselves.

I'd classify myself as self-taught because I've been writing code since my pre-teen years in the '80s. I also did an engineering degree, which helped give me a much better theoretical grounding.


Education is a method whereby one acquires a higher grade of prejudices.

Dr. Laurence J. Peter


You created an account just to spew that, good work.


I am a self-taught dev and I dropped out of a Comp Sci double-major because I found it frustrating, slow and boring.


This category "self-taught programmers" is an insidious over-generalization. Used in this context, it seems intended to conjure the image of an undisciplined hack who learned how to cobble code together from an "In 21 Days" book.

This obscures the fact that programming and computing have a long and rich tradition of attracting brilliant individuals who, despite taking alternative paths, still manage to go very deep by virtue of their own drives and all-consuming interests.

I've worked closely with many dozens of CS grads in my career. I haven't detected much correspondence between "has degree" and "knows what they are doing." I've seen an overwhelming correspondence between "is truly passionate about programming/computing topics" and "knows what they are doing."

I would have somewhat more sympathy for the dichotomy implied by this question, if all CS grads graduated at the top of their class from a very good CS school.


John Carmack is a self-taught programmer.

There is no programming skill that a school can teach that a motivated individual cannot learn outside of school. That is one of the great joys of working in this most democratic and meritocratic of fields. Its secrets are not locked away in an ivory tower, they will reveal themselves to all who seriously seek them. The only tuition required is hard work and persistence.


I grew up on computers, and am exclusively self-taught, yet I have an in-depth knowledge of how computers actually work from spending time learning how to crack programs. A decent understanding of some fairly complex algorithms from writing RTS, Chess and other games and working on some awesome ML applications professionally. And a good understanding of FP from playing with langs like Haskell and implementing a lisp interpreter.

I have also never learned a single thing from sitting in a lecture hall (though I did do a two year CS diploma, the hardest thing we ever had to do was an Address Book in VB), the most important things I have learned with regard to programming have all come from tutorials on the net and the few books I have bought over the years. In a sense, I have been taught by some of the best people in their fields including K&R, Abelson and Sussman, Denthor, CLRS, _why and too many others to mention.

Since I am self-taught, I can't really give a good answer to the question, but imho our biggest problem is that we don't know what we don't know... And that we usually don't have a solid understanding of the maths behind what we are doing, something I recently realised the importance of and am trying to remedy in myself.

Aside: What is really meant by "self-taught programmer" anyway? I have managed and worked with people who came out of university and couldn't program their way out of a paper bag.


As far as I can tell, "self-taught programmer" is a left-handed compliment of sorts, referring to somebody who actually gives a shit about programming, rather than someone who passively received some sort of education (however long ago) and has since methodically applied it (exactly as it was). Others went to school, but realized graduation wasn't the end of exploration.

Some of us never had a formal programming education, we've just spent years (perhaps decades) chasing things that have interested us and learned plenty of useful skills along the way.


> Aside: What is really meant by "self-taught programmer" anyway? I have managed and worked with people who came out of university and couldn't program their way out of a paper bag.

This one always makes me raise an eyebrow too. My experience at university was that students who didn't teach themselves anything didn't really succeed in class either.


Perhaps a question equally worth asking, is "what skills do programmers who AREN'T self-taught commonly lack?"


You haven't interviewed enough then.

I interviewed dozens upon dozens of candidates who couldn't describe to me the underlying data structure for a hash table, all with degrees from reputable schools. This was at Amazon of all places, where they did a decent job of filtering before they even got to me.

Don't even get me started on post-grad degrees. Those who have only PhDs or Master's degrees in CS have been especially bad in my experience.

Passing a course, even doing well, is a very different thing than actually internalizing the material.


Not in my experience; those who have the patience to pursue a PhD or a master's are those who are really passionate about the subject. Those who move on to them tend to be the hardcore kids who started programming at 13 years old.

They may not be good in a work environment doing some random UI work; it would turn their brains to mash.


> Those that have only PhD's or Masters in CS have been especially bad in my experience.

When all you have is a PhD, everything looks like a nail.

Seconding your view on reality. I've only met one programmer with a masters / doctorate that knew what he was doing. The rest -- close to a dozen of 'em -- spent all their time trying to shoe-horn any project they were working on into their thesis.


>don't know what we don't know

I feel that way about everything I've tried to learn on my own. The process of becoming an expert in a field, I think, is largely learning what you don't know.

So long as there isn't a deficit in the total volume of knowledge, maybe it's a good thing that there be more variety in which subset everyone picks up.


I concur. I'm self-taught and would agree that "we don't know what we don't know". I'm not sure that's exclusive to programmers, though. Give me a tip about something I don't know and I'll research it obsessively. That's just my nature.


The list they've put together is long enough that I would expect nearly everyone to be missing a few.

Compilers and machine learning are probably the two that you don't see much in self-taught people. The ones who do have them generally went to university shortly after having been self-taught anyway.

Most of the other things on the list you learn if you deal with particular programming languages. And a university degree doesn't really supply breadth much more than being self-taught does. But someone who was self-taught and then did a degree will be much broader in skill, which is why those people seem to have such diverse skills.

The actual answer is that the self-taught programmer needs to keep learning and broadening into other languages etc. Which is exactly the same thing any other programmer needs to do.


Maybe I'm an outlier, but the self-taught people I know fare particularly well on compilers and machine-learning among my peers. Maybe it's the circles I travel in, though. :)

FWIW, I'm a self-taught information-retrieval / compiler / linguistics geek, currently studying machine learning. I've been programming since I was 5, and I always wanted to be a super-librarian, whatever that meant.


The majority of people I've come across started by doing web stuff or Basic way back in the day. I think in general that's the largest population. The best self-taught people I know have by now read everything that would be in a university reading list anyway.

I think it reinforces the point that self-taught people become very knowledgeable about the things they're interested in. The average self-taught programmer is likely to be more motivated than the average new CS grad. Although this may be becoming less true as more of the less purely interested people move to the more "practical" software related degrees that exist.


I'm self-taught and after a few years in industry, decided to go back to school so I could really accelerate my learning.

I know more than most of my fellow students, due to the few extra years of experience and personal effort, but I definitely had blind spots. It isn't even entire categories of things, but rather little pockets here and there in individual classes.

I have learned a lot, but deciding to get a math BS alongside it has helped the most. I've taken this experience in school as one to soak up as much hard information as possible, so if my personal efforts are any indication, I can see why the combo ends up with a better skills list than either alone.

I've come to really value the stuff that is taught in universities (at a decent enough school anyway). I suspect the longest lived impact of all of this is an ability to read and consume a higher level of information (journals, etc) on a broader variety of topics. It beats blogs, more often than not.


I thought it was strange that functional programming ranked high. In my experience CS departments are pretty mixed: in some schools it's essential, but in many others it's all but ignored. Additionally, when I list out all the people I know who are big into FP on Twitter or elsewhere, it seems, at least anecdotally, to be pretty heavily skewed towards the self-taught. I have a mixed background (self-taught, then back to school), but all of my FP work has been on my own time and curiosity. I've always found it rather ironic that FP is considered academic when very few academics spend their time on it.


Odd indeed. Javascript and Ruby are quite FP (At least in the lisp sense of FP) and they are predominantly hacker languages, meaning that they are used more by self-taught programmers than CS graduates.


Okay, there are some gigantic generalizations being made here, both in this post and others. I'm currently a senior at a liberal arts school, majoring in CS. Of the 10 or so CS majors in my class, I'd say half are also "hackers" in that they spend significant free time programming outside of class, for work and for fun. All of these people know at least one of {Ruby,Javascript} (Ruby happens to be less popular around here than Python since the latter is taught in intro classes). Not coincidentally, these people are also at the top of the department and are having the most success applying to grad school AND jobs.

I guess my point is that the hacker/academic distinction is not necessarily as broad as one would think reading some of these comments, and that many of us CS types think of ourselves as hackers as well as academics.


Thanks for correcting me - You're quite right.


I'm a self-taught programmer and have been doing a lot of interviews with developers lately, both self-taught and schooled. In general, self-taught programmers seem to fall into two categories.

1. They learned one language (PHP or C#), just enough to be productive, and manage to get jobs and make a career out of small "programming" jobs. Sharepoint monkeys, small consulting gigs for web development clients, etc.

2. They have an insatiable appetite for programming and never have stopped learning or expanding their skill set. I fall in this category. I'm constantly reading books, learning from coworkers and never settle for my skill set. I've also gone back and taken some college-level CS courses, which have helped me greatly as well. I'm currently in the process of reading all the seminal computer science books (currently reading the gang of four design patterns book).

In general though, having a CS background doesn't make you a good developer any more than getting an MBA makes you good at business. It's about how you apply it. Plus, you can be a good developer and a valuable employee for lots of reasons besides book smarts: being dependable, productive, having smart instincts, etc.


A few things:

- Renowned CS programs spend less time than you think teaching you how to actually program. Programming is simply a tool to teach things like theory, algorithms, mathematics, logic, artificial intelligence, operating systems, graphics, and so on. I actually took a class on programming C++ and it was one of those independent self-teaching classes. :)

- I think one needs to differentiate between programs that teach programming as a trade and a CS program at a top school. I imagine the experiences are quite different. Much like there would be a difference between someone who learned from Learn C++ in 21 Days vs the hallowed Abelson and Sussman (Alyssa P Hacker anyone?).

- I think most lower division books can be read by anyone and they would glean the same amount of knowledge. In fact, I would say there isn't much difference if someone went to somewhere like OpenCourseWare and learned it themselves.

- I think you'll find more difficulty at the top end. While it's possible that self-taught programmers would do these exercises, if I ever meet one I'll bow down before him. I'm talking about classes like CS 162 Operating Systems (http://inst.eecs.berkeley.edu/~cs162/sp11/) or CS 170 Efficient Algorithms and Intractable Problems (http://inst.eecs.berkeley.edu/~cs170/sp11/).

- With that said, I personally rarely (if ever) use the things I learned in these classes. Perhaps if I wanted to write my own language, my compilers class would be useful. Or maybe if I got really deep into machine learning, then my AI class would have been more use.

So, I guess with that said, it's really about what doors were open to me on matters that, even when taught, seemed impossible to understand. Doing it self-taught is just that much harder unless you're the guy from Good Will Hunting.

All in all, I'd say that for the majority of people who are not that specialized (like web developers) it doesn't make much of a difference. The people for whom it made a real difference went on to do research in a specific area and are the ones who are hired for some specific knowledge they have (eg. classification algorithms, machine learning algorithms, security, complex statistical models).


One more thing: this is why I always find it sort of useless during tech interviews to ask someone who's been out of school for a long time about the difference between a thread and a process, or the difference between sorting algorithms.

It's like people couldn't think of an original question that might actually be of some use to the actual job you'd be doing.


> ask someone who's been out of school for a long time the difference between a thread and a process

Quite the contrary: if you're in the industry, the difference between a thread and a process should be very hot in your page cache. Understanding why ``system("cd ..");'' won't work is pretty vital.
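To make the ``system("cd ..")'' point concrete, here's a quick sketch of my own (in Python for brevity, but the mechanism is the same as C's system()): the shell that runs the cd is a child process, and its working directory dies with it.

```python
import os

before = os.getcwd()
os.system("cd ..")   # "cd .." runs in a *child* shell; the child then exits
after = os.getcwd()

# The parent process's working directory never changed.
assert before == after
```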

As for sorting algorithms, you should at least know the big-Oh for a good sorting algorithm. In many specialties of programming, understanding what "amortized" means (e.g., in the context of quicksort's performance) is also important.

Somebody out of school may naively think that quicksort is always good as it's O(n log n), or (if they never took an OS class -- and OS classes are, unfortunately, optional in some universities) not fully understand why their code is burning up 100% of a single core on an 8-core machine and leaving the other 7 idle.

On the other hand, I can agree that it doesn't make sense to ask questions that can only be answered from memory (i.e., either you know the trick or you don't) and that will only be in a recent graduate's page cache -- e.g., remembering all the rules of a red-black tree, or a dynamic-programming solution to a classic problem -- when those things can, and should, be looked up. (If I see you implementing a balanced tree in a project without at least cracking CLRS open and looking at existing implementations, I'll have a chat with you...)


But "amortized" is irrelevant in the context of quicksort, as it is not an amortized algorithm. You may hit an unlucky streak and get the worst case of O(n^2) behaviour on every call. An amortized algorithm on the other had would guarantee that while each individual operation on a data structure may hit the worst case, it will always (and not just in the average case) average out over many operations on the same data structure.


Doh, screw-up on my part. You are completely right. A better example of amortized cost would be a dynamic array (std::vector or ArrayList), which is doubled every time it hits capacity.
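A toy model of the doubling array (my own sketch, not std::vector itself) shows why it's amortized O(1): across n appends, the total number of element copies stays below 2n, even though any single append can cost O(n).

```python
def append_all(n):
    """Simulate n appends to a doubling dynamic array, counting element copies."""
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:   # full: allocate double the space, copy everything over
            copies += size
            capacity *= 2
        size += 1
    return copies

# Copies total 1 + 2 + 4 + ... < 2n, i.e. O(1) amortized per append.
```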


On the contrary, it's irrelevant if you're not writing a sorting algorithm or your main role doesn't consist of writing any algorithms. Most web developers I know simply call sort() in whatever language they're using instead of reinventing the wheel. Even in my Objective-C work, where I handle retaining and releasing memory myself, I still call sort() (and compare()) without caring about algorithm efficiency.


If you're calling a sorting algorithm, you should still know the efficiency of that sorting algorithm, so you can choose between sorting an array and indexing with a data structure (and which data structure).

The web is just a UI layer over software that _does something_; algorithms are a way to get that something done. Have you ever needed to understand why, e.g., a SQL JOIN is taking too long? That's an example of where algorithms matter (in selecting the index type, knowing whether to use a join between two tables or to denormalize, etc...)

Knowing how to do your own memory management and knowing Objective-C is great and valuable, but if mobile/Mac desktop application development were to become less valuable and more of a commodity (right now, I don't think it will happen -- but I can't predict the market), would you be able to apply those skills to, say, working on more complex C and C++ software such as a web browser?


IMO you're spending more time talking about edge cases than the norm. How many people are web developers versus those developing web browsers?

Understanding query cost has more to do with data structures than algorithms. Why does the optimizer seek vs scan? What column(s) do I need to index and how should it be structured? You really don't need to understand how things get sorted in a b-tree.

Knowing normal forms is also different from knowing algorithms. Ergo, it would be more useful to ask "When would you denormalize your data?" instead of "What's the difference between quicksort and bubble sort?"

Objective-C is really just one of many languages I know and use. It just happens to be the most recent. Oddly enough I did spend some time (many years ago) implementing my own browser for a job. We used Java and, again, I never really had to know the complexities of sorting algorithms.


Err, as fhars pointed out, quicksort is not an amortized algorithm; I should have used a better example. It is, however, a good example of an algorithm with a worst-case running time of O(N^2).


I started programming at the age of 8, and went on to study CS at the age of 20. So, I count myself a self-taught programmer with a Master in computer science, and I have an opinion:

The formal education didn't teach me any skills I couldn't have picked up on my own.

It did, however, teach me to ask questions I didn't know existed, lovely abstractions, and a lot of useful theoretical concepts. Instead of constantly being surprised by things like "oh, wow, Java's static classes are really similar to Python modules of free functions", I can clearly see that they are both manifestations of the same idea, shaped differently by each language's design. This makes it much easier and quicker to pick up new languages and technology.


That reminds me of an anecdote a friend of mine told me... she was studying computer science and this one guy in her class, when confronted with a new concept, would always ask "...but how would you do that in Visual Basic?"

For him, Visual Basic was programming. Everything else had to be translated to VB for him to understand it.

This is one of the pitfalls of being self-taught, at least if you only learn scripting languages. You don't see the forest for the trees.

(This might be different for people who are self-taught with C, because it's so close to the machine, they instead might be asking the question "but how does the compiler translate that into a machine representation", which is a MUCH better question to ask.)


Not surprising. That is a list of things 99% of working developers do not do on a daily basis. And I'm not sure most of that will help 99% of developers on a day to day basis.

I once had a project to extract customer contact information from an Excel spreadsheet. I used Bayesian probability to determine whether a column was a first or last name, and trained it using US census data. Then I used Levenshtein distance to find names that were possible misspellings. It worked great, but on the POS computers that most people in the company had, it took so long they usually just gave up. I would have been better off just sticking in a DDL and letting them select what each column was.

Oh well, maybe if I was actually trained I would have figured out how to do it by writing my own compiler or Excel-extraction DSL.
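For anyone curious, the Levenshtein distance mentioned above fits in a few lines; here's a quick sketch of the classic dynamic-programming version (my own illustration, not the code from that project):

```python
def levenshtein(a, b):
    """Edit distance: minimum insertions, deletions, and substitutions to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]
```

A distance of 1 or 2 between two name columns is a decent signal of a possible misspelling, e.g. levenshtein("Smith", "Smyth") is 1.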


I mostly agree with this, although I don't think classically trained programmers fare much better on average. I've had to teach many university-educated programmers basic parsing, data structures, and algorithms such as tree traversal.


Agreed.

At the vast majority of even quite decent research universities, CS majors are there to get in, get out, and get a job writing very basic plumbing/form-handling/gluing-stuff-together-code. Or they get an ops job where configuring stuff and keeping it running is important. It works for most of them, too, as that's probably what most programmer-related work is. And most of it doesn't need to scale, so it doesn't hurt too badly that they don't understand algorithmic complexity.


Look, as far as I'm concerned, unless you had nothing but other people's source code and API documentation, you are not 'self-taught.'

I used to cobble together VBA macros for Excel at work, based on code snippets and the built-in help files. At first, although I got things done, I really had not a blinking clue what I was doing. Then after a few weeks I bought Excel VBA for Dummies - which taught me how to do things properly and understand the essence of all the constructs I was using.

I read more books, and I read web tutorials and watched video lectures. They taught me a lot. Then I left work to get a post-graduate-conversion certification in IT. I always said I was doing it for the cert, not the teaching. The course was good but only because it instilled discipline. Really attending a lecture is no different than watching one on YouTube or iTunesU. The teachers don't have time to give you personal tuition. Their notes are not more instructive than the classic CS/software books in publication. The learning materials available elsewhere are decent rivals to professional education courses. The only thing I can see being a huge benefit is doing pair-programming with an expert for a few years, but how many courses offer that? Not to mention, the learning that's supposed to go on at universities is hardly automatic.

So therefore the question is meaningless, unless you mean actually self-taught people, who would probably (if they managed to develop into competent professionals) have some strange quirks. Like self-taught musicians who never learnt common techniques I guess.


Some programmers are really engineers or scientists who solve their problems using computers. Even though they usually had some programming class at the university, they never had a formal CS education.

But then again, they know a heck of a lot about some specific problem space, be that audio algorithms, weather forecast or nuclear fission reactions.

This is one area where CS graduates are just about useless. Without knowledge of the problem space, those programming skills still won't solve the problem.


The question must have been typoed. Given that list, I think the question was supposed to be, "What skills do novice programmers commonly lack?"

Anyway, I know first hand (from interviewing quite a number of people) that a lot of university graduates lack the listed skills. Sure, they might be familiar with the term "compiler," but that doesn't mean that they have the skills to actually write one.

Don't get me wrong -- for some people, a university education is perfect. There are plenty of grads that know what the hell they're doing. But getting a degree is far from the only way to learn what you need to know to be a decent programmer, and it certainly doesn't guarantee that you'll be one.


I'm a self-taught programmer also, as many others have mentioned. I believe a CS degree can certainly be a launching point for people passionate about programming and computers. I actually did a year of CS before switching to a humanities degree. Part of it was I got a little scared off by some of the complex algorithms that came in the second year and I didn't have a good foundation. I ended up teaching myself PHP and VB.net.

In my profession, I ended up becoming a "technology guru", including some development. But as the only developer I never have had anyone else to lean on for expertise or support.

I think that for a self-taught programmer, having (good) mentors is probably a far more effective way to learn real-world development skills than a CS degree. I feel, for myself, that I would thrive in a development team where I could learn from co-workers and receive criticism and correction of my work.

Without that outside influence, it's hard to know if what I'm doing is really the best or even a good way of doing it.

I think every developer needs a mentor, and that's what I feel I'm missing most.

(P.S. Does anyone out there want to be my mentor? I am looking for a technical co-founder for a startup idea I'm working on)


One of the reasons I'm currently studying Software Engineering (UK) is to learn about things that will not necessarily be taught on the job: topics such as logic, formal specifications, algorithms, and concurrency. It's one thing to know how to use threads etc. in a language, but you miss out on the underlying theory. I found these topics challenging to understand, and that's with a lecture and tutorial (working on exercises from the lecture). I would have given up long ago if I were trying to learn on my own.

Now, this knowledge would be useless if I'm not in a position to guide development of a product, but I certainly wouldn't discount a degree (or any formal education) as only being a launching point. I think the general consensus is that the best learning mechanism is a mix of formal education and work experience: formal education to learn theory, and work experience to solidify the knowledge.


There's a much wider range of ability among self-proclaimed self-taught programmers. Some may be "self-taught" to the level that they have completed and understood the entire Art of Computer Programming (those released already, anyway). Some may be "self-taught" to the level that they can write a couple for loops in PHP and Javascript.

It's a hard question to answer because "self-taught" spans too wide a range of skill.


I agree with your point of view, but TAOCP isn't really a work to be completed, and I'm willing to wager that anybody who says they have completed it flipped through it and put it back on the shelf.


I agree. I meant to offer a point of contrast (perhaps a little extreme).


I thought "category theory" was a very strange thing to include in the list, so I searched down and found this equally strange comment: "I wish I had time to spend a year learning category theory to better understand how to structure things to be compositional". This is akin to saying, "I wish I had time to spend a year studying fluid dynamics so I can learn how to fly a plane".


More like: "so I could learn how to make a plane"


Exactly. The pilot is the user, get it?


I'm mostly self taught. The biggest gap I find is that I pronounce stuff wrong since I learned it from a book. It's embarrassing.


"Doesn't have a CS degree" isn't the same thing as "self-taught." Though I don't have a CS degree and I suppose I learned the basics on my own, I was largely taught how to write software by my peers. I learned a lot by working with really talented developers, and finding places to rub shoulders with them - mailing lists, conferences, blogs, HN etc.

That said, I think the hardest thing to learn outside of academia is the big picture, the general landscape of computer science and software development. It's really useful to know what you don't know, but it's hard to get that without actively seeking out something resembling a CS curriculum.


That just looks like the table of contents to an introductory Comp Sci textbook.


There's a comment there where someone says you go to school to learn how to learn. Clearly self-taught programmers already have this part mastered.

It leads me to wonder why we aren't asking, "What skills do school-taught programmers commonly lack?"


In my experience they often lack the curiosity to find background information and to build an adequate overview. They'll be able to write quite decent code in C++, but in an interview they'll fail at an easy question like 'what are the differences between C, C++, and C#?' (like they'll have no idea about the different paradigms).

It's a complacent sort of attitude, like 'I did a degree, so obviously they taught me everything worth knowing.'


The most common I have seen is obsessing over code as an end in itself: writing pretty code when the solution doesn't really require it. The other is premature optimization, where all code must be fast regardless of whether the problem requires it. Both lead to lacking the "get it done" attitude that I've seen in the self-taught.


These problems are not specific to those with degrees.


I've worked with a few, and some of the things I've noticed:

- No understanding of regular expressions / state machines / automata
- Not understanding O() and algorithm complexity
- Building kludged-together language parsers
- Not understanding parallel execution, threading, starvation, deadlock, etc.
- Generally not knowing about well-known algorithms and data structures and how to apply them
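On the state-machine point: a regular expression is just notation for a finite-state machine, which is a small enough idea to sketch directly (a toy illustration of my own, recognizing the language of the regex a(b|c)*d):

```python
# States and transitions for the regex a(b|c)*d.
# Any (state, character) pair not listed is a dead end.
TRANSITIONS = {
    ("start", "a"): "middle",
    ("middle", "b"): "middle",
    ("middle", "c"): "middle",
    ("middle", "d"): "accept",
}

def matches(s):
    """Run the FSM over s; accept iff we end in the accepting state."""
    state = "start"
    for ch in s:
        state = TRANSITIONS.get((state, ch))
        if state is None:
            return False
    return state == "accept"
```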


It's really interesting to see that so many HN readers describe themselves as "self-taught programmers". I would describe myself as that as well, despite just graduating from university last year with a Software Engineering degree (at my university, Software Engineering is a four-year degree that's essentially a superset of a Computer Science degree).

I think the use of labels in this instance is not helpful. A person who hasn't touched a piece of code before they enter university is unlikely to magically become a great programmer in the course of three or four years. Indeed, my first-ever submission to HN was spurred on by a bit of disgruntlement at the quality of some of my peers (http://news.ycombinator.com/item?id=1902687 - not much in the way of discussion :).

My opinion on what you might get with a degree that you would be less likely to get without formal study:

- More study of development processes and tools (this might apply more to SE than CS)
- More study of the broad theory of computation
- Earlier exposure to team programming (this comparison is clearly with "self-taught" programmers that aren't working in the industry)
- Much, much more cruft that you're not really interested in :)

I'm personally glad that I studied SE at a tertiary level. It balanced out my other degree, it was a good way to meet other like-minded folks (and to compare my own skills against them), and perhaps most importantly I was exposed to a lot of stuff that I feel I wouldn't have sought out on my own. That said, I would feel it the height of presumption to "look down" on a programmer with a lack of degree. I know firsthand that a piece of paper does not make you a good programmer, and vice versa. The real determinant of how good a programmer you are is how good a programmer you are.


Peeling back the clearly insulting bias in the phrasing of the question, I do believe in programming as a craft. By that I mean that it's possible to become pretty good at it by just screwing around on your own, but you're fooling yourself if you think that you don't have something to learn from the academy.

To use an analogy, I think of myself as a decent carpenter, but I'm always blown away when I watch "This Old House," because Tom Silva is always busting out really great shortcuts that make things like scribing molding to the wall look really easy. I'm sure he invented some of those tricks on his own, but I'm also betting that he learned a lot of them from his days as an apprentice carpenter to someone older and wiser.

I think it's definitely possible to get such an education without ever setting foot in a school, for example by working closely with a skilled mentor, but I do think it would be hard to naturally stumble upon all of these key areas without someone laying out a self-study plan for you.


I agree. I think the apprenticeship model would work far better for programming than the lecture/assignment/exam model. Solving real world problems and learning on the job under the eye of a mentor would be invaluable.

The person/company offering the apprenticeship would get cheap labour for a few years in return for helping the young apprentice learn everything they need.


I won't discount that there is value in being taught by a professional educator. However, the primary sources are there for everyone. The excellent secondary sources are there for everyone. The amount of supplementary material available via the internet is a great advantage to the autodidact. The interactions available via the internet approach a collegiate peer group (if triaged). All of this taken together, approached in a steady, determined fashion, can, I believe, near-perfectly simulate a college CS education (or an education in almost anything else, for that matter).

Academia and college educations are getting increasingly difficult to idealize as information becomes redistributed and consumed in unsanctioned ways. And sanctioned ways, too. See all flavors of OCW.


I think we tend to lack a deep knowledge of those aspects of programming that bore us.

I love learning, and I still do a lot of it by myself. But there's an infinite amount of cool shit out there to learn, and a finite amount of time.

That's why--despite having written several commercial software products, and having worked on systems deployed for years at a stretch--I can't even code a bubble sort or a quicksort, in any language, without resorting to google. ☆blush

The implied deficiency doesn't offend me; I don't really want to sit through two semesters of algorithms classes in C or Java, but at this point I definitely wouldn't mind having done so fifteen years ago.


The ability to manage a college loan.


One of the answers there specifically disparages big-oh analysis of algorithms.

This is actually sometimes important. A self-taught programmer who really was a whiz at coding had some prototype code for data analysis. It worked great on the sample data, but was O(n^3) (not just worst case, average case), and we had a hell of a time convincing him that it wouldn't scale to the quantities of data we had.

Of course constant factors make a difference, but it is frustrating to waste time trying to convince someone that n log n is going to beat n^3 on a large data set.
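The gap is easy to show with a back-of-the-envelope calculation (illustrative numbers only):

```python
import math

# At a million records, n^3 does on the order of 50 billion times
# more work than n log n; no realistic constant factor bridges that.
for n in (1_000, 1_000_000):
    ratio = n**3 / (n * math.log2(n))
    print(f"n = {n:>9,}: n^3 / (n log n) ~ {ratio:,.0f}")
```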


It doesn't have any meaning, at least for me. If your aim when you started to learn was to follow a CS course (say, MIT's courses), then you are going to end up with equivalent knowledge. If your aim was to build your first web site and do HTML/CSS/JS and PHP stuff, then you are going to learn that and do it.

You cannot know everything anyway, no? And CS courses differ from one university to another. The benefit of being self-taught is that you choose the materials yourself and you are enlightened when you study them.


So, as a self-taught programmer I know a smattering of those things (admittedly compilers, FSMs and functional programming came from my engineering degree).

The problem I have with that question (as already pointed out) is that it is phrased as if those things are crucial aspects of "being a programmer", or being good at it.

Meh.

I use bits and pieces of the skillset; but I realised a while ago that the bits I use are things I knew (and made use of) way before knowing the topic in depth. Large parts of those topics are theoretical underpinning which is undeniably useful, but something you can get along without if necessary.

And that, I think, is the main difference between self-taught and taught programmers: the latter have a lot more theoretical understanding of programming concepts. In most cases it doesn't set them apart, but for hard problems or unique solutions, having the theory is required for a solution.

I argue that the problem is classifying these things as "skills" rather than theory/concepts.


The best programmers I know are self taught. This is because they have a passion and hunger to learn and improve. This is what it takes to be a good programmer. I've met many people that have a CS degree because they simply didn't know what else to do while in school.

You are not vetted by final exams, you are vetted by creating something of value.


There isn't any one "self-taught programmer" who can be characterized, so a lot of these comments are claiming to be more general than I think is fair.

I have close friends who are utterly top-notch software engineers who didn't learn any of their skills in a formal educational setting. Often they have been able to learn-as-they-go. After all, these days the field is so large that nobody can be an expert at everything. And we often change jobs enough that what we need to know changes over time. So we all need the ability to learn-on-demand.

I was fortunate to get a computer science undergrad degree from M.I.T., but at the time, they didn't have any courses whatsoever on database systems. I ended up in a position where I needed to write a DBMS, so I found the textbooks and papers and learned how. A lot of my career has been based on that. It's not necessarily all that hard to learn things yourself.


A main anti-pattern I've observed in self-taught programmers (including myself during my early career as a junior dev) is lack of planning. This leads to being time-inefficient, cowboy coding, and generally misplaced priorities. An individual's personality type preferences contribute to this as well.


Background: I have a Master's in CS from Wisconsin-Madison, I'm a programmer as well as the founder of a software company. I've hired programmers with and without formal training in CS.

I think the self taught are both fine and justified in their worries that they are missing something. I'll try and explain as best I can.

Imagine that your mind is like a workshop, a place you go to create things. Let's consider two workshops: the neat and organized shop and the messy shop. I'm going to suggest that the neat shop is more like a programmer with formal training and the messy shop is more like the self-taught. Some self-taught folks will argue that their mind is well organized, which is fine, but my experience is that the characterization has some validity in the real world. Your experience may vary.

The question here is "is it possible to produce really great work in either shop?" I think that's sort of what the original question was getting at: they talk about skills, but the point is what you do with those skills, what you produce. And we all want to produce great work. So can both sorts of shops (minds) produce great work?

My view is that self-taught people are typically sharp, sometimes very sharp, but their minds are a little "messy". Formal training in terms of learning basic skills is no better than self-teaching. But the farther you go, the more useful the training becomes. The shop analogy, a bit stretched I'll grant you, is that the training organizes your mind. When you need a tool, you know where it is, and you go get it and use it. For example, if you are doing a compiler, you already know that you need an AST. (I had a smart guy, with compiler experience, waste quite a bit of time trying to do a compiler without an AST, in spite of another guy asking "Don't we need an AST?" First guy: no formal training; second guy: formal training.)

I think a question that might be more enlightening is "given two equally talented programmers, one with formal training and one without, which one can produce good results, over a broad domain of tasks, faster?"


It's really funny how the Lisp bigots say "Lisp" and the numerical experts say "numerics". If you asked those guys what's wrong with CS today, they would say "no Lisp" and "no numerics" respectively.

Even good programmers have huge holes in their knowledge.


This is a good question and actually one that drives most self taught people. This sustained sense of "ooh, I haven't seen that before" helps the self taught learn far more than many formally educated programmers I've met.


I'm also a so-called self-taught programmer. I thought about this very question a while ago and came across this definitive answer by Joel Spolsky: http://stackoverflow.com/questions/414779/what-should-a-self...

He also nails it again in his post, "The Perils of JavaSchools" - http://www.joelonsoftware.com/articles/ThePerilsofJavaSchool...


I did a CS course and it was pretty heavy on maths and theory - and although you had to do a lot of coding to pass, there wasn't much effort made to teach you the pragmatic elements of real-world development.

University CS courses are terrible at being vocational training - but they are absolutely the best place to be if you are interested in research and the theoretical underpinnings of the subject - which are pretty vital if you want to go on and do postgrad work (which I did).


I am self-taught with no degree :( and think that the key is in being curious and exploring. A constant pressure/feeling of greater ignorance also helps drive that. I learned to program because I wanted to make games, so from there I picked up C, a tiny bit of numerical methods, pointers and finite state machines. Self-teaching C++ was horrible, so I quit computers for a couple of years. However, I came back to it but wanted a language as different as possible from C++. I tried Python but my short-term memory is terrible; I could not get my head around dynamic types. I tried Scheme but was too dumb to get the syntax. I had an interest in pure maths so was drawn to Haskell and ML. Simply by being interested in Haskell you get thrown in the deep end and introduced to a bunch of theoretical stuff, which I perused for a couple of years, including from the list:

Data Structures, Programming Languages (http://www.cis.upenn.edu/~bcpierce/tapl/), Code-as-Data, Patterns (mostly in the guise of how pattern Y is an inferior version of feature X oh & monads are super patterns), Functional Programming, Object Composition, Recursion, Lambda Calculus, Type Systems, Category Theory

Each of the above expands and leads to its own world. As a self-learner you just have to keep exploring. For example, with category theory you pick up type theory and learn how it and sets all relate. From sites like this one you learn about the importance of unit testing and version control. From lambda calculus you can rewind to Frege, un-rewind to Zermelo and learn about first-order logic, and trace a line from Haskell to the Weierstrass program - taken to its absurd conclusion with Bourbaki - to rigorize calculus (along the way you may learn of infinitesimals and the hyperreals formulation of calculus by Robinson, which I found more intuitive).

After a while I realized that I was really into artificial intelligence, graph theory and subjective probability theory. The latter two, I think, will be the calculus of the future[1]. This led me through machine learning and much more numerical methods, and brings me to today. I don't know much about search or compilers, and I don't have a great understanding of system internals, but I can pick those up if they interest me or I need them.

The downside to self-learning is that with no teacher everything is harder. With no teacher you can't double-check your model, so really understanding takes longer. There is no need to test yourself, so you are in danger of jumping around without having properly learned anything.

My solution is to read voraciously to build an index, and just work on what I want, so that if I need a concept, knowing of its existence tells me that I should learn more on that topic; at the least I can make that connection. Revisiting is key: if you don't understand something now, move on, but be sure to come back later. Relate, analogize, and search for treatments that most suit you. Good with programming and learning differentiation? Then derivatives as higher-order functions, paired with Euler's notation, make it way easier and make expanding to higher dimensions more straightforward. Learning hard things gets easier if you keep at it and continually expand your base. It's all very slow going, though, and knowing when to Explore and when to Exploit your current knowledge is tricky. Till recently I leaned too much towards explore, but if you want to get stuff done you need to exploit what you've got. I doubt this approach would have been viable 15 years ago, or before Google and Wikipedia citations. Finally, I hate academic papers behind paywalls.

[1] As the cornerstone of techniques. I see the concept of entropy appearing everywhere, and find the idea of quantum mechanics as a complex probability theory just incredible.


> the key is in being curious and exploring

That reminds me of another hacker:

"I think one of the most important guiding principles has been this: — that every moment of my waking hours has always been occupied by some train of inquiry. In far the largest number of instances the subject might be simple or even trivial, but still work of inquiry, of some kind or other, was always going on.

The difficulty consisted in adapting the work to the state of the body. The necessary training was difficult. Whenever at night I found myself sleepless, and wished to sleep, I took a subject for examination that required little mental effort, and which also had little influence on worldly affairs by its success or failure.

On the other hand, when I wanted to concentrate my whole mind upon an important subject I studied during the day all the minor accessories, and after two o'clock in the morning I found that repose which the nuisances of the London streets only allow from that hour until six in the morning."

Passages from the Life of a Philosopher, Charles Babbage


Interesting how people here took the question in such a negative tone. This list is actually useful for a novice self-taught programmer, like humbledrone said. As a self-taught programmer, lots of times I felt like I was missing something when I read programming material. Of course I didn't stand still; I just went and learned what I was missing. A list like that could have come in pretty handy then. (OK, I could have looked at some CS course syllabi too.)


Quite simply, they lack the education. This is different from knowledge, which I'm sure they do not lack; otherwise they would not be competing at the same level for the same jobs.

Education can lead to an acquisition of knowledge, but it is by no means the only way of acquiring it (experience for example is another way).

Knowledge is what employers look for, and it is a better indicator of ability than education - especially for non-grad positions.


"The difference between the university graduate and the autodidact lies not so much in the extent of knowledge as in the extent of vitality and self-confidence." -M. Kundera

“[T]here are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know." -Donald Rumsfeld


Sadly, the most important one is nowhere to be found: good software development practices. I've met a lot of brilliant self-taught programmers who could read up on parsers and systems programming, but not one read up on how to keep software maintainable or how to manage project risk.

Of course, they default to waterfall. (!!!)


I'm surprised that people view this as a destructive or loaded question. As a self-taught programmer, this sounds exactly like something I would ask as a way to fill gaps in my own knowledge.


What pieces of the whole are missing?

For one thing, I've seen many self taught enterprise software programmers get themselves into serious pain the first time they try to do concurrency.


I would think that the school you attended would be an important factor in the equation. And also what level of education that you received.


A self-taught programmer is like a self-taught dentist.

Funny thing is that the former gets lots of work and the latter none at all.


After digesting the Quora thread some more, I think it's yet another "bait" question -- meaning, one not seriously asked, because (a) it's loaded, and (b) the answers are obvious, with no need for discussion among serious people of reasonable intelligence.

For example this:

>> And as a follow-up, where can said self-taught programmers find good resources on the above subject matter?

Come on. In 2011, where could one possibly find good resources for the above. Let me rack my brain... Hmmmm.... Maybe.... rhymes with Internet? Rhymes with Google? Rhymes with bookstores? I don't know. I give up. I wish I had a PhD in CS so I had the formal education needed to figure this one out. Thank goodness for Quora! (Wait, that's part of the Internet. I missed the class.)

Seriously.


> In 2011, where could one possibly find good resources for the above. Let me rack my brain... Hmmmm.... Maybe.... rhymes with Internet? Rhymes with Google? Rhymes with bookstores?

The problem is that these are also effective ways to find bad resources. At Quora, OP can find people with education/experience to identify which resources are worth the time/money.


The question, as it was posed, was pretty arrogant and contained an implicit assumption: that a "self-taught" programmer does not know, or does not have available to him, certain things that a so-called formally taught programmer does. And that's just not true. Both self-taught and academically taught programmers have access to books, computers, the Internet, peers and tools.

Also, there's no black-or-white difference between the two: the academically trained programmer has had some amount of so-called self-teaching, and the self-taught programmer almost always has some amount of academic education. Both have brains and are able to reason things out for themselves. Many things are learnable by experience, and learnable when needed.

Finally, everything ever put into a textbook was at some point independently discovered or learned by someone without the benefit of a textbook or academic course -- instead they had to discover it directly or think it up themselves, or derive or synthesize it from something else they read or learned previously, sometimes in adjacent or even very different fields. The computer itself is a wonderful teacher. And what the Internet makes available to everybody, inside a university or not, makes even the famous Great Library of Alexandria look pale in comparison.

But I overstepped myself. I couldn't possibly have these thoughts, or reach these conclusions, without having been spoon fed it from a professor in a university course. I'm sorry. I forgot my place. Back to the servant kitchens for me....

...oh look, a book on algorithms, what's that doing in the servant kitchens? I'll sneak a peek when nobody is looking. :)


Self-taught programmer here. I would like to think I have at least a bit of knowledge about each of these topics (though definitely not expertise in most). I would say the one I know the least about is machine learning, but I'm actively working on changing that :) I just bought this book (and am enjoying it!): http://www.amazon.com/gp/product/0262013193


That's a pretty silly list. Many formally trained programmers also lack understanding of several items on it. How much can you really teach anyone about compilers and machine learning in two months? The easiest way to learn something is to use it. Taking tests and turning in homework assignments for 8 weeks only gets you so far. If you want to pick up the formal theory and functional aspects of programming, then I highly recommend the course notes available at http://www.seas.upenn.edu/~cis500/current/index.html. If you get to the end, I guarantee you'll know more about the theoretical aspects of programming than any "formally" trained programmer at a big-name university.


How many of these things can one really be missing and still call oneself a programmer? I'm self-taught myself. Pointers? Really? You can learn pointers in about 20 minutes.


It's interesting, but it seems some people need a lot more work when learning pointers than others. I struggled with operator precedence issues when I first encountered pointers, but didn't have a problem with the concept itself. But I know people who never really managed to wrap their minds around it.


A basic diagram with some arrows explains pointers pretty well.


Interpersonal relationship skills.

Even universities don't teach social skills for life.



