Why Learning to Code Is So Damn Hard (vikingcodeschool.com)
314 points by vike27 on Feb 4, 2015 | 258 comments



Here's where I disagree with other people - software development is special in that it requires focus, relentlessness, intelligence and creativity, and I also find it interesting that many software developers tend to suffer from conditions on the autistic spectrum. To me that's a clear sign that software development requires the mind to be hardwired in a certain way.

And the thing is - I never needed handholding, which is why I have mixed feelings about such educational efforts. For me that desert of despair was fun and nothing could have stopped me. For me, entering a couple of lines of code that made the computer do something was like a game and felt like magic, with each piece of knowledge learned increasing my skill, in a sort of real-life RPG. This started before high school: I remember begging my parents for a PC, and I started reading books on hardware and one on BASIC before even having that PC - that's how desperate I was.

And my story is very similar to that of others. It's undeniable that some people have an inclination towards software development, like a deep internal urge that must be satisfied, much like a drug addiction. This is why you'll find many developers saying that even if they didn't need to work, they'd do it for free.

To me educational efforts for adults are misplaced. If you want more people to become software developers, you need to show them the magic while they are young. As for most adults, I believe that the ship has sailed already.


I disagree, and I disagree because I used to think this way (having lived and breathed programming since age five), until I started to meet people who had started to program in university (which I considered _way_ too late) and who had become excellent programmers. Not adequate-for-a-job programmers, but best-of-breed programmers.

In fact, often better programmers than "naturals" because along the way they had learned a lot of discipline that isn't necessary when programming is fun -- but as every "born programmer" turned professional finds out, programming isn't always fun.


In the “born programmer” theory, surely the trait begins latent and could manifest at any age depending on exposure / triggers / mentors etc? And university seems a likely time for that to happen; not everybody had access or was given that first nudge into computers earlier in life.


I don't really agree with the notion that software development is something that others cannot learn if they try to do it; that seems like a very harsh perspective which, from my anecdotal experience, is far from the truth. Any skill can be learned if you have the discipline to put in regular effort: that's just the way the human mind works. I do agree that for a lot of people, Software Engineering is easier because they have a certain kind of brain, just like Mathematics or Physics or Music is easier or "makes sense" for some people more than others.

If anything, we need to promote education among adults, give them the choice to possibly check out any vocation that they would want to try. Liberate them from the notion that they are "stuck" doing what they chose to do earlier in their life. I'm not saying that it's easy; you have to start small and take baby steps. And it's not something that can be done in a day or a week or a month. But if modern society has provided us one great benefit, it is that of making knowledge accessible very inexpensively.


What you're saying is true, and I used to be optimistic about it like you. However, I stopped believing it is feasible for most adults after failing to teach family members the basics, or to help junior colleagues become better.

Maybe I've been a bad teacher. However, the thing I noticed is this: yes, with enough effort we can probably do anything, like learning to play the violin or to program, except that's easier said than done. The effort required is actually enormous. Put real-world concerns into that equation - a social life, a day job, raising children and so on - and it becomes next to impossible.

For a child it is easier - he has the time available, and that desert of despair mentioned can actually be fun. Because to a child, making the computer do stuff can feel like magic, whereas to an adult it feels like a chore to get somewhere, a chore that eats all of the available time.

Knowledge is accessible and that's great. The noise is also great though, and one still has to filter, analyze, learn and work. The Internet is indeed disruptive, in that you get access to books or to online courses from reputable universities without being there and without paying a single dime. But I still see this as helping children in impoverished nations, rather than adults in first-world countries.

Note that I'm not discarding the possibility of adults actually making it over the learning curve. I'm sure there are people out there that have succeeded. But those are super-humans and I think they represent the exception.

There's also another thing that's starting to piss me off. Some of the problems we end up working on are extremely difficult, yet because we are idealists, we go around telling everybody that everybody can do it, probably out of sheer enthusiasm for the things we do, because we believe the world would be better and maybe because we want to share with others our passion. But you know what, some of the stuff we do can be really challenging; objectively speaking, not everybody can do it, and going around saying that opens us up for abuse. And then we start complaining about ageism or about companies fixing salaries and entering no-poaching agreements. How stupid can we get? Oh, so you want more or better software developers? Capitalism works both ways, demand and supply is a bitch, so fucking pay up.


> Some of the problems we end up working on are extremely difficult, yet because we are idealists, we go around telling everybody that everybody can do it, probably out of sheer enthusiasm for the things we do, because we believe the world would be better and maybe because we want to share with others our passion.

This right here is the money quote. This is why I am very wary of taking a position as a "developer" or "programmer", and not a consultant. Computer science is one of the most intellectually demanding professions in existence, yet a bunch of talented kids try to stunt and make it sound easy. It's like the world's greatest neurosurgeon posting his surgeries on YouTube and saying "anyone can learn, come to my bootcamp!".

Absolute nonsense. It probably has to do with them trying to signal competence for mate selection... If that's the case, I've got news for you boys: CS is never gonna get you laid. Money, on the other hand, most definitely will. So focus on your boatloads of cash, not how "easy" CS or programming is, and leave the dignity of your profession untarnished.


Actually, you can't ever learn to play the violin properly unless you have an ear for music. There is a threshold of minimal talent that you can't jump over just by practicing; you either can hear it or not. While there is no such obvious threshold for e.g. math or programming (for a person with an average IQ), I think we could talk about a kind of threshold defined by the concentration and willingness to invest time and effort into the learning process. Theoretically anyone should be able to do it, but in reality not everyone is mentally capable of that, simple as that. Just like running a marathon or something: most of us will never be able to do it, although in theory it's simple, you just need to follow the proper training for long enough. Losing weight is also dead simple, just as quitting smoking or getting off drugs, but many can't do it. Learning is no different; it takes a certain minimal level of willpower to make it.


Most people could run marathons successfully, unless they are sick or something. It's what the human body was built for. And if you are going to argue that playing the violin requires an ear for music, as if that's so uncommon, then I could also argue that programming skill depends highly on intelligence.

> Losing weight is also dead simple

No it isn't, because weight gain and loss highly depends on one's metabolism. Say that to a diabetic and you'll probably get punched.


> Losing weight is also dead simple, just as quitting smoking or getting off drugs, but many can't do it. Learning is no different; it takes a certain minimal level of willpower to make it.

You say that losing weight is dead simple and takes only a minimal level of willpower to achieve, but then you say that many people can't achieve it.

Can you see there's some dissonance there?


But "simple" doesn't necessarily mean "easy". The simplest way to get a boulder up a hill is to push it with enough force.


If there isn't difference in ability (which I think there probably is), there is certainly difference in inclination.

I work with a couple of guys that want to learn to program. They try. Sort of. They took the same online classes I did. But the content never transformed from theory into "hey, look what I could do with that!" They also just don't have whatever it is (fanaticism? lunacy? lack of better things to do?) that made me sit in front of a terminal for days and nights running together on end, barely eating, hardly sleeping, dreaming about it, trying again, beating on until it worked. There is something different about productive programmers, and no, I don't think everyone has it or has the capacity to become that.

I'm not going to claim it's a virtue... it really is almost more like a vice. An addiction. An ability to tune out the rest of the world (which sometimes is more important in reality) and focus intently on organizing abstraction.

I'll also say... long periods of this are not so great for your relationships and personal life. At least in my experience. So maybe they are better off truth be told.


Anyone can learn to play a musical instrument, and many people do, to a high standard, play in bands, release tracks, and so on.

But if a man selling guitars and guitar lessons tells you that your kids should learn to play the guitar and join a band because it's a well paid, secure career... Then you'd be wise to take that with a pinch of salt.


Sure, most people can 'learn' most things - but how well can they apply them? How good a developer / musician / etc will they be?

I feel that this 'anyone can code' movement is good up to a point, and dangerous beyond that. In the same way that we don't need more half-arsed and incompetent engineers, lawyers, doctors, or pilots, we don't need similarly incapable developers. The field of software engineering is still in its infancy, and the last thing it needs is barely-capable developers, who needed hand-holding every step of the way, lowering the bar.


I'd wager that the incapable developers aren't the ones that "just learned too late, welp", but rather lazy ones or ones without discipline, etc.


Yep. Discipline is the principle at work in making any craftsman better...whether it be a carpenter or a programmer. The nice thing about programming though is that there is so much technology and knowledge easily accessible for programmers to become better at their profession; I'm not so sure about other professions.


I think you are right, anyone can learn to code. But for many people the learning curve is just too steep and they give up before they can learn it. And that is OK; not everyone needs to be a programmer.


Not sure I agree with you.

I always avoided maths at school, despite being reasonably good at it, because I didn't find it interesting. Studied biology and ended up in software engineering.

Now after many years in the "real world" I can see the practical application for many forms of mathematics (particularly statistics), and kind of wish I had studied it more. If I had the chance to go back to university and study it, I am sure I would find it a lot more interesting, as now I can see where all the techniques could be put to use, rather than seeing some abstract set of symbols on a blackboard.


I feel the same about the Maths part, and I recently found an excellent essay that explains why this is the case and why we shouldn't be put off by the terrible "mathematics" we were taught at school: A Mathematician's Lament http://www.maa.org/sites/default/files/pdf/devlin/LockhartsL...


I have a contrarian view. This is purely based on my experience. When I was first 'taught' BASIC in high school I detested it. I didn't want to stay anywhere near computers because it was no fun compared to, let's say, jumping off a roof. Fast forward to uni: I still detested computer science - mostly because I never realized what one could achieve by using computers. That context was simply not there. This was 1995 in India. My mind was simply elsewhere - mostly I was disinterested in anything related to making a career.

I did find a job and turned out to be an average programmer and I worked mostly for the heck of it.

The turning point in my life came when I ventured into management. I realized how fucked up that industry was, as compared to even the most inane programming task I had ever done. So that's when my love for programming started to grow, and it hasn't stopped ever since. The more I see of other industries, the more I realize how shoddily disorganized they are. I mean - consider the process of getting an approval. It takes days, whereas it's just a push of a button on an app.

So it's not true that as an adult you can't realize the magic of programming. You just need the right context and access to the right tools.


> Here's where I disagree with other people - software development is special in that it requires focus, relentlessness, intelligence and creativity, and I also find it interesting that many software developers tend to suffer from conditions on the autistic spectrum. To me that's a clear sign that software development requires the mind to be hardwired in a certain way.

I agree, and I don't think it's a good thing. As Crockford once said, we chose software because there's something wrong with us.


I don't think it's a good thing either.

I also find the spread of autism terrifying. My wife works at a kindergarten and she has had to deal with children that have problems from the autistic spectrum - healthy enough to be allowed to integrate into a normal group, problematic enough that adjustment is very hard; they simply stand out from everybody else. And she noticed an interesting pattern: those children usually come from families of engineers, mathematicians or software developers.

And maybe the sample is not representative of the general population, but it scares the bejesus out of me, especially since we have a son who is happy, loving, socially apt, smart and healthy, but who was a late talker and at 4 years old still mispronounces many words - this is another recent trend.

I read a book at some point - can't remember the title - which said that especially in boys you get brain parts specialized for certain tasks, with certain parts of the brain underdeveloped and maybe other parts overdeveloped. It usually goes away once the child grows. Maybe this is just an evolutionary step, or maybe we are broken, I don't know.


> And she noticed an interesting pattern: those children usually come from families of engineers, mathematicians or software developers.

This is a serious issue. There was an article on HN a few years back on how the concentration of children born with autism in SV is something like 10x higher than elsewhere across the nation. They concluded that it's most likely due to a severe case of assortative mating between mathematically gifted couples.

People wanted to know why evolution didn't continue to make us smarter? There's your answer... there's an upper ceiling imposed by biology. The way our neural architecture develops puts a damper on how much abstract reasoning we can do before we sacrifice our basic survival mechanisms (i.e. social interaction).

Autism is one of the most fascinating conditions from a medical perspective, but it's absolutely terrifying when the majority of HN (myself included) is at greater risk of producing autistic children.

Guess there was a reason why people married their secretaries...


Is standing out a bad thing? Look at where we are now. Consumerism at its highest; people becoming mass educated, mass medicated, mass produced. Is the world in good shape?

Maybe those who stand out are exactly the specialists we need to change this world. To think differently, as Steve Jobs famously said.

We are not broken. We are perfect as we are. It is our approach to these children and people that is broken. We should embrace them, and foster their creativity and their almost inhuman capability to concentrate. And not medicate them with drugs that will kill their creativity and unique way of looking at the world.

Think of them as Specialists, not Generalists. Our current education system sucks for them, sure. But it will change.


I like it. But, I do think you can't fully embrace your strengths until you've admitted your weaknesses. Maybe it's my own biases but I'm afraid of people ending up at a dead end, an unsolvable problem, and not having an outlet because they haven't been forced to lead a somewhat normal life like the people surrounding them. That's an awfully dark place to be in. I think it's important not to let someone get too inside their own world, you still have to try and "integrate" them into "society" but I certainly agree there are better ways it could be done than holding everyone to the same strict/lacking standards across the board.


I don't think so. Why wrong? Computers have huge potential. Forget the naysayers who don't understand these "big machines taking our jobs!" - they fear the change and spread this fear to those who work with them too.

Be proud of what you do. Love it. Acknowledge you are on the right path. Or else change your path, or the way you approach it. Why do it if it feels wrong?


> Why do it if it feels wrong?

I may well be mistaken but I don't believe this was the intended meaning in the quote. Do what feels right - that's not being disputed... but what feels right to someone who "sees the code" behind things in life is often very different from what feels right to the average/normal human at this point in time. Like how everyone loves a good comedian, but if you ask any good comedian they almost all say you have to be kinda fucked up to embrace that life... I agree with you though; I don't think it's wrong or fucked up, I think it's just different from the lives that most people believe to be everyone's status quo.

Amish people freak me out. I can't understand why you'd want to limit yourself to that set of strange ideals (roofers with cell phones who play Angry Birds on lunch break, but they have to hire a driver because it's against their weird rules?), but I don't believe they are fucked up or wrong... they're just a quirk of this strange existence... I don't want to believe it but I might be just like them if I'd grown up indoctrinated into that life. There is some beauty in every path if you choose to see it.


> For me that desert of despair was fun and nothing could have stopped me.

> To me educational efforts for adults are misplaced.

My theory is that the difference is just that (some) young people have more spare time. If you have lots of time, then the inherent fun of learning stuff makes it fun to wander around learning stuff. But if you don't have much time, then you prioritize the things that relate to your goals, and then unexpected things in between you and the goal are stressful obstacles.

If you have tons of time, then your default state (alternative activity) is boredom, and learning anything interesting saves you from that, so you're glad you did it. But if you don't have much time, then learning something only to find that it wasn't as useful as you thought it would be (which often happens when you don't know what you don't know, but are trying to learn) is disappointing, because you know you sacrificed other important things (in retrospect, things that were more important) to spend time learning that.


> It's undeniable that some people have an inclination towards software development, like a deep internal urge that must be satisfied, much like a drug addiction.

This describes me quite well. However, I'm not sure that this addiction is required in order to do software development at all.

There is evidence that with certain concepts in software development--pointers and recursion are two that come to mind--people either grok them or they don't, and if they don't, no amount of education or training seems to help. But I'm not sure the ability to grok these concepts is always correlated with the addiction.


I don't really agree on the hard-wiring. I do think programming is more of a mind based pursuit rather than a socially based pursuit though. "Autistic" tendencies are for the most part due to lack of experience or disinterest in social aspects rather than from actual neurological hardwiring preventing you from picking up and acting on social cues.

Other pursuits that are so interesting that the social factor isn't required include physics, mathematics, etc... These people aren't necessarily programmers.


Given that autism is a mental condition characterized by difficulty in communicating and forming relationships with others, involving an atypical brain development for some reason or another, I believe that this is evidence enough that social skills or lack thereof are influenced very much by the brain. We really shouldn't place so much faith in free will :-)

On mathematics and physics - if you'll notice, they share common traits that go beyond the lack of social interaction. Creativity is also required (well, not if you think of math or physics as learned in school), as are focus and juggling abstractions. In fact mathematics is even more extreme, because with software development one usually gets much faster feedback from normal people. And it's not like you see mathematicians and physicists everywhere.


When I was in college, one CS professor explained the difficulty of coding to me in terms of discreteness vs continuity. In the real world, things are continuous. If you accidentally build your structure with 9 supports instead of 10, then you only lose 10% of the strength of the structure, more or less. The strength varies continuously with the amount of support. But if you're writing a 10-line program and you forget one of the lines (or even one character), the program isn't 10% wrong, it's 100% wrong. (For example, instead of compiling and running correctly, it doesn't compile at all. Completely different results.)

Of course this logic doesn't hold up all the time. Sometimes you can remove a critical support and collapse a structure, and sometimes removing a line of code has little to no effect, but the point is that in programming, a small change can have an unboundedly large effect, which is the definition of discontinuity.

(I believe it was this professor, who was my teacher for discrete math: http://www.cs.virginia.edu/~jck/ )
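
A toy sketch of the professor's point (my own illustration, not his): below, changing a single character doesn't make the result 10% wrong - it produces a completely different computation.

    # One character differs between these: `+` vs `*`. The second result
    # isn't slightly off; it's a completely different computation.
    def total(prices)
      prices.reduce(0) { |sum, p| sum + p }
    end

    def total_typo(prices)
      prices.reduce(0) { |sum, p| sum * p }
    end

    puts total([1, 2, 3])       #=> 6
    puts total_typo([1, 2, 3])  #=> 0, not "10% wrong"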


Wahoowa! This also explains the popularity of interactive platforms like Codecademy/CodeSchool/Treehouse, etc. Tons of handholding and pre-filled syntax. We describe them to our students as "coding on training wheels."

Cathy, a new frontend student, experienced similar struggles when starting her first real project - a simple HTML/CSS resume. She spent countless hours fixing minor typos and almost quit. It wasn't until she was reassured that this was _normal_ and that "real" programming was much different from Codecademy that she felt like she was truly learning. She wrote about her first month (with similar highs and lows as OP's visual) in this post: http://blog.thinkful.com/post/98829096308/my-first-month-cod....

side note: Erik (OP) is an incredible guy, we had the pleasure to share our experiences in Edtech and it's obvious that he truly cares about student outcomes.


> She spent countless hours fixing minor typos and almost quit.

It's an interesting anecdote!

When I was at uni, I noticed that a large number of "beginners" tended to fall into this category too - frustrated at minor typos/language idiosyncrasies. Those who had the mental strength to endure it ended up passing the class, while those who just gave up would almost inevitably change their major (and thus stop programming, I guess?).

But I think this is a result of poor educational methods, not of an inherent property of programming.


With respect to your CS prof at one of my favorite places (Go Cavaliers!), this is not an issue of discrete vs. continuous. If your structure can have either 9 or 10, but not 9.001, supports, it is discrete, not continuous, regardless of failure mode. And if you remove one of the 3 legs of your stool, you probably wouldn't have 2/3rds of its support remaining - but whether you did or didn't would be an issue of proportionality, not continuity.

There are a lot of jumbled concepts here, most of which don't matter anyway, because what you are talking about is the phenomenon of graceful degradation. In the physical world, both natural and man-made, almost nothing at the macro scale is ever perfect, so the best designs tend to be those that remain good enough under the widest range of circumstances vs. a more common software goal of being perfect under perfectly controlled circumstances.

As software gradually moves out from the walled garden of a single mainframe to fill the world with interacting systems spanning diverse machines, sensors, communications channels, data types, etc., design for graceful degradation becomes more and more of a focus for professional software architects.

Coding in the gracefully degrading way is much harder than coding in the "if even one of your ten lines is wrong, you crash" tradition. The fact that even the latter is so hard for us humans means we will need more and more help from machines that learn what to do without being explicitly told by us.
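
For illustration, a minimal sketch of the gracefully degrading style in Ruby (the names and the cache are made up): prefer the live dependency, but fall back to stale data instead of crashing.

    CACHE = { "greeting" => "hello (stale)" }

    def fetch_live(key)
      raise "service unavailable"   # simulate a failing dependency
    end

    def fetch(key)
      fetch_live(key)
    rescue StandardError
      # Degrade gracefully: serve stale/default data rather than failing.
      CACHE.fetch(key, "default")
    end

    puts fetch("greeting")   #=> "hello (stale)"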


I agree that "discrete vs continuous" is not the perfect way of expressing the difference; it's just an analogy. (But the structural support example is continuous. You could have 9.5 supports by adding a 10th support with half the strength, etc. "Amount of support" is the continuous measure.)

But it's not just an issue of graceful degradation. The fact that tiny changes in a program can have very large effects is a feature, not a bug. We grade programming languages on their ability to concisely express complex operations, and that conciseness necessarily means that very different operations are going to have similar expressions (e.g. subsetting rows vs columns of a matrix typically differ only by a small transposition of characters, but the effect is completely different).

You can write software that degrades gracefully, but one syntax error (or other "off-by-one-character" problem) is still going to kill the program. You can talk about running your program on a large set of redundant servers with no single point of failure, so that you can update them one-by-one with no downtime, and that makes you robust against even syntax errors. But that's not helping you teach novices how to write code.


There are continuous programming languages out there - DNA is one such language, I guess. But I don't think the discrete vs continuous nature of a programming language is what makes it difficult. It's more that a person's mind may not conceptualize tasks algorithmically, and switching to this frame of mind is difficult for someone who isn't already in it.


That's a good point. DNA as a programming language has to be at least somewhat continuous, or else evolution has nothing to optimize because every change has a random effect.


DNA is discrete. It can be precisely represented symbolically.


DNA is a lot less discrete than you might think. There are epigenetic factors and population proportions, for example.

But even considering DNA as just a 4-letter language with discrete characters, my point is that many, even most, small sequence changes to a genome (e.g. single-nucleotide variants) have small effects or no effect at all, which gives evolution a smooth enough gradient to optimize things over time. That's what I mean by continuous in this context. The opposite would be, for example, a hash function, where any change, no matter how small, completely changes the output. Hence you couldn't "evolve" a string with a hash of all 7s by selecting for "larger proportion of 7s in the hash function", because hash functions are completely discontinuous by design. But you can evolve a bacterium that includes more of a given amino acid in its proteins by selecting for "larger proportion of that amino acid in protein extract".
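
A quick sketch of that contrast (my illustration, with SHA-256 standing in for the hash): the first fitness function gives selection a gradient to climb, the second gives none.

    require 'digest'

    # Fitness A: count the "7"s in the string itself. A one-character
    # edit changes the score by at most one, so hill-climbing works.
    smooth = ->(s) { s.count("7") }

    # Fitness B: count the "7"s in the string's SHA-256 digest. Any
    # edit, however small, scrambles the digest - no gradient to climb.
    hashed = ->(s) { Digest::SHA256.hexdigest(s).count("7") }

    puts smooth.call("aaaa")   #=> 0
    puts smooth.call("7aaa")   #=> 1 (one edit, one step "fitter")
    puts hashed.call("aaaa")   # some count...
    puts hashed.call("aaab")   # ...unrelated to the previous one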


Unless you're doing Rails, in which case it'll be read as a magic method and guess what you meant :-P

Seriously, that was a major sticking point for me having programmed for a long time: going from "if you have not declared that identifier, game over" to "magic happens".
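
For the curious, that kind of magic is typically built on `method_missing`. Here's a minimal sketch of a dynamic `find_by_*` finder (illustrative names, not Rails' actual implementation):

    # method_missing intercepts calls to undefined methods and parses
    # the method name at runtime - "magic" from the caller's perspective.
    class Record
      ROWS = [
        { name: "Ada",   role: "engineer" },
        { name: "Grace", role: "admiral"  },
      ]

      def self.method_missing(name, *args)
        if name.to_s =~ /\Afind_by_(\w+)\z/
          key = Regexp.last_match(1).to_sym
          ROWS.find { |row| row[key] == args.first }
        else
          super # anything else still raises NoMethodError
        end
      end

      def self.respond_to_missing?(name, include_private = false)
        name.to_s.start_with?("find_by_") || super
      end
    end

    Record.find_by_name("Ada")     #=> { name: "Ada", role: "engineer" }
    Record.find_by_role("admiral") #=> { name: "Grace", role: "admiral" }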


I've been teaching myself off of online resources and 'magic' was what I hated most along the way. I can't debug magic. I've ended up digging so deep to understand things that I'm covering assembly now. It's painful, but going so far has made everything else make a lot more sense. Data structures are easier to conceptualize and will be easier to work with (for example).

But most people I know don't get this far when they self teach or do a bootcamp. They just know that, given a framework, they can build things but not how anything was really built. Sure it's effective to push out a product, but it makes diving into real programming pretty difficult. That's just my perception though.


You need to keep in mind that "magic" is only sufficiently "advanced" (or rather obfuscated) technology. But someone somewhere said something along the lines of "Things that work as if by magic also break as if by magic"... I've been looking for the reference ever since.


Nicely put. Not enough people dive to the lower levels.


We should start at the lower levels. It is all easy to understand if you build up from first principles: binary, logic gates, CPUs and assembly, then it splits with compilers on one branch and LISP and Smalltalk on two others and a bunch of brush and shrubs and time to retire that metaphor.

Unlike climbing Everest or understanding how the human body works the challenges in learning to program are entirely man-made! (Except for recursion, of course.)


I'm in a similar boat. Started with the training wheels of Codecademy and such, after having a basic working knowledge of HTML/CSS/JS and wanting to build database-driven projects and grow my skillset.

Picked up RoR and was quickly overwhelmed with a million unfamiliar concepts as pointed out in the excellent (and very similar) article "This is Why Learning Rails is Hard."[1]. That knowledge tree they show is one I didn't formally stumble upon until later, but throughout my progress I realized "hey, this concept is really part of the much broader topic of X." Then I'd go down a rabbit hole on X.

Before I found that tree though, I had already given up on Mike Hartl's tutorial once, and decided I really needed to have a functional grasp of Ruby and core programming concepts. From there I realized "Ruby/RoR on Windows is not ideal." Then went down the whole devops path and learned about things like Vagrant/Chef/Virtualbox, etc.

I also started picking up books on much deeper computing concepts to understand the lower-level mechanics of the magic. Like you, I went down to first principles and even a bit of assembly. I couldn't write any to save my life and my knowledge is still fuzzy, but I now grasp how the concept of data structures came to be, and more importantly WHY.

I recently tackled Mike Hartl's Rails Tutorial again. His updated version is a great improvement, and this time I actually understand the concepts he goes through. When a new one is introduced, I have enough of the underlying knowledge to at least have a sense of what/why something is, or what I need to Google to learn more.

I wish more classes online provided "deep dive" resources/links on things. Like, if Codecademy has an exercise on Ruby covering types, an eager student might really benefit from a deep-dive sidetrack into dynamic vs. statically typed languages, and a high-level view of what they should know.

My biggest gripe with the tutorials that are out there these days is that they cater to either absolute beginners or competent users. Wish they did a better job of trying to bridge the gap from absolute beginner to intermediate.

Another great example of that is the concept of design patterns. I haven't found many great beginner/intermediate resources on this, but as I've started learning more, I found myself saying "hmm, seems like lots of people do this in a similar way - I wonder why." Turns out some approaches to problems are largely solved issues for a majority of use cases, hence: design patterns. This got me down the whole path of software architecture and starting to grok some of the higher-level abstractions and ways of thinking in the abstract, which was tremendously helpful compared to just being given specific examples with no broader context.

[1] https://www.codefellows.org/blog/this-is-why-learning-rails-...


That is absolutely something that irritates me. I've just inherited a large RoR application, and the amount of "magic" and things by convention is driving me crazy. There should be answers to questions like "why is this the way it is?!"

On a side note, if anyone has some great resources for RoR, I'd love to have them linked. I suspect my inexperience is the source of my problems, and I'm welcome to any assistance any one would like to give.


The guides on the ror website are pretty good:

http://guides.rubyonrails.org/

Which bits did you find were magic? The bits of convention I can think of you'd have to know about are:

DB naming conventions - these are used so that it can do joins etc easily behind the scenes, they're pretty simple so not a huge problem I find.

Rendering at the end of controller actions - it'll render the template with the same path as your route - again relatively straightforward.

Class loading - lots of things are loaded at startup time, so that you don't have to include files - I have mixed feelings about this, it feels easy and simple at first, but could leave you unsure where code comes from or which methods you can use in which files (e.g. view helpers). Definitely more magic.

One other area which does lead to real problems is that Rails sites often use a lot of libraries in the form of gems - this leads to unknown, sometimes poorly maintained or inappropriate code being pulled in at runtime, and makes it far harder to reason about things like authentication if using a gem. This is my biggest complaint with Rails - the lack of transparency of code paths when using gems like devise, paperclip, etc. - but it is unfortunately quite common in web frameworks.

They actually got rid of quite a few bits of method_missing madness recently, I think, so that magic is gone at least (all those magic find_by_ methods are deprecated or removed, not sure which, as I never used them). I haven't found the conventions get in the way much, as it's something you learn once and can apply anywhere, but I completely understand why someone might object to some of the magic setup for helpers/rendering.
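
To make a couple of those conventions concrete, here's a minimal sketch (the model and controller names are made up):

    # DB naming: class BlogPost maps to the "blog_posts" table;
    # belongs_to :author expects an "author_id" column; has_many
    # :comments expects a Comment model backed by a "comments" table.
    class BlogPost < ActiveRecord::Base
      belongs_to :author
      has_many :comments
    end

    # Rendering: with no explicit render call, the show action renders
    # the template matching the controller/action path by convention,
    # i.e. app/views/blog_posts/show.html.erb.
    class BlogPostsController < ApplicationController
      def show
        @blog_post = BlogPost.find(params[:id])
      end
    end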


The routing system and associated view helpers can really get confusing.

For example:

    link_to @story.title, @story
You have to know that rails has some automatic routing based on the class of an object. If @story is an instance of the Story class, rails basically does this underneath:

    link_to @story.title, send("#{@story.class.name.downcase}_path".to_sym, @story.to_param)
There's implicit conversion of class names going on under the hood in a few places. It's all documented but it's not easy to find the documentation when you don't know what you are looking for.

The thing that really screws up people starting with rails is not understanding the various layers (html, views, controllers, models, http, etc.) and how rails puts those together. If you don't know how to do web programming with basic html and php, rails will eat you alive with its seemingly magical behaviors.


I have to agree - the path helpers are very opaque. It would probably do Rails well to generate an app/helpers/path_helper.rb file with the actual implementations in it.


`rake routes` will output the routes and paths. Appending _path or _url to the path will generate appropriate methods.


Sure. But that still doesn't tell me exactly which arguments they take. Or give me an opportunity to debug the code when it doesn't do as I expect. I realise that it just-works (tm). It's when it doesn't, it gets problematic.


It's actually pretty easy to tell the parameters they take if you look at the url for the route.

For:

    story GET   /story/:id(.:format)  story#show
You get

    story_path(id)
    story_path(id, format)
    story_url(id)
    story_url(id, format)
In practice it doesn't cause as many problems as you think, even in large applications.


When learning Rails, I found the DB naming conventions confusing enough that I wrote a blog post summarizing how everything is supposed to be named when I figured it out, since nobody else seems to have:

https://shinynuggetsofcode.wordpress.com/2013/09/30/conventi...

Like a lot of the Rails stuff, it feels like amazing cool magic when things just work. But then when they don't work and do something weird instead of what you expected, it feels like it takes forever to figure out why, what was named wrong, and what it's supposed to be named.


Convention over configuration is awesome, if you know the conventions. If you don't, it's all magic. At least with configurations, you can read them and get some pointers.


Aside from the things others have mentioned, which are all really good, there are some really good books on the subject.

Jose Valim's Crafting Rails Applications[1] is a wonderful resource, since it deliberately sets out to peel back the layers of magic. A lot of the techniques are ones I probably would not use in practice (storing views in the database and rendering them!), but they serve to elucidate the operation of the entire view stack. Really good stuff.

Two other good books are Rails Antipatterns[2] and Objects on Rails[3]. Neither of them has been updated in a long time, but the general principles will still hold. The former is more practical, the latter more theoretical; prescriptive and fanciful food for thought, respectively. Both solid.

1. https://pragprog.com/book/jvrails2/crafting-rails-4-applicat...

2. http://railsantipatterns.com/

3. http://objectsonrails.com/


If you're not an experienced Rubyist, I'd recommend reading David Black's book The Well-Grounded Rubyist. Unlike many introductory books on programming languages that focus on making you productive in that language quickly, it focuses on building a deep understanding of the language. When I later read the book Metaprogramming Ruby, which uses parts of Rails for many of its examples, I already knew many of the techniques thanks to David.

http://www.manning.com/black2/


Ryan Bates' Railscasts are great. Unfortunately he stopped producing new ones, but they are still a great resource:

http://railscasts.com/


Just wondering whether a distinction should be made between learning a framework (RoR, jQuery) and a language (JavaScript, Ruby). Frameworks do magic, languages generally don't.


> But if you're writing a 10-line program and you forget one of the lines (or even one character), the program isn't 10% wrong, it's 100% wrong. (For example, instead of compiling and running correctly, it doesn't compile at all. Completely different results.)

This is where the beauty/simplicity of some programming languages, namely interpreted languages (e.g., Python), comes in: if a bad line of code never gets executed, then the program itself will run fine. In other words, if the line is never called in the program, then you'll not know that the functionality that line provided was broken. In this case, the analogy breaks down a bit - and it also shows why certain languages are easier to learn than others (e.g., Python vs. C++).
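
A small illustration, in Ruby, which behaves the same way here: the interpreter checks the file's syntax when loading it, but a reference to something that doesn't exist only blows up when the line actually runs.

    def rarely_used
      undefined_helper("oops")   # NoMethodError, but only if this runs
    end

    puts "everything looks fine"  # prints; the bad line never executed
    rarely_used                   # now it raises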


This is a risk that you should be aware of when using languages like this and thus use them appropriately. To continue with the building analogy, you don't want it to take an actual fire to learn that all your fire exits are dead-ends.


I'd rather know when something is incorrect, rather than pushing to production and finding out later because someone else took that code path.


If you rely on your compiler to tell you that your code is correct, there are whole classes of bugs that are waiting to surprise you in production.

I think that many years of developing large applications in Perl were really good for me. Perl is compiled when you run it, so you get the basic-syntax check that you get with other languages. But it's also very lenient, so you learn through experience to get your logic right, test return values, and do all of the things that help make sure that a program which executes is executing correctly.


There is a difference between RELYING on the compiler to catch 100% of bugs and having an awesome type system that can take whole CLASSES of bugs and make them impossible to get past a compile.

Is a statically typed language more likely than a dynamic language to work correctly in production, if both have 0 tests? Yes.

Is either ideal? No.

Can both be improved by adding a few tests? Yes.


> Is a statically typed language more likely than a dynamic language to work correctly in production, if both have 0 tests? Yes.

I'm not sure I agree. "Work correctly" does not just mean "compile correctly". I would want to see a lot of evidence to back up any assertion that programs written in statically typed languages are less likely to contain logic errors that compile and run just fine but don't do what the programmer (or his client) actually wanted.

I agree that neither is ideal and that adding testing can improve any code.


> I would want to see a lot of evidence to back up any assertion that programs written in statically typed languages are less likely to contain logic errors that compile and run just fine but don't do what the programmer (or his client) actually wanted.

As certain assertions related to logic can be encoded into static types (especially in a language with a type system more like Haskell's than, say, Go's), while static typing can't eliminate all logic errors, it can reduce the probability of logic errors escaping detection in the absence of testing, since compiling a statically typed program is, in effect, a form of testing (limited to those assertions about behavior which can be encoded into the type system.)


> compiling a statically typed program is, in effect, a form of testing (limited to those assertions about behavior which can be encoded into the type system.)

Fair point. (Especially if, as you say, you are using a language with a type system like Haskell's, which to me is more like a program analysis engine than just a type system.)


I agree with you on all points, but the parent sounded like he was relying on the compile-time checks to determine correctness. I was making the point that that is a bad idea.


> This is where the beauty/simplicity of some programming languages, namely interpreted languages (e.g., Python), comes in: if a bad line of code never gets executed, then the program itself will run fine. In other words, if the line is never called in the program, then you'll not know that the functionality that line provided was broken. In this case, the analogy breaks down a bit - and it also shows why certain languages are easier to learn than others (e.g., Python vs. C++).

Actually, that's one of the pitfalls of interpreted languages.

You want to Crash Early & Crash Often [1] or you'll move along, merrily ignorant of a serious problem just because it doesn't get executed.

I try to solve this shortcoming of languages like Python with proper unit testing. It gives me the confidence that there's a decent coverage of the different code paths so that I won't learn about the problem in production.

[1] - https://pragprog.com/the-pragmatic-programmer/extracts/tips


>This is where the beauty/simplicity of some programming languages, namely interpreted languages (e.g., Python), comes in: if a bad line of code never gets executed, then the program itself will run fine.

That's not beautiful; that's horrendous. A program that might contain syntactic (!) errors has no claim to being a sublime mathematical construct.

I'd say it's "beautifully simple" when I can tell you with 100% confidence that my program will never, ever, ever experience errors of a certain type. Even better if I can tell you with 100% confidence that my program contains no errors at all (which is possible with proof-based languages).

Saying a Python program is beautiful because it can have hidden failure conditions is like saying that a poorly maintained gun is beautiful because it can fire when rusty (but watch out for explosions!).

I wish, when learning to program, that I'd been taught to write universally correct code instead of "mostly correct" code.


Wouldn't it be better to catch an error at compile time than throw an exception or have the app completely fail when a bad piece of code is run?


Depends on if you're trying to engineer fault-tolerant, robust systems, or trying to learn how to program.


Quite right. I was only mentioning it w.r.t. actual learning, not a more general use case. :/


> This is where the beauty/simplicity of some programming languages

Simplicity? Yes, probably (at least as long as I am writing the code and not debugging it). But definitely not beauty. I find this particular behaviour the ugliest part of interpreted languages. I may make a small typo, incorrectly capitalize a variable name or forget a quote, and nothing will tell me that my code is wrong or where exactly it is wrong - it will silently skip the error and happily show me wrong results.


Unfortunately this is a dark hole of horrible bugs just waiting to happen, not to mention the fact that interpreted languages are very liberal with silent type conversion. It is a nightmare to deal with in a large system written by careless programmers.


That's closer to the definition of instability than it is discontinuity.


"In mathematics, a continuous function is, roughly speaking, a function for which small changes in the input result in small changes in the output." [1]

I could clarify and say that the change doesn't have to be unboundedly large in absolute terms, but rather relative to the change in input. (i.e. a jump discontinuity from 0 to 1 is not absolutely unbounded, but it is relative to an arbitrarily small change across the jump.)

[1] http://en.wikipedia.org/wiki/Continuous_function
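
For reference, the formal version of that definition, which captures the "relative to the change in input" idea:

    f \text{ is continuous at } x_0 \iff \forall \varepsilon > 0 \; \exists \delta > 0 : |x - x_0| < \delta \implies |f(x) - f(x_0)| < \varepsilon

A jump from 0 to 1 fails this at the jump point: for \varepsilon = 1/2, no \delta works, however small the change in input.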


Then you might enjoy example 7 on page 18 of this document:

UNIQUE ETHICAL PROBLEMS IN INFORMATION TECHNOLOGY

By Walter Maner

http://faculty.usfsp.edu/gkearns/articles_fraud/computer_eth...


Well, that's probably where my CS prof got the example. Thanks for pointing me to the likely source!


I remember this one thing he pounded into our brains: "CS doesn't stand for Computer Science. It stands for Common Sense!" Heard ad nauseam in CS 340.


If you're lucky it won't compile. It's when it's 2% wrong that makes programming so hard.


In a general sense, it can be more difficult to reason about what effect adding or removing something will have because the skill is still being developed.


Yes, but that's true of any new skill that one might try to learn. I'm talking about specifically what makes programming harder than other things to learn. When things are continuous, you can at least experiment by making small changes and be confident that those changes will only have small effects.


I say it's "brittle"... those words discrete and continuous don't really apply cleanly, though I understand the idea somewhat intuitively.


Yes, "brittle" is how it's been described to me and how I describe it to others.


Many of the coders I know (which is mostly folk in their 40s) never learned from scratch. More often, we started in support roles and slowly worked into the code. First, learning to read it to help troubleshoot issues, then making basic changes, and slowly picking up more and more of a specific codebase. Once the basics were understood, we'd start making basic apps on our own, often while still supporting more complex apps. After a year or two, we'd be competent enough to do things from scratch, and then we'd move into a full-time coding role.

I know that few people learn like this these days. I've heard extreme negative criticism when I tell people that a few years of 2nd/3rd tier support on a large codebase is actually a good start to a coding career.

But I also never experienced the troubles described in this article. There were hard times, which would have been eased with today's online content. But it wasn't hard because of a downturn in confidence and the resulting "despair" that is described - it was a slow but steady increase in confidence and abilities.

So are people better off today? Maybe. They certainly are coding at younger ages... but I have no complaints about my path. I still was fully competent in my early 20s, did a startup at 26, etc.

So there are many paths to developing your career. I'd recommend people keep an open mind to all options, and do what works for them personally.


My first job was Night Operator at Transdata, a dial-up timesharing company in Phoenix, for the summer of 1969, earning $2/hour.

The fun part was that they turned off the timesharing service at night - but they didn't want to power down the Sigma 5 for fear that it might not start back up in the morning. There were occasional overnight batch jobs to run, but mostly I didn't have much to do.

I already knew BASIC, having punched programs on paper tape in high school to run on Transdata's service (which was how we got acquainted). I found a copy of the Algol-60 report at the office and thought it looked interesting, so I read it, tried out a bunch of programs, and learned Algol.

Then I found an assembly language and opcode reference for the Sigma 5, which was fascinating. There were plenty of blank cards to punch, so I learned machine language too.

I could have just sat back and done the night operator job and not much else, but man, there were such interesting things to fill in the rest of that time. And it's stayed interesting ever since.

Of course that was an unusual situation, and I suppose not one you could repeat now. After all, how often do you get a chance to have a whole computer all to yourself?


I was 8 years old in 1970. My dad was working for NCR in Waltham, Mass. He'd bring home a honkin' huge teletype and an acoustic coupler on the weekends. I taught myself BASIC and got hooked on Hamurabi [0], my first computer game vice.

How much paper did all of us go through back then?

[0] http://en.wikipedia.org/wiki/Hamurabi


A lot of paper!

Do you remember the first time you saw one of those newfangled "glass teletypes"? A terminal that printed out your typing and the mainframe's reply on a CRT instead of paper?

Was your first thought anything like mine: "How do I look back at what I was working on a few minutes ago? There's no roll of paper piled up behind the machine! How do I see my printouts?"


Yes, I used a video display terminal for the first time in a computer lab full of strange gear at MIT 40 years ago. The computer had to have its initial boot loader loaded from a strip of perforated paper tape. The terminal had characters "drawn" by the CRT tube's electron beam. If there were a lot of characters on the screen, the first few lines would start to fade before the e-beam was able to get back to the top of the screen. By keeping the overhead room lights off, we could see almost a full 24 lines of 40 characters. It seemed so advanced compared to the punch cards that I'd used for years before. Those were the days!


You know, I honestly can't recall my first glass teletype. I'm younger than you are so maybe it seemed like a natural evolution at the time.

I left the computer world in the late 70s when I discovered girls, music and partying. I came back to it in the 80s when I learned about networked computers. Even through all of the intervening years, my coding still sucks. ; )


My path was different.

Since high school I had been decent enough to implement basic algorithms and data structures. I couldn't build apps, of course.

Then in the first year of college I got hired by a small and shitty company doing web development, on a very low salary. My first project was to clone a popular dating website (Match.com). For me the project was overwhelming, as I knew nothing about what it meant to do real things. But I felt the pressure of delivering and I really needed the job, so my path from near zero to somebody hirable took one month, because that was the deadline for showing something working.

So basically for me the driving force was hunger - and I'm talking about both the attraction towards CS and the need for an income.

Of course, from there to somebody that can call himself a decent software developer, well, that took another 12 years. And the kind of projects you're working on matters a lot. At some point I worked for a startup that had crazy technical challenges, crazy constraints, crazy deadlines, crazy everything. For 3 years I worked on that and learned much more than I learned in the other 9 years.


This. I so agree with this.

I was lucky enough to fiddle with computers as a kid, so I kind of knew what I wanted to do, and I had years and years of "play time" in which I gently and accidentally introduced myself to programming, databases, operating systems, hardware, the web, etc. Often, things that I never thought would be useful turned out to be down the road. I also learned to be more fearless when experimenting with and understanding a system - whereas it takes some people years to get out of the "afraid to break things" mindset into the mindset of experimentation, trying things out, prototyping, poking things to understand them.

I think these are things that the sausage factory modality of education can't really provide, and working in the field will definitely give you this kind of thing too.

But the issue I see often is that the self-taught folks, or those who "graduated" (whatever that means) from support, aren't taken as seriously as a person with a BSc in CS from a "legitimate" engineering school.

I /did/ get a formal degree, and it taught me tons (won't comment on the whole is it necessary thing) but I think it would have been way harder for me to get a programming job otherwise even though I probably had adequate skills for an entry level job in many places.

I don't think that in many places today someone would consider training a support person to do engineering, for various good and bad reasons, and we need to think hard about providing more of the long-term learning-through-doing and on-the-job apprenticeships that other craftspeople get.


If it wasn't clear - I'm for people moving from support to dev, just acknowledging that it's more difficult than the traditional route currently.


I'm 37 and my first programming job after college was doing support & custom integrations for a larger product (which I wasn't allowed to change). I'd already been programming since 8 and knew BASIC, Pascal, C, C++, sh, Perl, Java, etc., but I had only taken a few CS classes and majored in English. It was a great chance to learn new things, e.g. SQL, and there was a ton of variety, autonomy, responsibility, and client interaction. Even the bad parts (Excel files, CORBA over firewalls, backwards intransigent IT departments) were great learning experiences. After maybe a year, having proven myself, I switched over to the web development wing of the company.


Amazing that there was a time that English majors could get hired for programming jobs. Nowadays your resume would be rejected by an ATS without ever passing before human eyes. You would be treated as a non-entity, incapable of offering any value.


Well, I'm not sure it was so different then. In my case there's a pretty good story behind it. I was working as a temp doing data entry for 1000+ page industrial catalogs, and then we ran a Perl script to generate QuarkXPress files while sizing/moving products within the 3-column layout to cut down on pages. It was sort of a killer feature for the monstrous CMS app they made, and we were the pilot project. But this was 2000 and there were lots of Unicode bugs around © ® ™ ” etc. Since almost every product had a table with 10ish SKUs and 2-6 columns of various dimensions measured in inches, there were a lot of problems---I'd say hundreds per page. Oh and also the script added extra spaces around every special character. We temps were kept around longer than expected so we could circle in red every bad character on every page of the catalog. That was bad enough that I started poking around, saw the app was in Perl, and tracked down the problem (a single regex). When I told my boss I could fix it, he asked me to write up a proposal, which I did, and they asked the original developer if it looked okay. So that's how I got the services job. :-)
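
Conceptually, a one-substitution repair of that sort might look like the following. The real script was Perl and I don't know the actual pattern, so this Python sketch with an invented regex is purely illustrative:

  import re

  # The conversion had added a spurious space on each side of every
  # special character; one substitution removes them again:
  def fix_spacing(text):
      return re.sub(r' ?([©®™”]) ?', r'\1', text)

  print(fix_spacing('Acme ™ Widget, 3 ” bore'))  # Acme™Widget, 3”bore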


That's pretty much exactly my recommended approach for both training people who are new to software development and bringing an experienced developer onto an existing codebase (or language, or platform, etc) they haven't worked with before. The only difference, per individual, is how long they need to spend at each level before moving on to the next.

When you get into real-world software development, it's much easier to work with an existing codebase than to start from scratch on a new one. If the application is in production, then you know that the existing code works, so you've got a baseline. You can study it to figure out how it works, and you can compare that to requested changes in the way it should work, and then you just need to figure out how to change the code to make the behavior change. That leads you to asking the right questions, focusing on the right code. It's like the hand-holding phase, except it's real code instead of play code. Figuring out how the code works also teaches you a bunch of stuff on the side, like how to do debugging in this new codebase, how to set up and work within your development environment, the practices and patterns of your team, etc.


This is exactly how I learned. I wasn't fully competent until my mid 20s, but there are many ways to skin a cat.


This is good to read. I am 7 months into a trainee developer role. The first 6 months were spent working on other people's code, and it was brilliant once they dropped me onto my first project; I didn't realise how much I was learning at the time.


Did you have people readily available to ask questions of and get correct answers? It makes all the difference if you have somebody to guide you through periods of being stuck.


This may be thinking back on things with rose-tinted glasses, but I learned to code in QBasic when I was 12 or so at a Boys and Girls Club after school and fell in love. It was entirely effortless and fun to me. I think the difference is that at that point I wasn't trying to program to enter some lucrative career and be a startup guy (where are these "coders" going to be once the market dies down and a new industry is hot? Probably trying to do that). For me it was something I loved immediately, and while obviously there are really hard problems, the coding part was effortless.


I first learned on a variant of BASIC myself when I was in elementary school, and it was effortless then -- but that's a profoundly different thing from learning the professional tools/design patterns/development styles to get hired in a specific domain. As somebody who picked it up again after a long break, the whole point was not 'getting hired in some hot new lucrative industry,' but 'god, please let somebody hire me to do this thing that is so much more mentally satisfying than the last few things I've done for a living.'

From that perspective, I absolutely understand the urgency here, and appreciate how this article talks about how the moment when the tutorials break off is when the real learning begins.


At the time "design patterns" were somewhere in the distant future, development style was something you had rather than something you learned, and the list of professional tools was really short. I'm sure this is part of what made it incredibly fun.

I think today's students would also have a lot more fun if they ignored all the opinionated garbage about which flavor-of-the-month checkboxes they need on their resume. Figure out what you like and get really good at it. Many top employers are looking for passion, pragmatism, and adaptability rather than specific tools and libraries.


I may have been too harsh in my assessment. But still, how many of these people are sitting down and working on some puzzle/problem/project they find interesting, versus saying "I know I need Rails and Angular to make web apps" and then just going through tutorial after tutorial? How many of them are actually interested in it in and of itself? I learned how to program very far away from the concept of writing an app that I could deploy to Heroku.


I'd always thought that programming might be interesting. Picked up BASIC for dummies and basically built my career from that moment. It's crazy how stuff like that can happen.


I was never told learning to code was hard. I never found it hard. I found it fun. I was introduced in 8th grade by a friend who brought to school a listing of a small program he wrote in BASIC, and it took off from there. I didn't have all the resources of the internet to help (this was 1980).

Like the article mentions, there's just a ton to learn. 6 months of Code Academy will help you learn the basic stuff (variables, loops, conditionals, maybe even objects and classes), but only experience will help you with databases, files, sorting, patterns, threads, memory issues, debugging, big-O thinking, cache coherence, performance, etc., and all the other stuff.

That comes from doing and doing over years and years.

Maybe it's something about certain people? I watched a guy with almost zero programming experience go through some tutorials online and then apply for a software engineering position at Google. From everything he said he seemed confident he was going to get the job and was going to be very depressed if he didn't. All I could think was "REALLY? You really think 3 months of study makes you a programmer ready for Google?" I didn't say anything. Who am I to step on his dreams? Maybe he'd somehow fake his way in.

He didn't

But it made me wonder: why did he believe that was all he needed in the first place? I think that's actually the more interesting discussion. There's a ton to learn in programming. You can learn some basic stuff quickly and maybe make some lights light up on your Arduino art project, but why would anyone think 2 to 6 months of study would make them ready for a programming job? Is that a new thing? Where did it come from?


That's because you, like all of us that learned in the 80s, were very, very lucky. We started in a world where writing good, production code and learning the very simple basics of programming were the same thing. I started with a ZX Spectrum. You could use simple machine code, or simple basic. Libraries? What are those? The one barrier to get good was when you ran out of memory, and had to switch to machine code and learn memory saving techniques, but by then, you were as ready as anyone.

I compare it to what we do today: my code uses libraries, that use libraries, that use libraries. Languages are huge in comparison. Sure, it's easier to do what we used to do 20 years ago, but nobody expects from us what we did then: even someone who is just learning wants to do more. This is what builds the despair phase of the article.

It's a well-known issue that affects both how we train new people and how we manage large pieces of software today, so it's widely talked about. For instance, the first talk of JSRemoteConf last night was all about this issue. Hopefully they make the recordings openly available soon.


I think a lot of that despair is industry self-inflicted.

I've been in the software industry for a long time and I can't remember a time when the industry created as many new buzz words as quickly as it does today.

Take something as simple as ASP.Net for example.

In just a few years it has gone through these iterations:

* Classic

* WebForms

* MVC

* MVVM

* Razor

Each name change represents little more than a new software design pattern, yet because it has a buzzword attached it becomes a new skill set.

Look at something as simple as IOC, which again is just another software design pattern.

Now every IOC container has become a buzzword and shows up as a skill requirement on job descriptions.

So here are just a few of the IOC container buzzwords that you might need to get on your resume:

* Autofac

* CastleWindsor

* EntityFramework

* LinFu

* Ninject

* PicoContainer.NET

* Puzzle.NFactory

* S2Container.NET

* Spring.NET

* StructureMap

* Unity

With so many buzzwords, things appear complex and overwhelming, but the reality is that if you've seen one IOC container you've seen them all; and if you haven't seen any, chances are it won't matter as long as you are good with software design patterns.
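
To make that concrete, here is a minimal sketch of the register/resolve core that every IOC container shares. Python purely for illustration, names invented; real containers add lifetimes, auto-wiring, configuration, etc.:

  class Container:
      def __init__(self):
          self._factories = {}

      def register(self, abstraction, factory):
          # Map an abstraction to a factory for a concrete type.
          self._factories[abstraction] = factory

      def resolve(self, abstraction):
          # Build the concrete instance on demand.
          return self._factories[abstraction]()

  class Logger(object):
      def log(self, msg):
          raise NotImplementedError

  class ConsoleLogger(Logger):
      def log(self, msg):
          print(msg)

  c = Container()
  c.register(Logger, ConsoleLogger)
  c.resolve(Logger).log("resolved via the container")

Learn that shape once and the difference between Autofac, Ninject, and StructureMap is mostly API spelling.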


I definitely get the sense that the industry has only accelerated its churn rate over time. Just looking at what's happening to Javascript in the past year or two gives me butterflies in my stomach: there is attractive stuff and necessary stuff, and it's only getting harder to tell the two apart.

While I wouldn't say that everything about old-school coding was great, it encouraged a first-principles engineering process - checklists, printouts, technical docs, etc. - that is eschewed today in favor of looking up some framework, fumbling your way through to half-understand what's going on, and leaning on the toolchain (which you also don't understand) to give you some reassurance that it isn't completely broken. This is discouraging for _everyone_, not just learners.


Having taught many non-programmers to start their programming journey, this article rings very true.

The "cliff of confusion" he describes is a function of the Dunning-Kruger effect (http://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect), which indicates that you don't know how bad you are at something until you get better at it. As an educator, the challenge is to make that cliff as unscary as possible and chart a path through the "desert of despair" so that you're not pushing too much at one time.

On the other hand, as a self-learner it's really really hard to get past the point where you've learned enough to know how much you have yet to learn, or in terms of the article, when you look over the cliff and see how HUGE your journey is becoming.

To a large extent, the article is an advertisement (in more ways than one) for guided learning, expressed in a pretty clear way.


Totally unrelated to this, but I was interested to see how often the Dunning-Kruger effect is mentioned on HN, since it seems like the comments on almost anything will inevitably yield some reference to it (the opposite, impostor syndrome, comes up quite often too, but has a more obvious and less interesting name); a quick search yields 730 results for "Dunning Kruger" over all time, and half a dozen in the last week alone... which was perhaps a little less than I was expecting, but still rather a lot.

Totally unrelated to that, Algolia's HN search really is magnificent - supremely fast, accurate, and even quite attractive. Impressive.


Totally agree. Helping novice programmers through that time of "Oh my god, I know nothing, I will never be good at this" as their world is expanding is crucial. I think having a support system can make a big difference here -- and it might last months or even years until you feel confident enough to work full-time, when you'll re-enter the desert (since the tools your job uses probably include at least one thing you're not familiar with).

One thing that I think separates senior engineers from juniors is a level of confidence and willingness to tackle the unknown and learn new things, tempered by pragmatism that they don't all need to be tackled and learned right away.


The Dunning-Kruger effect pretty much sums me up. I feel like I have a firm grasp on the structure and syntax of JS, but now I'm questioning whether I really do. I always feel awful, as I can't really implement anything, and am always frustrated when I can't get off the ground starting a project.

I just don't know how to actually _do things_ with code. So, in essence, I think I know the language, and in a sense I do insofar as I know how to write an if else statement, a while statement, declare a variable, etc., but I _don't_ actually know the language, because I can't do anything with what little I do know.

Do you have any advice?


Here are a few...

Build something, anything. Then set it aside for a while, come back to it and improve it. Reading code you wrote a couple of months ago will highlight very quickly the parts that are clear and concise and those that are not.

Pick an open source project that is interesting to you and improve the documentation. Writing clear documentation requires a depth of knowledge that surpasses just employing it.

Give a presentation on and/or tutor someone on a topic. Like writing docs, this requires being able to think clearly about the topic.


This seems like a symptom of learning to code for the sake of being able to code, versus learning to code because you enjoy coding. It's just as much work in an absolute sense, but the process in the latter case seems more effortless and fun.

There's a similar thing in the music world. Some people want to be good at guitar, others like playing guitar. The former get bogged down in despair, the latter fiddle around on their instruments every night without even thinking. Guess which ones end up getting good?

[edit] cholmon makes a good point; the enjoyment is more in creating things than in simply writing code.


Shortly after I began driving, my uncle taught me to drive stick. We went out a few times, I'd practice in the parking lot, and I was eventually able to get it in gear and drive on the streets. I was horrible at driving stick, stalling at intersections, but I was good enough at it to get by. It wasn't until I actually bought a car that had a manual transmission that I got good at driving stick. I was terrible because I had been trying to learn to use the tool (stick shift) for the sake of learning.

I learned to program in a completely utilitarian way. I had a problem that I needed to solve, I knew others had used programming to solve similar problems, so I learned only what I needed to know to solve the problem at hand. After using it this one time I began to notice all sorts of other problems around me that could be solved using programming. I developed my passion for coding because I understood it was a tool that could make my life easier.


This is exactly why most programming tutorials are terrible: they show you how to do calculator exercises and spend hours and hours on data types and syntax. It's like driving around in circles in the parking lot over and over again.

After struggling for years to 'get it' in my friends' cars, I learned how to drive a stick in a single afternoon after saying 'fuck it' and buying a car with a manual transmission.


I agree to an extent, but instead of "enjoy coding", I'd say it's more "wanting to build things", at least in my experience; I do enjoy coding, but really it's the building of the things that's my primary motivating force.

Or a similar modification of your music analogy: "Some people want to be good at guitar, others like making music".


Agreed; "enjoy building stuff" paints a more complete picture than "enjoy coding."


I feel like I'm in the "Desert of Despair" and I'm not quite sure where to go. So far I'm self-taught in C, C++, Python, Lisp (a few of them) and some shell stuff. I have the basic syntax nailed in all of those. I don't know any of them well enough to be able to write simple programs without constantly looking up StackOverflow articles or reading references. I haven't even touched GUI programming.

It doesn't help that I always obsessively look for the "best" or "proper" way to do something. I know this is a good virtue to have, but I feel like it also gets in the way of getting things done.

It's such a weird place. On one hand I feel like I know a lot more than before but if any of my friends ever mention how "good" I am at coding I am quick to correct them by saying I'm really not. I could really do with some sense of direction, but I suppose that's on my shoulders since nobody else can decide for me what sort of developer I ought to be. I'm not sure what to focus on.

I initially got interested in coding just because of how interesting and fun it was in itself. I want to continue to pursue it because I feel like it's the first thing I'm really good at. I always got mediocre grades in school, I didn't learn any instruments or have any hobbies, then I started learning how to code and it just clicked with me and for the first time in my life I had an idea of what I might do for a living.

Sorry if that strayed a little too far from discussing the article, I wanted to try and write my thoughts down.


Personally, I think this is the phase where what you need most is practice. There are practical things you need to learn that you just can't learn without running into them during a project.

Just keep practicing. Varied things. Challenging things.

You'll get there!


I agree with gambiter; this is the phase where practice comes into play. Also, studying the code of really competent and efficient programmers. I was in the desert of despair for years until I learned to stop being overly self-critical and had internalized the patterns of much better programmers. I still made a living in this phase. Then I found that I was a good programmer and entered the upswing of awesome, and it is awesome. I have the self-confidence now to know when to cut corners (sometimes you just brute-force it) and when to spend extra time (sometimes an elegant algorithm is called for).


> I don't know any of them well enough to be able to write simple programs without constantly looking up StackOverflow articles or reading references.

Most people who write code for a living never bother to memorize the syntax for all the tools they use. Figuring out the logic is the hard part of coding, having to look up the syntax for every other line shouldn't have much impact on your productivity.

If you want to memorize the syntax then there are a bunch of guides to doing that (e.g. Derek Sivers wrote something on this), but that's probably not actually your limiting factor.


I totally get you. I'm at the same place myself. I just decided the only way forward was to do 'hack nights' at my house and invite any programmers I know who find it fun. It keeps me focused on actually building out a project, we all get/give feedback, and I feel productive.

Also, learning in a vacuum can make it hard to really benchmark how knowledgable or 'good' you are. So getting some feedback in groups may help out.


TL;DR: Don't give up! It gets better, and so will you!

  I always try and obsessively look for the "best" or "proper" way to do something.
Don't. Do your personal best, and THEN look at ways to improve it, and in one facet at a time. Otherwise, you might not always understand why the "best" way is better. (You won't appreciate source control until you lose All The Things, but ... I'm sure you're already using Git. ;))

There are nigh-infinite blog posts about ways you can do X better (unit tests, standards, unicode compliance, date handling, names, design, CSS, layout, etc) -- don't worry about those yet.

=== Coding on your own ===

Consider writing code to solve programming puzzles, like Project Euler -- it can often start small, and then improve as you learn more.
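
For instance, the very first Project Euler problem (sum the multiples of 3 or 5 below 1000) starts as a couple of lines of Python, and you can revisit it later with arithmetic-series math as an "improve it as you learn" exercise:

  # Project Euler, problem 1: sum the multiples of 3 or 5 below 1000.
  total = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
  print(total)  # 233168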

Consider building a "simple" toy project in a web framework -- Django, Flask, etc. (I mention those as they are some Python frameworks.) There are often tutorials that run you through building something toy-scale. (Your own twitter, a TODO app, etc). Doing this will let you quickly build something with a simple UI, and simple guts, and then you can look into ways to improve it.

For example, I'd like to make a flash card game for my kids. (Real project, tired dad. ;)) I can start with some code that randomly generates problems, and shows them to a UI in a pretty way. (Large font, perhaps.) Later on I can improve it with things like letting people enter the answer, tracking how long it takes to answer, tracking scores over time (so that I can focus on the ones they have problems with), leaderboards, and custom decks. ALL of those are way out of scope of the initial project though.
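
The seed of that project is only a few lines; a minimal console-only sketch (Python, with all the UI niceties deliberately out of scope):

  import random

  # Randomly generate an addition problem; showing it prettily is a
  # later step.
  def make_problem():
      a, b = random.randint(1, 9), random.randint(1, 9)
      return "{0} + {1} = ?".format(a, b), a + b

  question, answer = make_problem()
  print(question)
  # Later: read an answer, time the response, track scores, custom decks...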

=== Coding with others ===

If you'd like more of a challenge, or in some ways less, try being a contributor to a larger project, rather than trying to write something totally from scratch. This is really important -- most of us are not writing Large Software solo. I might write a handy tool solo, but most of my day-to-day work (and the environment in which I do most of my learning) is maintaining a larger codebase.

Existing projects already have a build framework, a code review protocol, RPC/API infrastructure, GUI, etc already done for you. They have a test suite that tells you if you broke something. Now, your improvements are things like, "Add an error message when this happens", or "Add support for a new API call...". It's MUCH easier to add something similar to what someone else already wrote (once you understand how it works by reading their code ;)), than it is to have to write all of that infrastructure yourself. There are also OTHER developers who have domain knowledge of the codebase, and that you can ask for help -- this is VERY useful and important for our learning as programmers.

=== Direction ===

I have no idea how to help you there. ;) It's easy to feel inadequate in this profession, as there are always smarter/faster people. Since what we can create is endless, it's hard to PICK something -- that's why I suggest doing a tutorial-based project in a web framework. You can then use that to focus on the Python side, or on the UI side, or as inspiration/guidance for making your own things.

Also, don't worry about looking at Stack Overflow. Definitely do that if you're stuck on something, or you forget whether something is possible ("Can I declare a dict plus its members in Java the way I can in Python?"). It can at times be hard to tell when the answer you find on SO doesn't match the problem you really have, though, which takes practice.

I'd be happy to talk more via e-mail if you like. I've updated my user profile to include a poorly-obfuscated rendition of my gmail address. :-)


For me, by far the hardest part was finding the time. Learning to code on nights and weekends, when you've spent your most productive and focused hours at your job, is a nightmare. It wasn't until I was actually hired as a dev that I started to hit a steep learning curve, and I attribute much of that to spending 50 of my best hours/week coding, rather than maybe 20 of my worst.


I have the same issue. I've set aside 10pm-2am every weekday to learn how to code, plus entire Saturdays and Sunday mornings. I can, at most, manage 20-25 hours, usually when I'm already bogged down.

I've been toying with the idea of quitting everything and going all-in for 3-6 months.

Would that be 100% retarded or just about 70% retarded?

I can live with 70%.


I had the option to quit to work on learning to code for 3-6 months and I chose to bite the bullet and stay employed. I woke up at 4am to get in a few hours of coding before work and kept my working hours to a minimum so I could have a few hours after work to code. Weekends were 12+ hours of coding both days. It doesn't leave a lot of room for social life or other activities, but the job search was a lot less stressful knowing I still had a paycheck coming in instead of burning through my savings. YMMV!


You'll want to account for this all-in time on your resume. (Unemployment gaps are a red flag to companies.) You could put together a company of your own, for example, and frame all your study time as product/service development.


>Would that be 100% retarded or just about 70% retarded?

I'll go with "less than 70%". Or at least I hope, because that's what I did.

Coming from an Economics background (with a minor in Statistics), I took a job as a "data analyst" at a software company out of university. It was fun, but I relied on working with a programmer to get even basic things done.

Then came the "Data Science" wave. I thought, this is right up my alley! Except I needed to learn to program. I tried doing the tutorials and academies online, but was continuously stuck in the hand-holding stage. So I did the opposite of what is often recommended around here: I quit my job and enrolled in a 2 year Computer Programming course at a college. I'll be finishing up this spring. The enforced, formal structure and discipline has been a boon.

I feel great about having done it. I still feel pretty green, but I'm probably somewhere in the middle of the "desert" now. I credit school with having got me that far.


You'd be an excellent candidate to get a lot out of a coding bootcamp. If you have like $10k in savings or money you can borrow, you'd be in an excellent position to try something like that.

PM me if you'd like to talk about my experience with one.


If you are in a position to quit, I'd put it at 0%.


It depends. What are your goals, what is it you want to do with coding? i.e. Why are you teaching yourself to code?


Go to Latin America or SE Asia if you do. Your money will go a lot further.


From the sounds of it, you weren't at Erik Trautman's "Job Ready" point when you were hired (and weren't going to get there without being hired). How did you write your resume and handle interviews?


Very true, I had probably just entered the "Upswing of Awesome".

I had quite a bit of experience with relational databases from my previous job, which I leveraged pretty heavily on my resume. As far as interviews, I was very honest about what I did and didn't know, and passed a coding test by pulling an all-nighter (and taking a vacation day) to learn a tiny bit about Django.

The offer came from a company I had initially reached out to about an unpaid internship, which eventually turned into a full-time job. We agreed to a three-month trial period, and they kept me on afterwards.

There was no doubt plenty of luck involved in the whole process. I sent dozens of cold emails, and offered to work for free several times (thankfully never had to).


Yes, at the end of the day learning is hard as heck. YMMV, but I find if I have a solid hour in the morning I can gear up pretty quickly.


Motivation.

If you don't have the motivation you'll never be able to do it. If you can't sit in front of a computer for 8 hours a day reading documentation and hunting for syntax errors, you're not going to be able to do it. If re-writing algorithms doesn't give you an intrinsic satisfaction, you're not going to be able to do it. No amount of "everybody can code" tutorials is going to help. They should all be "how to find the motivation to keep coding" tutorials.

Unrelated, but related to the article: the word sociopath gets misused a lot and this article is no exception. http://www.thefreedictionary.com/sociopath


Motivation can work for a while, but ultimately how I see it comes down to discipline.

Motivation will fail you when you are left with those last 10% of a project that feel like the first 90%, but now with uninteresting tasks like tweaking the hell out of a UI, fixing all those bugs resulting from code optimization in obscure cases, implementing database integrations to assure backwards compatibility with earlier versions or some such shit that is impossibly uninteresting but required to finish the project.

Then discipline comes into play, and that is something you have to learn systematically, for those times when you don't feel motivated at all to continue and just wish to quit it all.

But you are correct: you have to get satisfaction from the process. Maybe the motivation is to see the end result. But still, there is that phase where all hope seems to be lost, inspiration and motivation are nowhere to be seen, and all that remains is to grind and decide to follow through.


Not sure why you got downvoted because you're absolutely right. The hard thing is the commitment required.


I have a feeling a lot of why people fail with these online resources is motivation, and I think it's why Khan Academy, while it is a forward-looking treasure of education, has failed to achieve a revolution despite dramatically improving access to education.

A lot of people need external mechanisms to keep themselves motivated, such as parental pressure, peer pressure, shame, and so on. Once they leave college, most people never take on a tall order of knowledge ever again, and they let their existing knowledge rapidly decay. And then they're going to tell you a story about how everything they learned in college is useless, and how jobs want something entirely different.

Whatever human nature is going on inside of them that explains the outwardly visible behavior is part of the cliff people are walking toward.


As a self taught developer, I'm so thankful I first thought I wanted to be a designer. HTML and CSS made sense to me (the backend, even frontend js, forget it. I'd try to hack around it) and I sought deeper knowledge on the subject. Build systems and preprocessors were a gateway to command line tools and good organization skills.

Once I was thrown into full stack development, I at least had something to lean on while the backend programming caught up with the frontend. Then I started applying the things I learned programming backend applications to frontend applications (where do I keep all this state?). I think I spent a lot of time in the desert of despair wrt. certain types of programming but was never entirely there.

I think it's important to have something to feel confident in while you struggle.


I am part way through your trajectory. I was a designer first, fell in love with html/css, eventually gave in and learned js, started to really enjoy that and have now been edging my way to the back end for a while. Now I'm job hunting at the same time and definitely feeling that desert of despair. Was glad to see your post, gives me some hope!


I think one effective way to navigate the "Desert of Despair" is to join another project first. That gives you the focus you need and a set of practices and libraries/languages/frameworks to learn deeply. And as you learn how senior people in the project make their decisions and hear stories about the history of the project, you gain context and valuable information for the next stage.

Much of what I've learned I learned by working with or lurking in the communities of open source projects.


Basically, joining a project with senior programmers is akin to an apprenticeship in other crafts. If you have the opportunity, because you chanced upon somebody willing to do it with you (especially if for free), then take it.

But most people won't get that opportunity, or aren't smart enough to go looking for it.


I feel I hit the job ready part and started to work as a software engineer and I think I'm about as productive as I cost the company. Actually I feel a little better than that and my colleagues a little worse than that, but I guess that's how everybody in the middle feels.

The thing is, after that job-ready upward slope there is the next downswing. You're able to get something done, but what you envision is not what the company needs. You also realise that to be productive you need a lot of skills you never learned in school at all, e.g. packaging, shipping code to customers, versioning, actually creating a tool that another person can use, reading other people's code, and finally using the tools your coworkers build, which are only marginally more productive than doing things manually or writing your own tools, and only after learning the arcane ways in which they are designed (much like the tools of your own design, and quite different from the billion-dollar applications you are used to in daily life). Also, you will really become slower, because all the meetings and compromises drain your energy more than the coding, but there is no way around it if you work in a team. Is there another high coming after that? From looking at my coworkers it seems this downswing will last.

tl;dr: the job-ready high is not the end.


Yes, there is another upswing after that. But be aware that some of the on-the-job skills are not unique to software dev.

* How to run a meeting.

* How to allocate maker time.

* How to collaborate with other people.

* How to understand business needs.

All of these are skills you need in any white collar job. Learning them is a part of any job you might do.

Even the software specific ones like:

* Packaging

* Shipping

* Versioning

* Tooling

are all going to be job-specific. You'll find some of them totally non-portable and some of them will transfer just fine.

The good news is that by now you'll have begun to hit a groove when it comes to knowledge acquisition and learning this stuff will get easier.

The next high is when you start to recognize Process Patterns in the jobs you take on and how best to handle them, or even which ones to avoid like the plague, either by effecting change in the company or by changing employers.


There is a difference between knowing how to write code and being a "software developer". I think a lot of people confuse the two. I am an Engineer (not the software kind) I write code almost every day in my job but I do not identify myself as a programmer. For me code is a tool to be used to solve a problem, like Calculus or Linear Algebra.

When I was at Uni we were required to take two subjects through the computer science faculty, "Intro to Algorithms and Data Structures" and "Fundamentals of Software Engineering". The first subject was hugely interesting and I "got it" right away. It was basically teaching you how to represent a problem computationally: we learnt about binary and floating point representation, what a stack was, that sort of thing. This is heap sort, this is bubble sort, this is O(N), this is O(log N); it clicked for me.

The second subject, not so much. It was all about unit tests, object-oriented programming, the waterfall model. We had to write an essay about the Ariane V failure. The lecturer was really big on a guy called Bertrand Meyer and his ideas about design by contract. The subject was really hard to engage with and almost caused me to lose interest completely. It was probably a good subject if you were planning a career in software development, but for a first-year engineer, not so much.

As cruel as it sounds, I think the best way to teach someone to code is to explain algorithms and data to them. "Here's a Ruby tutorial, try to follow along and you too can be a programmer" is dishonest, and in my opinion not learning the fundamentals up front is what causes that "chasm of despair" the article alludes to.


Great post.

Something to add: this cycle doesn't happen only once in a programmer's life.

IT'S RECURSIVE!

Every time you start with a new language or tool or job, the cycle starts again.

But ANOTHER step is added: overconfidence and blind arrogance, OR indifference.

This is revealed when somebody dismisses the new language/tool/programming job because it is more-or-less similar to previous knowledge: they learn the basics of it in days or hours and think they've nailed it again. It's possible to stay in this new honeymoon for a while (or sometimes forever), but eventually it can suddenly hit you, hard, that you are NOT A MASTER OF 2 SKILLS: you are (maybe) a master of 1 and a noob at the other.

The arrogant and indifferent mind also reveals itself in condescending thinking towards those outside the "guild" or "below" us. It is very easy to believe you are in the later steps when you really are not.

The key point in this article is the problem of "you don't know what you don't know". For years I imagined I was a decent developer (and could have said similar things as others in this thread), but it is only in the last 2 years that I realized how misplaced my understanding was. I was on "The Cliff of Confusion" and very happy about it ;)


> Something to add: this cycle doesn't happen only once in a programmer's life. IT'S RECURSIVE!

I was going to say that this was nothing like my experience of learning to program, but it does quite accurately represent the process of adding new skills these days.


This article is a really odd read for me.

What sticks out is the idea of learning to code with the idea of becoming 'Job Ready'. As others have said, seemingly treating the process as a means to an end rather than a journey in and of itself.

No one learns to walk in order to hike up mountains. We start out with a fuzzy basic idea that it'd be cool if we could just move across the floor a bit and get closer to somewhere. And oh, isn't it fun to watch the world fly by!

Programming has always been that way for me. Making a computer print Hello World on the screen, and then draw a circle, and maybe calculate some primes, and so on... the entire process is learning. You're always learning. In those early days I had no magazines, no Internet, no peers to compare myself to... I just sat at a screen and tinkered, cobbling together bits from various scripts and tinkering.

Mastering it - now that's a different kettle of fish entirely.

Maybe it's some sort of capitalistic artifact, the drive to push faster and harder and more efficiently. Not being able to sit back and just, be, without constant comparison or anxiety.


Really good discussion of what falls apart when these "anyone can learn to code!" tutorials leave you high and dry, and how to get past that next huge hurdle of self-sufficiency.


As someone who is currently learning how to code, the author gets it mostly right.

For me, the hardest parts of programming as a beginner - understanding OOP, data structures, etc. - didn't really 'click' until I stopped reading tutorials about them and started writing my own programs. The idea of 'objects' and 'instance variables' was mind-bogglingly confusing at first, but once I stopped worrying about how to make sense of them, the concepts somehow just fell into place.
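
(For anyone at the same stage, here's a minimal Python sketch of the idea that eventually made it click for me: an instance variable is just data that travels with each object.)

  class Counter:
      def __init__(self):
          self.count = 0   # instance variable: each Counter gets its own

      def tick(self):
          self.count += 1

  a, b = Counter(), Counter()
  a.tick(); a.tick(); b.tick()
  print(a.count, b.count)  # 2 1 -- two objects, two separate states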

I've also been trying to learn French simultaneously. The process was somewhat similar - taking a few Duolingo lessons and thinking that 'hey, I can do this!'. Then I read some actual French prose and everything seemed impossibly difficult. Things didn't 'click' until I started living and breathing French.

It's the same thing with coding.


> start writing my own programs

Indeed, you can only get good at programming by doing it.

As someone who learnt programming in the '90s, one of the difficult things nowadays seems to be that there are so many languages, libraries, frameworks, hypes, etc. Of course, if you know your CS and have experience, most of it is variations on common themes. However, I can imagine that it can be very difficult to focus on one thing and learn it well. There must be many copy & paste programmers out there who never learn anything in depth.

At the beginning of the nineties things were much simpler. If you had a home PC (obviously without internet), you could get started with QBasic, or shell out some money for a compiler and get Turbo Pascal or Turbo C++.

I did quite a bit of Turbo Pascal programming at some point and it was all very understandable. A simple language, a small standard library that's probably all that you'll have, good documentation, and an IDE (which had a very nice debugger and profiler). And you just crafted tools with that.


I actually learned HTML, CSS and JS way back when I was in 6th grade. This was pre-dot-com times and there were only a handful of resources. You could understand HTML and CSS in a day because there were so few HTML tags and requirements. 'Frameworks' were something that didn't even exist.

I somehow stopped my learning process before I hit 8th grade (around the same time I discovered that the opposite sex exists). When I picked it up again recently, the sheer number and complexity of frameworks and languages itself was daunting.

I can't imagine how hard it must be for someone who hasn't had a lick of coding experience. I could at least build a good looking website in HTML, CSS and simple JS before I started learning how to code.

It's damn tough and it has given me newfound respect for top coders. I work in marketing in my day job, and honestly, you could teach someone to replace me within a few weeks.


I have been coding for a long, long time. Since the 80s. But I got started with the idea that I wanted to 'build' something...I think it was a randomized dice roll or something. Having something you are trying to actually 'make' will cause you to learn what you don't know, and keep going.

It's no different than saying 'I want to build a tree house', as opposed to 'I'd like to learn how to do construction' or 'I'd like to understand how to build with lumber'. The first statement will drive you to figure out or learn what it takes to make something tangible; the second two statements are just nice ideas, easily discarded when things get difficult.


And math, really. Reading proofs is easy, writing your first proofs is really hard. Most people learn by doing, not by watching.


Yeah, that too. I have a humanities background, but I was always keen on mathematics. I know this is not true for a lot of my peers who studied the arts.

Really makes 'anyone can code' sound more like a marketing slogan than an evidence backed statement. If your math and logic game is weak, you'll have a hard time with anything beyond the most cookie cutter PHP code.


"To know but not to do is not to know."


It's only hard once you realize the commitment required, partly due to the pace of innovation, the depth and breadth of information available, and the fact that you're competing with the entire world, not just people in your city/state, with nobody regulating the influx of competition. A significant chunk of your life will be spent looking at a screen, and there's a chance you could end up with a crippling case of carpal tunnel. It's a sacrifice and most people won't make it. In the same amount of time you spend RTFMing, you could have learned to be a brain surgeon, a rocket scientist, or a lawyer. And once you have it all figured out, half of everything you learned gets flushed down the toilet because there's some new platform. If you think it's hard and you're not enjoying yourself, don't even bother; there are easier ways to make money.


It was never hard for me. I started by typing BASIC game listings from magazines into a ZX Spectrum. Then learnt LOGO, BASIC and Pascal at school. And then taught myself QuickBasic in the army. Started a CS degree but dropped out after learning how depressing Z-routines in COBOL were. Got a job. Taught myself Assembly. Got a job writing tariffing engines for a cell phone billing system, learnt C, Informix and Visual Basic. Then Java, .Net, VB.Net and C#.

I loved every bit of it. And still do.


C'mon, all of those things you listed (brain surgeon, rocket scientist, lawyer) are WAY harder and more expensive/time-intensive to get into than software engineering.

If you feel like your knowledge is getting flushed away with a new platform, then you've been learning the wrong things.


Really? How is being a brain surgeon more difficult than being a computer surgeon? Understanding how a CPU completely works, at the assembly and even the hardware level, is FRIGGING HARD. Computers are very complex and huge beasts of logic to really understand, and the best programmers have to understand a huge range of technical skills and systems in order to get the results they want.

I have no idea how much stuff a brain surgeon has to learn, but given that there are people who are brain surgeons, it cannot be so much harder than being a really competent programmer.

Think of John Carmack, for example. That guy is a friggin' beast at learning new systems and producing working code. That is really, really hard to do. It requires an immense amount of thinking and applying, and sitting in front of computers while balancing your body and health at the same time so you don't kill yourself in the process or go mentally insane and just give up.

Oh yeah, but Carmack is a rocket scientist too... so, maybe not the best example.

But putting those jobs above really competent programmers is just stupid. If you want to be the best at what you do, on any life path you take, it will take all of your effort, and then some. So why compare? It's all about doing what you love.


>Really? How is being a brain surgeon more difficult than being a computer surgeon ? Understanding how a CPU completely works...

Well, we know everything about how a CPU works. We don't know everything about how a brain works.


Are there really no longer any open and relevant research issues around this topic? Is humanity's understanding of CPUs, how they work and how they should work, all wrapped up and complete?

Not really my field, so I truly don't know. But I'd be a little suspicious of someone who claims there's nothing new to learn here.


I don't think he means innovation, which seems to be what you are implying. If you take a current working computer system, there are no "mysteries" in the existing hardware where the hardware designer just threw up their hands and decided to hope it would work. Sure, when you throw in environmental factors there is certainly unpredictability in terms of hardware failure, but actually understanding what the system is doing? We know what it does. With something like the human body, it doesn't seem that we can be nearly as confident.


Sure, but I think that definition makes the difference in complexity a bit of a contrivance. You're deliberately excluding the things that make CPUs complicated and interesting, and then concluding that they aren't as complicated and interesting as something else.

The other thing is that while the brain is highly complex, that doesn't mean that people who work in it have managed to master something more complex than CPUs (or house wiring, for that matter). They may simply not really understand what they're doing to the same extent.

To me, the thesis in the original post is this "[if] there are people who are brain surgeons, it cannot be so much harder than being a really competent programmer."

I tend to agree, because I think that some types of programming push people's mental ability and sheer stubbornness past the point of human ability. In short, it will take all you have, and there will still be things you just can't understand or do.

If you define the task as "the things that we understand and can do", then by definition it is not equal in complexity to the brain, but like I said, I think the statement is a contrivance.


This, exactly. CPUs and their insides, the hardware, understanding that, and then understanding the whole software stack that runs on that hardware. I mean _really_ understanding: to the degree that if one bit were off in RAM, you could completely trace it all the way from the application level to the hardware level.

Almost nobody can do that. The hardware alone is so complex, these modern CPUs have many BILLIONS of transistors packed so tight that it is impossible for us to even fix them. So we just have a vague understanding of what is going on when we program, but really, we have no clear picture. But the best programmers out there, they have this map of the computer in their heads and the systems, and the better you are, the better the map in your head is.

Think NASA-level programmers. They truly have to know how the system works, and yet most probably even they cannot understand the whole stack, down to transistor-level operation, and below that even in some really extreme cases where the systems overheat and there is magnetic bit flipping and other obscure stuff happening.

An immense amount of work equals an immense amount of commitment, which equals hard. A really competent programmer who knows how to truly take advantage of the machine is really rare, and even then the competence is down to some very specific domain, like graphics programming, systems programming and so on. So it is very hard to achieve, and very rare.


I agree with you, and I think that programmers dismiss the complexity and difficulty of what they deal with far too quickly. I'm not saying that all programming is hard, but I do think that hard problems in software contain challenges that will take all the raw intelligence and hard work a person can have and then some.

There are plenty of other fields that do as well, but software absolutely belongs in the mix.


Being a computer surgeon rarely involves much physical dexterity and typically lacks the urgency/pressure that being a brain surgeon regularly entails.

Comparing "computer surgeon" to brain surgeon is like comparing sudoku to racquetball.


Also, in 95% of cases, nobody's life depends on your software.


There's no version control for brain surgery.


Those people are smart but there's a textbook to follow, a clear path to graduation. I didn't mean to start a war of the professions, I'm sure there's already a Hacker News thread for that.

My point was, you can teach yourself a half-dozen programming languages, operating systems, databases... (which is inevitable for most developers) or you could have spent that time collecting diplomas in academia. This is not really a new concept, I read it somewhere else. Any profession that demands a lot of ongoing learning, I think people will quit because the effort may not seem worth it.

I agree you have to learn the "right" things but that takes experience and strategy. That's an interesting aspect of all of this. When everybody says "iPhone" you might bet on Android. Everyone says "Google Glass" and you might bet on Unity. If you have a crystal ball, maybe you're the next Warren Buffett ;-)


Most of those other professions require ongoing learning, too. Doctors and lawyers have continuing education requirements for licensing, scientists have to constantly be reading journal articles, attending conferences and keeping up with the latest developments in their field.

I understand the point you're highlighting, but there are very few interesting jobs where you get to say "Ok, now I'm done learning and I know everything I'll ever need to know to do this job perfectly."


Kind of surprised you threw "lawyer" in there.

Of course, there's a bit of a difference, in that you can call yourself a programmer even if you can't program. Whereas you can't call yourself a lawyer until you've passed the bar, which almost always involves attending 3 years of law school. So it's not really an apples to apples comparison. There is really no barrier to entry to programming that sets a minimum bar. So if you're comparing something with a minimum education standard with something with no minimum standard… well then yeah.

But personally, I don't think the required training to become a lawyer is anywhere close to as rigorous as what it takes to become a brain surgeon or rocket scientist, and if you look at typical pre-law majors, they aren't as difficult or rigorous as common undergrad degrees for software developers (CS, math, engineering, physical sciences). Let's face it, people don't drop out of poly sci because physics would be an easier major with more time to party.


Why programming is hard is why everything is hard.

You’re not going to get good at anything unless you work hard at it.

All these hand holding sites are worthless because all the stuff on them is so easy. But they are popular because people are always looking for the easy option.

Want to be a good programmer? Buy a well-recommended book and spend 3 hours a day, 7 days a week, trying to write code. Start with hello world and then build from there.

After 3 months of that hard grind you'll either have some idea of how to write code or you'll find you don't have the aptitude to be a programmer.

The bad news is even if you do find you can write code, the hard grind has just started.

You'll have to repeat the process for other programming topics like data structures, programming patterns, testing strategies, understanding database design etc etc.

Move forward five years and you're well on the way to being a well rounded developer.


I think one big problem is expectations. People think they can learn a lot of programming in 3 months or a year or something. I think it takes a lot of time, unless you happen to be "hardwired" for it. As someone wrote, there are a lot of developers coming out of an education who still have a lot to learn. That means you might need 3 years of education and 1-2 years of practical work experience to actually learn it. Or you could start when you're 8 and do it by yourself for over 10 years; when you're 20 you'll probably know a lot about programming, or you'll have stopped already. Try to explain to people wanting to learn how to code that it might take 3-5 years; then they can choose if this is something they want. Or sell simple 1-year courses, but explain that they will only learn a small subset.


From my experience as someone who is primarily self-taught, I think that having a more experienced programmer who can act as a mentor is vital to one's own growth as a developer. My first experiences programming in a professional capacity consisted of writing VBA macros to either automate tasks like moving or reformatting data, or in one instance, run an iterative algorithm that could not be expressed simply with Excel formulas. Looking back, I was stuck at that level for several years, until I was given some new assignments that were significantly harder than my previous ones, and I had no idea where to begin. I discussed some of my current assignments with a colleague who had some programming experience, and he gave me a few basic lessons (in VBA) about object-oriented programming and communicating with SQL databases using the OLEDB library. Based on this knowledge, I set out to build some advanced, database-enabled spreadsheets, relying heavily on Google, Stack Overflow, and various blogs whenever I hit a snag. As I progressed, I would bounce architecture ideas off my mentor and he would give me some topics to research in more depth. I would then go back to my online resources to figure out how to apply these new concepts to help solve my problem. As time went on and I became more confident in my skills, with the advice of my mentor, I moved from Access to SQL Server and from VBA to C# and the .NET Framework, eventually reaching a point where I became a self-sufficient programmer developing full-scale applications used across my firm.

For a motivated student, I think this method of teaching can yield tremendous results. With the vast amount of detailed information out there from tons of easily accessible sources, having a more experienced mentor create a path for the novice to follow but also letting the student figure out the implementation details on his/her own can be very rewarding. It allows the student to develop strong problem-solving skills within a smaller context (so the student doesn't get overwhelmed with indecision) yet also provides a support system when the student truly is stuck on a problem. I'm not suggesting that this is the BEST or ONLY way programming should be taught, but for someone like me, who learns best by doing, it can be a great way to get started in the field.


For me it was never hard, because I started as a kid, on my own, exploring this new world at my own pace and on my own terms, and it was amazing and fun. That's how kids learn. Learning the grown-up way is, on the other hand, harder and takes time. For coding you actually need to master 2 separate, non-trivial skills: first, learning how to think about and solve problems in a certain way, and second, learning the particular language. I think the main problem is that people don't give themselves enough time for the first, and concentrate only on the second.


I've had the pleasure of co-teaching Rails classes with Erik Trautman (OP) and he's a smart, dedicated guy. He's written a thoughtful article but underplays a key point. Learning to code would not be "so damn hard" if there were better learning resources. There's too much "learn to code" crap for beginners (as Erik points out), and not enough resources for advanced education (the "resource density" in the "Desert of Despair," as he points out). Learning to code would be easier if someone produced better advanced tutorials, books, and courses. I've been doing this for two years with the RailsApps tutorials [1]. And my "fluffy cat" book, "Learn Ruby on Rails," [2] provides guidance for writing apps from scratch without tutorials. Erik's article would be totally irrelevant if authors and teachers delivered better educational content. Erik's trying to do it with Viking Code School, and I've been trying to do it with my own writing, too. It's educators and authors that have to get smarter and work harder, not learners.

[1] https://tutorials.railsapps.org/ [2] "Learn Ruby on Rails" on Amazon: http://www.amazon.com/dp/B00QK2T1SY


No. No amount of resources will help with the kind of mental gymnastics required for programming (manipulating abstractions). Resources will help once you have that ability, but I am not sure how trainable the ability itself is.


I always thought the "Desert of Despair" was the point at which you should find a mentor with professional experience, instead of waiting for the "Upswing of Awesome".


The upswing of awesome sounds like a great way to prepare yourself to build things that are 90% correct with a 10% catastrophic failure rate.

I really try to keep a more emotionally neutral stance on all of my code and my abilities. If I want to indulge in arrogance I philosophize.

In the end, it's the same thing over and over again. Symbols swapping with other symbols, denoting some kind of esoterically tangible, but ultimately fleeting, meaning.

It'd be nice not to feel perpetually stuck in the desert of despair, though. I used to think being there meant I was learning, because I had intuitively learned from repeated failure that after failure comes success. It turns out you can think of yourself as plodding along at a steady pace, with no comparison to anyone else, as long as you stop assuming that there exists a clear, coherent, ordered organization to knowledge.

Such an organization exists in school, or at least the insistence on a topological sorting of topics would have you believe so. Technology doesn't always develop and get released the way it does in school, though. Sometimes it develops in webs that cannot be causally described, because thought and skill do not necessarily travel in measurable directions, nor is their instantiation completely definable or observable.

People apply too many theoretical concepts to describe, dictate, and organize reality without understanding the effect on perception.


You've made an interesting comment. I've felt the same way about some of the things you've mentioned, such as

"In the end, it's the same thing over and over again. Symbols swapping with others symbols denoting some kind of esoterically tangible, but ultimately fleeting, meaning."

I feel that way about all the different languages, new ones or old. Just different symbols that distill down to machine instructions.

My question to you: how do you approach learning? Learning new things and marking your progress? What gives you the satisfaction that you've made progress in "learning" a given topic?


> My question to you: how do you approach learning? Learning new things and marking your progress? What gives you the satisfaction that you've made progress in "learning" a given topic?

I don't know. Right now I am learning how to not know when I am learning, because I have determined that measuring learning in any form can often be a barrier that actually prevents me from learning.


Interesting.


That probably depends on the individual. Some people are more self-motivated than others and may be able to make it to the upswing without a mentor.


I learned to program in BASIC, in 1981. I think that while learning to program has become more accessible thanks to the Web, it has also probably gotten harder due to the exponential increase in complexity of the systems that we program. There is also much more to software development than just programming.

In my day (taking the liberty of revisionist exaggeration), one could write cool and useful programs with a simple language and text-based I/O. Programming and software development were nearly the same thing, and were exactly the same if you were an amateur like me.

Today, languages are necessarily more complex in order to interface with ... enormous systems and libraries of exponentially increasing complexity. At least, that's what it seems like from where I sit. And software development goes way beyond programming. Once you learn to program, there's a whole 'nother stage of learning how to use big code libraries, frameworks, IDEs, and so forth. I'm not sure anybody has figured out how to teach someone else to approach a big code library.

Maybe there should be a set of lessons in Codecademy whose point is to learn how to get stuff done using the documentation and related lore (Stack Overflow, etc.) for a complex library like MatPlotLib. I did something similar when I agreed to teach a review session for a math course that I had never taken. I had no choice but to show how to approach a textbook as a resource for solving problems. Maybe my students got more out of that than they would have from just memorizing formulas.
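
To make that concrete, here's the sort of tiny documentation-driven task such a lesson might start with (a Python sketch, assuming MatPlotLib and NumPy are installed; every detail here is just illustrative):

    # a small task that forces you into the matplotlib docs:
    # plot a labeled sine curve and save it to a file
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 2 * np.pi, 200)   # 200 points from 0 to 2*pi
    plt.plot(x, np.sin(x), label="sin(x)")
    plt.xlabel("x (radians)")
    plt.ylabel("amplitude")
    plt.legend()
    plt.savefig("sine.png")              # writes the figure to disk

The plot itself isn't the point; discovering plot, xlabel, and savefig in the documentation is.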

For kids, a way around this problem might be to strip away the complexity by teaching programming on a platform that just doesn't have all of that stuff, e.g., Arduino.


Learning to code isn't hard at all. Learning to solve problems computationally is a different matter. There are tomes of knowledge one must acquire that are usually application-specific and confined to potentially narrow fields. For example, if you want to write an RTOS, you need to study and learn OS theory. Want to work on GPS code? Well, there's some math you need to understand, along with a pile of other stuff. Want to work on search? Better have a very good handle on a range of algorithms and know how to implement them efficiently. Automation? Hmmm, you can't set an output and expect the machine to respond when there's physics involved. Aircraft control systems? ....

The point is that coding is easy just like understanding the basics of running a milling machine is easy. Being a machinist is far more complex than knowing how to turn the wheels. And machining highly accurate objects in exotic metals is multiple levels above that. Learning "machining" is easy. Learning to be a machinist is hard. Same for coding. Learning to code: easy. Becoming a software engineer: Harder.


I think "coding" professionally involves a lot more than being able to write simple programs for fun on his own. That's why I don't really like the term "coding".

I considered myself a good programmer, but during my first job as an intern, I was confronted with millions of lines of barely documented C++ code (a C compiler for some obscure microcontroller). Most of my time was spent trying to understand what this code was doing, and praying that I didn't break anything. The days were long with no distraction, and I had a lot of pressure from my boss. It was horrible.

Second job (still as an intern) was the opposite. I was supposed to design a prototype website for a bank. There were a lot of boring meetings and nothing really interesting to do.

I didn't feel I could be successful or happy in such an environment (and I ended up in academia). On the other hand, I had schoolmates who weren't passionate about programming and who learned it much later than I did, yet did quite well (being persistent, with good social skills).

I always wonder if I could have had a better experience in different companies.


Currently I teach Java courses (up to Data Structures), and I give a warning at the beginning of the semester. I don't use it as a scare tactic to be mean, but to help instill a sense of the course's difficulty.

Programming is only hard because it is simultaneously a basic IT, foreign language, applied mathematics, and logic class wrapped into one! Also, since it's 100% cumulative, every class they miss can be a death sentence. How can you understand Inheritance if you missed the lecture on Objects (or Conditionals if you missed Variables)? I've specifically told my students not to use an IDE for the first month so they get an understanding of how the command line operates.

I think another issue is simply underestimating the time it takes to complete a simple assignment like implementing the distance formula. Taking an algorithm you understand and translating it correctly can be an exponential problem.

I try to model my courses very closely on the MOOCs I've used for practice (edX, Codecademy, and Udacity) - I am a developer turned instructor, so I use these courses as guidelines for what to do. Also, I look at something like MIT's edX courses and think, "if this is what they teach, why shouldn't I try to model that?"

The one thing I like about those courses is that each uses a constant-engagement tactic to ensure the user actually learned what they heard: a 5-7 minute video, then immediately a quiz or 'lecture exercise' designed to make sure you get the material (not just repeat a definition).

Again looking at MIT's edX 6.00x course, you are given "Problem Sets" (homework) that you have a week to complete. Each part builds to a grand finale. For example, the distance formula mentioned above is then used in my course to implement a crude version of collision detection (www.youtube.com/watch?v=W84QzXUxcL0). I use collision detection because one of the key demographics of CS courses is nerdy gamers (not an insult, I love Binding of Isaac, but that's one type of person that goes into CS).
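
For reference, the core of that exercise chain can be sketched in a few lines (Python here for brevity, though the course itself uses Java; all names are illustrative):

    import math

    def distance(x1, y1, x2, y2):
        # the distance formula, translated directly from the math
        return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

    def circles_collide(x1, y1, r1, x2, y2, r2):
        # crude collision detection: two circles overlap when the
        # distance between centers is less than the sum of the radii
        return distance(x1, y1, x2, y2) < r1 + r2

    print(circles_collide(0, 0, 5, 3, 4, 1))  # True: distance 5 < 6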

One of the other things I try to do is follow the structure of my martial arts classes. In aikido, we start each class with "tai sabaki" (basic body movements). Since these are the core fundamentals of the art, practice makes permanent. I want to start adding basic keyboarding exercises of basic syntax to get them in the habit of writing words that look like English, but aren't.


That sort of deviated into more of a 'this is what I do', but the ultimate goal was to try to ease the difficulty of the courses as best you can. If they're slackers, let them fail, but if they have a want to learn and it is just confusing, figure out how to get them to learn.

Developers are problem solvers; if the student doesn't get the material, figure out how to solve that problem.


Warning, unpopular opinion based on experience lies ahead.

The ability to program (and can we please stop calling it 'coding' as a career path? Writing code is the smallest part of programming) is something that either comes naturally to you or it doesn't.

If it does, you wonder why everyone says it's hard - to you, it's almost literally how you think. The pieces just fall into place.

If it doesn't, it's always an uphill battle. You can acquire proficiency through a lot of hard, painful work. Most people who fit this category can do the job, but they do it through rote following of procedure as opposed to exploration and intuition that comes from putting the pieces together without consciously thinking about it.

This isn't a matter of intelligence. It's just a certain way of thinking that some seem to have and some don't. Plenty of extremely smart people aren't built for programming.

On the other hand, if we accepted this, then there wouldn't be much of a business model left for the likes of vikingcodeschool.com ;)


Agree on this one. Certain kinds of people and thinkers tend to gravitate towards programming. Seeing the perfect alignment of code on the screen, and refactoring code into clean, modular pieces, must somehow satisfy the programmer in order for them to get a kick out of it.

So probably those who are naturally logical thinkers, even to a level that normally would not be very useful, become useful in the computer space where they can utilize their internal skills of putting things into neat ordered rows of zeros and ones.


Personally I find the confidence vs competence graph to reflect my feelings when learning things in general, although not necessarily for the same reasons. There's always a period of eager discovery followed by the stark realization of how far away mastery is and finally the gradual slow crawl toward success.


This seems true of any relationship with something outside yourself.

Perhaps the curve of familiarity with the other for any empiricist.

Look at a totally foreign thing. First comes cautiousness about whether it exists, but then just by looking you get a vague notion that it's a coherent thing. Then curiosity draws you closer to it and therefore you see the thing in greater detail. Some of the newly visible bits mess with your idealised picture of the thing from before. Only if you decide to persevere in understanding this other do you start to integrate these discordant stimuli into the currently running model of it. As the thing comes into clearer focus you realise that the discoveries are slowly revealing negative entropy. It's safe there. Following the trail inevitably results in reaching the end of it.

If only there were a standardised protocol for consuming the universe.


> Phase III: The Desert of Despair

I've been programming for 30+ years, and have yet to ever leave this phase. ;-)


Super interesting article. I used to be a big fan of Codecademy and other online websites for learning to code. The author is right: at some point you need motivation, like with any new thing, and that's where I think most of them are failing.

Without a real project or a constant progress that you can follow, you easily end up abandoning that site.

I built CloudAcademy.com more than a year ago, and even though we are focused on cloud computing platforms like AWS, one of the things we are trying to solve is providing good, constant information to our students on their progress. That's the most important thing. In our case they really need those skills, so they have very high motivation to complete the courses, but still, I see this as a priority for an e-learning platform.


The ease with which you code is directly related to your general problem solving ability. The better you are at solving problems, the more simple and straightforward you'll find programming, because coding is simply an extension of the thoughts which are already floating through your head.

This is why learning to program is so difficult for some people. If you have poor problem solving ability, putting a second abstraction layer over it (thoughts -> something the computer can understand) can be ridiculously difficult. This is especially true for abstractions which don't map directly to thoughts human beings typically have (recursion, pointers, etc.)
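
A minimal illustration (sketched in Python): recursion asks you to hold a self-referential definition in your head, which has no analogue in everyday step-by-step thinking:

    def factorial(n):
        # the function is defined in terms of itself; you have to
        # trust the self-reference instead of tracing every step
        if n <= 1:
            return 1  # base case: where the self-reference stops
        return n * factorial(n - 1)

    print(factorial(5))  # 120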

This is also why learning to program is so effortless and simple for others.


In my experience, one thing that can make the path feel more progressive and monotonic is having a set of problems that need to be solved, with a wide range of difficulties. This provides consistent short-term gratification while also providing long-term gratification and utility.

For me, as an earth scientist, there have always been small calculations or simulations that are quite valuable, even though they are not overly difficult or complex. This provided the motivation to learn basic Matlab. As my skills and ambition grew, and I learned new languages and techniques, the class of problems that I wanted to solve grew in scope and complexity. I also learned to recognize new problems and their probable solutions as my tools developed. Programming changed the way I look at the earth, and at statistics, personal finance and a range of other things in which quantification is enlightening.

In contrast, I have had a hard time picking up new languages or programming paradigms when I don't have an immediate need. I've spent time mucking around on various 'learn to code' websites, and read and sometimes completed tutorials on databases or Haskell or whatever, but it feels very different: meandering, non-essential, hard to gauge in scope (how much do I need to know to be useful, how deep is the water really), hard to link up to the rest of my life. Programming for me is a powerful and enjoyable means to many ends, but not something that I am inclined to do for its own sake.

I think if I had said, "I want to learn to program so I can build web apps" without actually having a simple and truly useful web app that needed to be built, I would very quickly move on. If I had said, "I want to learn to program so I can find a new job that pays more money" I would stick with it for a bit longer but it would be incredibly frustrating, because it doesn't seem like there is a very straight path without formal guidance (such as going to school of some sort).

I suspect many people feel the same, but I don't necessarily know how smaller, less formal education systems can work on that. Students always need some amount of self-motivation, and useful results are a long ways off in some areas.


What you said really resonated with me. Especially your professed sources of motivation and how programming for its own sake was never enough.


Huh. This must be a generational thing. When I started learning to code 20 years ago, there wasn't any hand-holding. There were college courses and textbooks and sometimes mentors and ultimately your own ability to work things out from documentation and first principles. My projects grew in scope as I got better at it, but I never thought it was anything but awesome.

Then again, I never set out to "be a developer". I was bitten by the coding bug as a kid, and tripped over the discovery, as an adult, that it paid pretty well.


The hard thing about learning to program is that to create anything even vaguely useful you have to learn a million things in parallel.

Say you wanted to build the simplest of Rails apps - you're simultaneously learning not only what the terminal and a text editor are, but how unix commands work, what an MVC framework is, probably a little HTML and CSS, database migrations (maybe some SQL), asset management/pipeline, some random Rails-specific syntax, probably git, and if the creator of the tutorial is feeling ambitious he/she may throw in some TDD and testing frameworks. And that doesn't even begin to go into Ruby -- the entire programming aspect of programming.

So you're thrown out into the middle of the ocean, blindly writing code you don't understand (because there's no way any tutorial could fully explain everything you're learning without being 2,000 pages long). You follow the tutorial, you get your little app running, and then you realize, "I have no fucking idea what I just did." There's no way on earth you could do it again.

The other approach is to bring you from the bottom up, starting with language/syntax Codecademy style. So you spend a month learning how to almost be able to write a for loop in JavaScript, and then you realize you have no idea why you would ever need to know what a for loop is, and even less of an idea of why it's useful.

I got stuck bouncing back and forth between the two for years (literally), wondering how the other programmers were possibly smart enough that they could grasp meaning from random blobs of tutorial code, or how they possibly had the patience to grind through enough JavaScript tutorials that they could actually create something. I finally decided to throw away the crutches and venture out on my own. I think that was the single biggest step in becoming a (decent) programmer.

The timid, "I don't know how to program" side of me said, "Wait, I have no idea how to do this yet. You need to read up on it." But I finally bit the bullet and said, "You know what, I'm building this app right now. No, I don't know how to do a lot of it, yes, my friends that are a lot smarter would probably mock my code if they saw it, but I don't care. I'm building this." I don't think you can ever truly learn to program without saying, "I don't care, I'm building this." It took a long time and more Stack Overflow than anyone should ever care to read, but things finally started clicking. I built a few apps (Rails and iOS), went back to the tutorials, and said, "Are you kidding me? That's what they were trying to teach me?"

There was no way I would have remembered that crap if there was someone guiding me through or holding my hand. Sometimes you just have to start, having no idea what you're doing, and figure it out as you go. That's a foreign concept to people who aren't used to creating things, but I'm convinced it's the only way to truly learn.


My first real, non-toy app was an Android card game. I knew the basics of Java going into it, but nothing else. Following tutorials got me to where I had an app that I could install and run on my device.

At that point, it was up to me to figure out what to do. I started with a single screen that showed some text and added a few more screens that had the same text. Then I added a button that triggered a change of that text. Then a button that displayed an image. And so on until I had a real card game that could even identify whether there were any legal moves available and if the player was stuck.

Every step along the way was not easy and I got derailed a couple times, nearly giving up. The hardest single feature to implement was dragging and dropping a card. It took a lot of cribbed code from a few useful blog posts, but the feeling when I actually got it working was indescribable. It was a Saturday morning and I was running around my house, dragging and dropping cards on my tablet like I was 8 years old and it was the greatest Christmas gift ever. That was the moment when I realized I could actually finish this if I was willing to put in the effort. The rest of that weekend was spent in blissful coding and my commitment to becoming a developer has never wavered since.

Now in my job, which I got an interview for because of the card game, I have watched people with less self-guided experience than I started with (contrasted with CS class projects) prove unable to persist long enough, or self-teach hard enough, to solve a problem by themselves without asking for help to get over minor bumps. I won't ask for help from a more senior dev unless I have exhausted my abilities to understand the problem space and can enumerate the things I have tried. I refuse to be the person who simply "doesn't know how".

Never ever give up.


> So you spend a month learning how to almost be able to write a for loop in JavaScript, and then you realize you have no idea why you would ever need to know what a for loop is, and even less of an idea of why it's useful.

This is a carbon copy of my life as a programmer right now.

I've made the decision to just fucking do it about 50 times, but each time I barely get started and get so frustrated with myself for not understanding and not knowing what to look for that I give up. I know that's exactly the point where I just have to sit, searching, until I find the answer, but given my profound lack of understanding of the basics of the language (JavaScript; despite doing the Codecademy course, and the CodeSchool Node.js and Express.js courses), it seems I will have to do that for every minute step I take in the program. It's so overwhelming that I become paralyzed: I want to push on, but I feel like I know I won't make any progress.


>I don't think you can ever truly learn to program without saying, "I don't care, I'm building this."

This is a great point because it's easy to get stuck in analysis paralysis. There's always a tension between needing to stretch out and build something just beyond your capabilities and then having to backfill some of the fundamental knowledge you missed along the way so next time you can reach further.


For many years, I thought everything was very complex and would be difficult to learn, and that perhaps I wasn't smart enough to gain a full grasp of the subject matter.

But I was able to take a big step back one day and realize that everything isn't always complex; a lot of the time it is just overcomplicated and overengineered. Whether or not that's true, it has helped me immensely in approaching new code from a more advantageous angle and, more importantly, it supplies me with confidence.


"You can make that application work but what's happening beneath the surface? Your code is duct tape and string and, worst of all, you don’t even know which parts are terrible and which are actually just fine."

I commonly get to this point after just wanting to prove out a concept and get something working. And then I realize I need to go back and do the unglamorous work of getting it right. But that's when I learn the most, and it helps enormously on future projects.


Also, for rookies and veterans alike: when you are learning something new (especially tech stuff) and it's getting progressively more difficult, remember to take it easy on yourself and relax. Those stress-induced tension headaches from squinting, furrowing your brow, and getting overwhelmed by the unavoidable, apparent decrease in your progress are productivity killers.

Relax, and remember that at some time, everyone is a neophyte.


This reminded me of another well-written article from last year. Posting it here for others who might have missed it: http://techcrunch.com/2014/05/24/dont-believe-anyone-who-tel... (Yes, it's from TechCrunch, but it's actually good.)


Hey there, really, really great article! It really spoke to me. I've got a small question for you, though: I'm at the Cliff of Confusion, and I have a serious problem. I can't even build a program. I know syntax, structure, etc., but I don't know how to pull it all together to actually build something that serves a purpose. Do you have any advice for me?


This is actually a good time to check out a few tutorials to ease into a building mindset and patterns. They eventually start to become a crutch, but they can be a great transition between syntax and building. I don't know your stack, but if you're looking at Rails, check out Daniel Kehoe's RailsApps, Tuts+ has some free stuff, and, if you're looking to get deeper, the Hartl tutorial is the Rails standard (though it's often too much for a beginner). Google will know more than I do about specific resources.


As Baq said, programs must:

- be executable
- (optional) take input from somewhere (e.g., stdin or command line arguments)
- (optional) write output (e.g., print statements)

Would you be able to write a small script that implements solutions for Project Euler? (Even just a few of them.)

- You don't need any input at first, because the problems are already specified.
- You can write a separate program for each problem, or just comment out calls to each exercise's solver.
- Brute force is OK for many early problems.
- You probably know enough to write a function that solves an easy one (e.g., the Nth Fibonacci number).

Once you write such a function, you should be able to find out how to write an executable program in your language, and then you're off to the races: Run it over and over from the command line until you (a) stop getting errors, and (b) get an answer that you can defend and understand. :)
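
For concreteness, a minimal sketch of that kind of first program in Python (the file name and details are just illustrative):

    # fib.py -- run from the command line as: python fib.py 10
    import sys

    def fib(n):
        # iterative version; brute force is fine at this stage
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    if __name__ == "__main__":
        n = int(sys.argv[1]) if len(sys.argv) > 1 else 10
        print(fib(n))

Running it over and over, fixing one error at a time, is exactly the loop described above.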


Get some very small problem, put something together, and polish it until it works.

Do not stop to think about whether it's any good, and don't stop to plan ahead what it will look like. Not on your first few programs.


programs, in the limit, look like this:

1. consume input
2. do work
3. print output

find a small but interesting piece of work to do and code that. repeat several times. there are sites that provide you problems to solve if you don't have any ideas of your own (like project euler).
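
for example, a made-up task in python, with the three steps marked:

    line = input("numbers, separated by spaces: ")  # 1. consume input
    total = sum(int(x) for x in line.split())       # 2. do work
    print(total)                                    # 3. print output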


Learning to code is not hard if you scope the problem down to a smaller step. Learning to code is impossible if you define it as "program voice recognition in English and Spanish." If the problem is defined as "print Hello World," then you know how to code once you learn to solve that problem. Furthermore, it is an easier problem to solve.


breaking problems down into smaller and smaller pieces, I think, is a challenging skill to master, and people struggle differently with it (from "not at all," to "immensely.")

If you break a problem down far enough, eventually you end up with groups of problems, each containing simple conditional statements, a few variables, maybe a loop :-). That's the challenge new programmers face!
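
For instance, a tiny made-up Python example of what those smallest pieces look like once a problem is fully broken down:

    # made-up task: find the longest word in a sentence
    sentence = "breaking problems down is a skill"
    longest = ""                       # a variable to hold the answer
    for word in sentence.split():      # a loop over the pieces
        if len(word) > len(longest):   # a simple conditional
            longest = word
    print(longest)                     # prints "breaking"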


> Unfortunately, in later phases the density of resources drops off fast. Anyone who's made the jump from beginner to intermediate can attest that there is a BIG difference between the amount of resources available when you first start out versus when you're first looking for help building things on your own without too much hand-holding.

Scarcity = opportunity!


Learning to code is not hard. At all. Not relative to things that actually are hard. Having trained as an electronic engineer, I find programming by far one of the easiest things I've ever done. You can tell by the number of teenagers who can code; how many teenagers can design a nuclear submarine? That is much harder to learn.


Your comment is both unfairly dismissive and wrong.

| Learning to code is not hard. At all.

This assertion is empirically untrue in the market for programmers (if nothing else). Why are programmers paid as skilled laborers? How many not-hard (at all), highly paid, and quite frankly cushy jobs exist? In the Bay, hiring is the biggest problem.

| how many teenagers can design a nuclear submarine?

Why is designing a nuclear submarine the point of comparison? Is this some sort of humble brag?

| That is much harder to learn.

No one claimed programming is the hardest job in the universe. It is perfectly possible for harder jobs to exist and for programming to be hard.


I'd disagree, except that so much of this depends on how you define "learning to code".

Digging into someone else's code base to understand it well enough to fix bugs and add features without introducing unintended side-effects is, at least for me, quite a challenge. It takes a tremendous amount of persistence, the ability to build a mental blueprint containing substantial amounts of logic, and - most importantly - the ability to swap into the mindset of whoever wrote the app, rather than the way you would have written it.

You know, a lot of us have studied tough things - I was a pure math major and I was a PhD student for a while in a very math-ish engineering department at Berkeley where most of the day was spent on proofs about stochastic systems or convexity. This sort of background isn't unusual here on HN, yet many of us would say that while code itself contains a wide range of challenges and difficulties, the field absolutely has monumental challenges that will push very smart people as far as they can go, maybe farther.

I once (about 15 years ago) got rebuked (gently) by a researcher at Sun Labs when I dismissed a potential project as "easy." A museum was looking for someone to help build a search engine for their art collection. I said "a db search across some metadata, that's not a huge challenge." The researcher replied, with a mild smirk, "oh, I didn't realize you'd already solved all the issues in image search."

So then I backpedalled a bit and talked about the specs the museum had mentioned, "oh, well, they're just looking for search on some metadata". He replied, "well, maybe that's the only way they understand the problem, but you don't need to accept those limits".


There's also the amount of context required. To write a basic program, you don't need to know how computers actually work. Sure, it helps, but it isn't a prerequisite.

Designing a nuclear submarine, however, requires a lot of interconnected disciplines that rely on an understanding of each other.

To a lesser degree, it's similar to setting up your own ESXi cluster at home. You need to know more than just VMware--you need to understand networking, storage, flashing and configuring the host BIOS... AWS is quite a bit easier than managing your own physical environment.

Which may relate to why we see more younger people getting into coding than infrastructure/operations (regardless of the money).


I think it depends on how you define "coding". Writing a C++ compiler from scratch, for instance, isn't the same feat as coding a bubble sort in PHP. Besides, nobody designs a nuclear submarine alone anyway; I assume it's always teamwork. It involves a lot of knowledge that no single individual possesses.


I think the biggest advantage, and the source of the ease, is the instant feedback you get while learning. Tweak something? See what it does. Nothing's more frustrating to learn than something that comes with no feedback, e.g., relationships, getting a job, education itself.


>how many teenagers can design a nuclear submarine?

Just about every one I've met.

It isn't likely to work. But then I wonder how many teenagers could code a working control system for something like a nuclear submarine.


This comment illustrates a major reason why bad design and bad interfaces exist: engineers who think that "easy for me" is the same thing as "easy."


Yes and no. From my own experience I would agree that learning to code is initially easier than hard engineering disciplines. However, the worst code I've encountered was written by EEs and MEs who overestimated their skills.

And, as mentioned elsewhere, there's a big difference between writing an application or two and being involved in a larger software engineering effort.


Well, learning if/while is like drawing a submarine on paper. How many teenagers can design the embedded software system of a nuclear submarine is a better question, I think.


In my experience, this graph looks similar if you get hired at the first peak, but the desert of despair doesn't dip as low as long as you have some form of mentor or reviewer for your code. You can, of course, get hired to a bad job and the desert will dip further down (and your risk of leaving the industry increases).


Even with years of experience in a given language under your belt, heading into a new job and facing a new set of frameworks can lead to issues. It isn't quite like starting from zero again, but you have to re-learn how to do the most trivial things, since they may not even remotely resemble how you were used to doing them.


Haven't read the article, so this might be just a random rant.

I find most articles confuse 'code' (producing code) and 'program' (making programs).

I would argue the first one is relatively easy once you grasp the concepts and Turing-completeness.

The latter, however, takes years if not decades.


This is probably true for a lot of other things: learning an instrument, learning karate, etc. Basically, it's easier to get started than to keep going.


I can't remember the source, but I learned this in a management class in college about 10 years ago. It was a general statement about learning something new: you don't know what you don't know (so you are optimistic) -> you realize you know nothing (so you are negative) -> then you get better with practice -> mastery.


exactly!


When someone asks me "How do I learn to code?" I always say "You don't. You just code."


And then they give up and do something else, with lower self esteem this time.

Seriously. "Just code" is the worst advice that everyone gives. It's trite. You'd be better off saying nothing at all. Do you think you can sit someone down with a C compiler and come back five years later to find they've written the Linux kernel?

Don't say "just code". It's not helpful. It's disparaging. You don't tell a child "just walk". You learn to code, you're not born with it.


Of course one is not "born knowing how to code". Saying "Just code" does not mean "sit down by yourself and try to figure out the C compiler." It means, for most of the people I've met and worked with, "Stop agonizing over what course to take, what book to read, what you should build first - just build." Implicit in that is that the person can ask me for help outside of some master/grasshopper relationship... no, I'm a coder and so are you. Just show me what you're working on, and ask your buddy here for a hand if you need one.

I know a lot of people say something like "Just open up a git repo and start making pull requests on the back end of your gulp process after you wget the source from such-and-such repo"... Yeah, that annoys me too. I've had a lot of success just sitting people down with Chrome, having them hit Ctrl+Shift+I, and showing them a couple of fun things with DOM manipulation, etc. The point is, the worst thing, the one I think claims the most victims on the path to learning to code (I know I struggled with it for years), is constantly deliberating about the "best way to learn" when you should just be playing, having fun, and failing. That's the hump I try to nudge people over: the idea that you need permission or a "master" to become a developer.


I think we're way past the point where mindlessly playing around and idle curiosity can reliably teach someone how to program. That might get them interested in programming, but it's one thing to print something to a terminal or change the text on an existing page; it's quite something else to get a GUI app going or deploy Rails to a server. And GUI/web is pretty bare-minimum when it comes to keeping people interested in programming. Not a whole lot of people want to write console applications anymore.

Here's the sticking points where beginners need your help, and might not even have enough information to actually ask what they're looking for:

What framework do I use? I see people talking about AngularJS, so I'm going to pick that one. Oh wow, it's harder than I thought bzzzt you just lost a future programmer.

How do I deploy? DigitalOcean costs money, bzzzt there goes another. I paid for DigitalOcean, but I ran into troubles installing pip, bzzt there goes another.

How do I make a GUI? Tkinter, Wx, Qt, GTK, WPF, bzzzt there goes another.

And the worst of them all... "What language should I learn?"

Yeah, a lot of programmers might not want to hear this. But there are a lot of languages. You can't blame a beginner for not being able to decide. And to be honest, there isn't really a good choice here, which is why programmers get into fights about languages, and when that happens, bzzzt there goes another. Javascript sucks, but people claim it's essential. Beginners get turned off by hearing "Javascript sucks but you have to learn it"; to them, the easier option is to just not learn programming. Or programmers bring political wars into it: "Don't learn C# because M$". Beginners don't care about your politics, they want to learn. Eventually people settle on Python, so the beginner starts learning Python, then asks "what GUI framework" or "how do I deploy Django" and we start back at the top of this list.

So, in the midst of all of this confusion happening for a beginner, they're (understandably) lost and ask for a lighthouse to guide them on a path, any path, as long as they don't have to make the choices themselves. So they ask the question "how do I learn to program". And the response they get? "You don't learn how to program, you just start programming."

Bzzt, you just lost the entire next generation.


It's funny you use walking as an example. I never told my son to "just walk" but I may as well have. No one taught him, he never asked how. It just never occurred to him that there was any option but to keep trying until it works. As far as I know everyone learns to walk this way.

In my generation and before it, most of us learned coding the same way. (And self esteem for that matter.) Even if you do have the luxury of teachers and classes and handholding and cheerleading, the real learning is not going to happen until you sit down with a code editor and actually try to make something real.

And if the first difficulty you hit causes you to turn around and run, then that's another win: it's a very fast way to learn that this is not a career or hobby for you, because dealing with those decisions and difficulties is a major part of the job. 30 years in, I'm still dealing with stuff like that every day.


You taught your son to walk by walking. He learned to walk because everyone around him was walking. That's undeniably the best way to learn: being in an immersive environment. That's akin to going to hacker school.

What I'm saying is that 'just write something' doesn't help people who would ask the question 'how do I learn programming?' It helps the people who ask 'how do I get better at programming?' You get better at walking by walking more. You get better at coding by coding more. You start walking by learning it from someone else, or by using something to pull yourself up (like a chair or a table leg). You learn programming by asking someone how and following their advice.

Your son didn't ask how to walk because he didn't know the question. Beginners don't ask how to deploy because they don't know what deploy means. "Just code" doesn't answer the problem of "how do I write a GUI app". It makes it seem like it's so easy that you shouldn't have to ask. And since you don't know the answer and no one will tell you, programming must be hard, or you must be stupid. That's what a teacher or a mentor offers. Advice. Not glib remarks.


It's an ad. At the end: "Our core program is specifically designed to bridge this whole process. ... Sign up below." Worse, the training just creates junior web developers. There's a glut of junior web developers.

The concept of Ruby on Rails was to make the whole process of web site development a tutorial-level job from start to finish. How did that work out?


It's content marketing, which is a bit different from advertising. It only works if the content is good (which it is in this case).


"... was convinced that the seemingly normal programmers I ran into were actually sociopaths who had experienced, then repressed, the trauma of learning to code."

It's true. The saddest part is that we forget who we were before we crossed over, and lose the ability to sympathise with people who haven't been through it.


I sympathize if I detect that they're genuine in their desire to learn and improve themselves. As for the others, the ones that just want to "get by", they don't get much of my sympathy. And that is after I try my best to inspire in them a sense of self-improvement when it comes to programming. Alas, not everyone has that outlook on programming; some just see it as a dumb tool for muddling through some arbitrary problem someone dreamt up in an ivory tower somewhere.


What I wrote reads that way, and I can see my error. There are lots of pretenders, and I do not apologise for them. What I intended to write is that our brains get rewired until things that were once hard become easy, so that we no longer know what needs explaining.

My strategy for people who ask me to tutor them is this: work through the first five chapters of /learn python the hard way/ by yourself, and I will mentor you the rest. Those chapters are so easy, it is just a test of motivation. I have only had one starter. Also, she finished.


Learning to code is not so hard, but learning to code well is extremely hard, almost impossible. I have been doing software development for years and years, including at some well-known companies, and I am not sure I have yet met a good developer.


I never found it that hard, either back in the 1970s or now.


The more you learn, the more there is to learn!


I have no memory of the basic concept of coding being hard to learn. I do remember thinking it was cool that I could make stuff happen by itself.


vike22 Awesome! Keep it up!

