Teach Yourself Programming in Ten Years (1998) (norvig.com)
253 points by MindGods on June 6, 2021 | 115 comments



>> Learn at least a half dozen programming languages

I'm always surprised by how many people disagree with this; they're searching for that one language they can use for every task. Or even worse, they think they've found it and their search is over. That's a tragic situation given how spoiled we are for great languages today.

Clojure (STM / refs, vars, atoms & agents), kdb/q (non-loopy array code), Rust (ownership), Go (async done better, although i also really like core.async in clojure too), Python (trio nursery), C++ (asan, msan, tsan).

There are some blogs that continually give me good food for thought in this space, all signal, no noise:

Eli Bendersky, (every language under the sun) e.g. https://eli.thegreenplace.net/2016/the-expression-problem-an...

Fasterthanli.me (Go, Rust, others) e.g. https://fasterthanli.me/articles/so-you-want-to-live-reload-...

Mechanical Sympathy (Java), defunct but still worth visiting https://mechanical-sympathy.blogspot.com/


I was one of those who disagreed with that, but then I was forced to learn several new languages for work. Now I agree with it wholeheartedly.

my favs:

-clojure for processing deeply nested data

-rust for general stuff

-java for concurrency support and general stuff

-c for easy pointer manipulation and "tricks"

-sql

-bash for scripting (I'm that weirdo who would rather write a huge shell script than use python)

I still need to find something good for arrays/matrices. MATLAB was kinda fun, but I can't see it being used in production in anger too easily.


For arrays/matrices I'm falling back to Fortran. It has built-in support for matrix/vector multiplication (and other operations, of course). There's no need to install other libraries, or to deal with virtualenvs, or package managers. Fortran is not that old creepy thing from the 70s anymore; Fortran 90 (and later) is pretty elegant IMO.


Cheers to that! Fortran introduced me to “:” syntax, back before I’d ever heard of numpy. (Or expression templates for that (somewhat tangential) matter)


Python's NumPy library deliberately emulates much of the functionality of Matlab's core (similarly, fundamental plotting with matplotlib follows similar design cues). As a lapsed long-time Matlab user, I think they're just as fun to use.

There is also Octave, an open-source Matlab alternative, but when I last used it (admittedly quite a while ago) it had limited support for toolboxes (e.g. signal processing). I imagine it has only improved since.
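To illustrate the family resemblance (my own sketch, not taken from the comment above): common MATLAB idioms map almost one-to-one onto NumPy, including the ":" array-section syntax mentioned elsewhere in this thread.

```python
import numpy as np

# MATLAB-like matrix construction and linear algebra via NumPy.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([5.0, 6.0])

x = np.linalg.solve(A, b)       # MATLAB: x = A \ b
evals = np.linalg.eigvals(A)    # MATLAB: eig(A)
col = A[:, 0]                   # MATLAB: A(:, 1) -- same ':' section syntax

print(x)                        # solution of A @ x = b
print(np.allclose(A @ x, b))    # True: residual is (numerically) zero
```

The main trap when translating: MATLAB's `*` is matrix multiplication, while NumPy's `*` is elementwise; use `@` (or `np.dot`) for the MATLAB meaning.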


For elegant matrix syntax I like Julia or Fortran better, when I need to avoid a license or want more speed. Octave is good too, but it is much slower than Matlab. NumPy is in a weird spot: it's still slower than Matlab, but it's everywhere, and all the versions and environments are different, so you need containers to be portable.




I write huge scripts with bash. I know many of the pitfalls referenced on that list (and more), but this is a great resource for teaching others. Thanks for sharing it!


> I still need to find something good for arrays/ matrices

R?


MATLAB, if you can get a license, would likely be better. The name literally means "matrix laboratory", and it has excellent matrix manipulation features.


And Octave(https://www.gnu.org/software/octave/index) if you can't get a license.

I actually had one hectic weekend porting our whole department's MATLAB code over to Octave due to a hardware failure of our licensing server and some stonewalling from Mathworks. Octave is pretty close to fully MATLAB compatible.


Worth a mention, but it's not comparable in terms of feature set and execution time, sadly. I'd love someone to come along and make a MATLAB (Modern Fortran) knockoff that kills MATLAB.


The biggest advantage of bash over other scripting languages is parallelism. It's not bad at job control, and if you write your scripts so they communicate well with streams, you can scale them up with a little bit of bash, xargs -P, parallel, etc.


Even more important than different languages is to learn about different execution models. For instance, the difference between Java and C++ is relatively minor from that perspective, compared to (e.g.) understanding how Prolog works. That will open entirely new horizons if your previous experience was based on C(++)/Java(Script)/Python (just to mention a few of the usual suspects). Just look at how the addition of FP features to “OOP” languages considerably widened the field of possibilities.


Learning (very basic) Prolog indirectly caused me to learn to write parsers, AST transformations and symbolic derivation - I picked up a book on Prolog and was mystified by how simple the examples of symbolic derivation were in Prolog, a subject I was struggling with. So I sat down and figured out how to do an expression parser just so I could write a solver that mimicked how Prolog did it to understand how it worked. And in the process I learned a lot more.

It was transformational in a way my maths teacher was not...
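Not the commenter's actual code, but a minimal Python sketch of the flavor of that exercise: symbolic differentiation by matching cases over a tiny expression AST, one clause per rule, much the way a Prolog program would state it.

```python
# Tiny symbolic differentiation over a tuple-based AST, in the spirit of
# Prolog's clause-per-case style. An expression is a number, the symbol 'x',
# or a ('+', a, b) / ('*', a, b) tuple.

def deriv(e):
    if isinstance(e, (int, float)):   # d/dx c = 0
        return 0
    if e == 'x':                      # d/dx x = 1
        return 1
    op, a, b = e
    if op == '+':                     # sum rule: (a + b)' = a' + b'
        return ('+', deriv(a), deriv(b))
    if op == '*':                     # product rule: (a * b)' = a'b + ab'
        return ('+', ('*', deriv(a), b), ('*', a, deriv(b)))
    raise ValueError(f"unknown op: {op}")

# d/dx (x * x + 3) -> ((1 * x) + (x * 1)) + 0, before any simplification
print(deriv(('+', ('*', 'x', 'x'), 3)))
```

A real version would add a simplification pass (folding `1 * x` to `x`, dropping `+ 0`), but even this stub shows why the Prolog originals look so small: each rule of calculus is literally one clause.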


Yeah, I agree. I was a bit clumsy about it, but this is really what I was digging at in my examples.

I see a lot of conversation on here dedicated to whether you’d like your data with your functions (objects) or your functions with your data (closures). Rarely see array, logic or constraint programming paradigms mentioned.

FP has been rediscovered again and it's the hotness right now, but it'll die out again like it did before OO stole its lunch. I can almost write the blog post headlines just now: "arity considered harmful", "OO and side-effect management not mutually exclusive after all!"

I’ve got a fiver on the next big one being a re-visit of structured programming.


> a re-visit of structured programming.

God I hope so. Probably the worst time I have teaching junior developers is helping those who have only used for-each write a loop with a useful invariant and not a garbage fire of breaks/continues.


I’m surprised you don’t mention Perl in this list. It’s the de facto Unix scripting language and frankly more suitable to the task than others.

Perl on the command line is beautifully concise; it's easily embedded into CI/CD pipelines because of the q, qq, etc. quoting operators; it has best-in-class regex support; calling external utilities is equally concise with qx or backticks; etc.

Personally I tend to favor C/C++ for the main event and Perl for everything else on the systems side.

JavaScript is obviously ubiquitous on the front end, but it feels very clumsy. I'm not a huge fan of it on the back end; it's just too weird to do simple stuff like "parse a file" or "rip through a directory tree" or call a shell command.


Ruby is a better Perl; it's closer to Perl than e.g. Python, and it has fewer footguns while still being very expressive. It also has alternative quotes, backticks, shellescape, etc. for ad hoc shell integration, though you're better off learning IO.popen and friends to take the shell out of the equation.


Subjectivity aside: the fact stands that Perl is the most ubiquitous scripting language on Unix and the de facto standard. You’ll find it by default on many container environments as well—but hell if you’ll find Ruby. ;)


On most of my Linux systems Perl is only installed as a dependency of debconf or git.

I’m sure the Linux kernel community is still using it, but FreeBSD even managed to remove it from the base system.

It’s still ubiquitous… “standard”?


Perl is frustrating. Having variables as scalars, hashes and arrays is not so bad, but the strange syntax to use them is so annoying. Coming from other languages, it always takes me a day to get back in the habit. I'd gladly never touch Perl again.


I think it's fine to "learn" lots of languages, but you're going to be pretty bad at most of them. And I personally wouldn't be comfortable with building a system that will reach hundreds of thousands of LOC in the majority of the dozen or so languages I've "learned" over the years.

If you're really going to learn a language - as in, build idiomatic programs - it's a constantly moving target and requires a lot of effort to maintain.

I think it's better to learn 2-3 languages, and have passing familiarity with others. People like to pretend that language choice solves all your problems, but usually you're talking in percentages: one language is an 80% fit for your problem, another is an 85% fit... there is never a 100% fit - are these differences worth working in a language you barely have any experience in?


I think it depends on the language and background. Learning Go is maybe a 20-hour enterprise. Learning C++ if you're coming from Rust? Maybe 20-40 hours, not sure. Learning C++ coming from Python? Well... buckle up.

Same in reverse btw. Going from C++ to Python is probably easier than Python to C++, but it's still not going to be as easy as, let's say, java -> C# imo.


I mostly agree, but I think 2–3 is too few. I think I'm pretty good at C, bash, Python, make, and JS, and also (if we're counting languages that aren't programming languages) SQL and HTML. I used to be pretty good at Perl, but the language has moved on since then. I think I have more than "passing familiarity" with i386 assembly, PostScript, OCaml, Java, C++, BASIC, Bicicleta, Scheme, Lua, and Brainfuck; that is, I can figure out how to write things in them without constantly looking stuff up, and I'd be comfortable writing hundreds of KLOC in them, but what I write is surely not fully idiomatic. And there are another 32-64 languages I have passing familiarity with, meaning that I've written useful complete programs in them, including Golang, Rust, Excel, Smalltalk, PV-WAVE IDL, Prolog, Common Lisp, Scala, Octave, Elisp, Ruby, Pascal, Clojure, Forth, R, and Tcl, but I can't think of most of them right now.

I guess in the context of your comment this sounds like bragging, but that's not my intent; I'm trying to rebut your comment, using myself as an example, because I think this level of polyglot programming is pretty close to normal, at least after you've been at this stuff for a few decades. Maybe if you're getting into a rut of only knowing 2–3 languages well, you'd benefit from putting more effort into exploration. Unless you're about to stop programming, in which case you won't have time to take advantage of your newfound powers.

I don't think percentages are a good way to think about language fit. I think it's more like an effort multiplier. In theory I can solve any programming problem in C, but when the problem isn't huge and performance isn't much of a challenge, I can usually solve it about with about a tenth of the effort in Python. There are problems where solving them in SQL is about three times easier than in Python, and problems which can barely be solved in SQL, so solving them in Python is about a hundred times easier.

As an example, in my experience it's pretty common for Python to be about half as much effort as Scheme, but there are much better Scheme implementations out there, so when performance is a challenge, solving them in Scheme can be a hundred times easier than solving them in Python. But if you know Scheme and not Python, you can often get twice as much done by learning Python, even for problems where you might say Scheme is a pretty good fit. But it's true that at the beginning, when you have barely any experience in Python, you won't be faster; you'll be slower. And you won't know if that will ever change.

Most commonly, though, people use the language that is best integrated with their chosen platform, because that saves them the effort of writing a bunch of integration code in addition to the actual application code. If you're writing DHTML, for example, use JS, not C. If you need to invoke JVM libraries, reasonable options include Python, JS, Kotlin, Clojure, and Java, but not Perl, C, or C#. If you're writing a Minetest mod, do it in Lua.

So, if you only have more than passing familiarity with 2–3 languages, I think you're going to frequently run into cases where that ignorance costs you a factor of 2 or 3 in effort.

I certainly agree, though, that choosing the right language won't solve all your problems, and it's easy to have exaggerated hopes for it.


Languages evolve, and it takes time to learn all the quirks. Usually you need a systems language, a scripting/shell language, and a front-end language. Sure, you can get good at the entire stack, but that takes time. I would say spend 70% of your time getting shit done, then 30% learning/researching.


I could write a backend in Elixir, Go, JavaScript, Python or C#, and language choice alone will lead me to very different architectures. Knowing multiple languages from the same category broadens your horizon, and choosing the right one helps you to work with the language and its ecosystem instead of constantly fighting against it.


This is a great strategy to have but I would reformulate it as “make sure you know enough languages to solve system, scripting, frontend and backend problems efficiently”.

You don’t necessarily need a separate language for each domain.


Incidentally, ASAN / MSAN / TSAN are part of Clang and GCC, not C++ itself, so they're available for other languages, e.g. Rust (https://github.com/japaric/rust-san describes how to use them) and Golang (well, TSAN and MSAN at this time).

Related: MSAN is made so much better by -fsanitize-memory-track-origins. I can't count how many times MSAN caught an error, and with that flag enabled, it directly pointed to the exact line of code at fault.


I like the general purpose GC languages like Java, JS, C#, Python, Lisp for most tasks, though, and then go with something a bit wilder or lower level like Go, Haskell, Rust, Erlang, C++, C when needed. Or R/APL for a mathy problem.

Since I enjoy high-level programming, and given the natural chicken-and-egg of experience, I never do the low-level stuff. Not since Acorn Electron assembler! But I know of it.


JS isn't a general purpose language, even though it has been shoehorned into being one.


What does that even mean? I've pretty much gotten away with using JS for just about everything for the last 4 years. The evolution of a language to make it compete with other languages isn't necessarily "shoehorning". Am I not a true developer because I like JS and haven't found a need to write code in something like Rust?

EDIT: Do you mean that it lacks certain features without a runtime like Node or the browser? For instance, it doesn't have a language construct for opening files like Python or Ruby. If so, I don't think that's a good argument.


I would say it is. It didn't start as one but over the decades it sure has become a general purpose language.

I can't find a domain where JS wouldn't fit the description. While it may be a bad fit for certain problems, it can still be done.


Safety-critical controller for manned or unmanned aircraft, ground vehicles, etc. High frequency trading. Tiny embedded devices. System libraries.

I’m sure I could come up with more. JavaScript is a tool, just like c, go, rust, python, lisps, etc. I’m going to give you the benefit of the doubt and hope you just didn’t consider all of these other domains when you made this comment.

Cheers!


The domains you listed are not really disproving my point. I'm certain that programs could be made in JavaScript for these domains but, like I said in my first comment, it's probably a bad fit.

For example: Espruino is a platform for embedded JavaScript.

I just think about it this way: Just because you wouldn't doesn't mean you couldn't


JS is incredibly versatile, though I would agree that that's mostly JavaScript being equally bad in nearly every domain :)

I wonder how much of the horizontal scaling trend is driven by JavaScript's horrible threading story.


You might be confusing the JavaScript programming language and the browser runtime.

The former is a general purpose programming language that favors the prototypal object-oriented paradigm but supports multiple other paradigms, such as imperative and functional.

The latter uses the JavaScript language and adds on a bunch of specific APIs (such as the DOM) for specific purposes. This includes APIs for DOM manipulation, playing audio, validating inputs, parsing forms, handling key presses, etc.


I'd be curious to hear some examples where JS doesn't fit as a general purpose language. In my arrogance I suspect it would be hard to give examples that don't also disqualify other scripting langs like python and ruby. Perhaps you don't consider those to be general purpose either?


For example, without something like Node, you cannot even open a file. I consider this a very basic requirement for being "general purpose".


It sounds to me like you're talking about a particular implementation (the browser), not the language itself. The browser intentionally has that limitation. The language itself has no problem with it.


Yes, I mean JS with Node.


Haskell is also garbage collected


Well, this advice is from 1998 and still has some truth to it.

In 1998 languages did not have that many features and were in my opinion easier to grasp. Just have a look at modern C++ or Java's cadence of new features.

In 1998 Java and JavaScript were new. Now they evolved into widely adopted general purpose languages.


I remember 1998 pretty well - mostly for Linux distros becoming mature enough to be accessible to "normal"-ish people, but it was a pretty exciting year for languages too. There was an absolute ton of hype for Eiffel - every month Dr. Dobb's mentioned it at least once - and Ada 95 was enjoying a renaissance too. I forget the specifics, but something changed in licensing and GNAT was integrated more with GCC. Haskell was a hive of activity, and Python was starting to gain widespread attention, but it was nowhere near to dethroning the favourite language of people who signed their emails "JAPH" :-)

Your comment reminds me how much I miss Dr. Dobb's Journal. I learned so much in those pages.


The thing, in my opinion, is to avoid spreading yourself too thin.

I started using C/C++ more than 20 years ago, and even though I feel really comfortable with the language (except the latest revisions), I am still learning and improving.

I don't think I could achieve the same level of mastery on half a dozen languages.


The title is a bit misleading to me.

10 years is not to "teach yourself programming," it's to "become an expert in programming."

Most people do not want to learn programming to become experts, most people want to learn programming to get a job. After getting a job, some will plateau right away, others will plateau after some time, and others still will keep learning even after years and years.

The problem is "how long until I become employable," not "how long until I become an expert".

The answer to the former is months of deliberate practice. The answer to the latter is [tens of] years of deliberate practice.

Books with titles like "learn C++ in 24 hours" target the former. And, I would say that the number of jobs that require the mastery of the craft is not large.


> And, I would say that the number of jobs that require the mastery of the craft is not large.

This mindset is how we end up with layers upon layers of badly designed and buggy software that underpins almost every aspect of modern life.

Just apply the same reasoning to other areas: Would you want to drive in a bus with a bus driver who just barely got his driver's license? Would you want to use a bridge designed by an architect who knows just enough to complete the job and does not care a bit for more?

It does not take mastery to write one-off scripts that "get the job done". However, they will probably not be a general purpose solution of the problem at hand, ignore corner cases, contain bugs… and the real test comes when requirements inevitably change.

If you want to write dependable, high-quality, maintainable, reusable software you better know more than the bare minimum.


Not all code is life-threatening in the worst-case scenario. A better comparison would be "would you like to be served by a first-time waiter or rather an expert one?" In most cases it doesn't matter, and when it does, the price point of the service is significantly higher.


This is mostly just a rant not directed at you but on the state of software...

The Colonial Pipeline ransomware wasn't life-threatening. These things happen because of bad* developers and zero accountability, from developers through publishers to software owners. As it is now, pretty much all commercial software is "first time waiter" quality. Granted, it isn't all on the actual developers but on the whole pipeline(!). Most software is on the "let's use an anchor to stop our newly developed car, because we have used anchors for decades and understand them better than drum brakes" level of quality.

* Bad code because of a lack of understanding and/or time constraints. But developers don't mind writing bad code (not enough to refuse, at least). Developers need to be more like doctors and engineers: accountability and proven skill matter, and would force management's hand like at hospitals and building projects.


Bad code is happening for non-malicious reasons. I’ll suggest these two being the main factors, and I’ll concede that it is conjecture, but I’m drawn to these two since I assume good-faith amongst all actors:

1)

Everyone needs to pump out all of their bad code before they get to their decent code. Imagine your lifetime array of code; it will look like this: [bad, bad, ...(lots of bad), bad, good, bad, good, good, ...(lots of good), good]. As you can see, you'll have to pop (poop) all the bad out before you get to the good. Some people have a shorter array due to other factors (talent), but even so, there is bad at the front of the array.

What are the implications of this? That should be obvious. This industry hires people straight out of college or people in their mid 20s to be project leads. You do the math.

2)

Lack of suffering. Many people haven’t had to toil in someone else’s codebase. Many people are given green-field projects that are scrapped quickly, at which point the business (or another business) gives a brand new project. Constantly building shit from scratch by yourself means you don’t understand pain. Go work in someone else’s garbage app to feel pain. Then you will rethink what good code is. Good code is not painful, and the definition of ‘not painful’ will be obvious to the survivors of pain.

Solution:

Continue to crap out your bad code in non impactful areas of the codebase and avoid crapping in critical parts. You have to crap somewhere, and that is understandable. Lastly, don’t turn down the experience of working in someone else’s labyrinth. The experience is valuable.


I'd like to add that modern agile IT environments are different than that of traditional engineering.

We aim for fast MVPs, fast iterations, "failing fast", "moving fast", "delivering value quickly", generally preferring speed over quality, agility over "waterfall" planning, continuous delivery over batched releases and, well, often accruing tech debt over shipping features later.

One can argue that "fail fast" should not be about code & feature quality but rather about determining proper scope, splitting features sensibly etc. But then we still have deadlines to meet and code quality suffers.

What I'm saying is that this kind of fast-moving environment actively discourages traditional engineering practices (like those of doctors & civil engineers).

You might have lots of (unit|integration|E2E) tests but the company may miss practices like proper security reviews and code won't be reviewed, tested and approved by multiple people like, say, bridge building plans are.

This problem mostly comes up not with your usual web app but when critical systems are affected (like in the aerospace industry or industrial systems). But through supply chain attacks, nearly any system can be vulnerable these days.

We should definitely have better regulation on how software is made.


The problem with the waterfall model is that it only works if you can determine the requirements up front and they stay fairly static. Sometimes that's true, but all too often it's impossible.

But iterative development and fail-fast are independent of quality. If a company skips security reviews and is fine with technical debt because "it works, just ship it", what makes you think they would do security reviews or care about better-than-minimum code quality when following a waterfall process?


I hate to do this, but:

Stop equating programming and bus drivers or waiters. They aren't equal. Programming is difficult in its depth and breadth in a way that waiting tables is not.

Stop treating all programming tasks as equal. They are not. Most of the easy ones are only easy because they run on top of all the difficult programs. That is the goal of the difficult programs: to make it possible for laymen to do certain repetitive tasks that would otherwise require an expert if the underlying program did not exist.

Most people would not be capable of getting Hello World to run without the OS and high-level abstractions that take expertise to develop. And making that possible requires many many experts in many different kinds of programming.

Stepping off of the single thread and single machine model into concurrent and distributed computing to enable even more laymen to make stuff takes even more experts.

There is a meme in the industry that people shouldn't strive for expertise. We still need more experts, and you become an expert through practice while still a layman. Nothing would be worse off if we had more experts capable of deciding when to use something extensible off of the shelf to enable laymen to do simple tasks, and when to build a platform to eventually enable that task to be done by a layman.

A lot of things would be worse off if we encouraged all people to stop at the layman level. We should encourage everyone to at least become a layman for their own enrichment. But we should also encourage laymen to strive for expertise to lower the barriers to entry to produce more laymen capable of doing simple tasks.


I once experienced a waiter holding raw meat (to be cooked at the table) over the drinks while serving, unknowingly letting it drip into the glass. (But as I didn't want to find out if it would life-threatening or not, I didn't try the drink.)


This one time I had a waiter who it turned out wasn't even our waiter and we gave him our money and the food was good but where did he get the food?

Also there are some great thoughts in this thread on why there is such a shocking amount of spaghetti code out there and I like the ransomware note also. Thought provoking. Too bad I'm such a terrible programmer.


> This mindset is how we end up with layers upon layers of badly designed and buggy software that underpins almost every aspect of modern life.

No, it’s not. Changing requirements by adding features is by far the biggest cause of badly designed and buggy software. I’ve worked on multiple teams where everyone was fully committed and highly skilled, and it didn’t magically fix the problems of software. It wasn’t any better than working on teams of people who were less interested. Our inability to stop adding features is what kills us, and this inability is actually stronger with people who think they’ve ‘mastered’ the art of programming than with people who can finish a one-off task.

> they will probably not be a general purpose solution of the problem at hand

Ironically, perhaps, in my decades of professional programming, the number one biggest waste of money I’ve seen is people over-engineering something under the banner of making something “general purpose” that didn’t need to be. I’ve watched a couple of different teams of very smart people waste literally tens of millions of dollars by deciding to rewrite something that didn’t need rewriting, and dramatically overestimate their ability to finish it in a reasonable amount of time and avoid the same mistakes they made the first time.

> If you want to write dependable, high-quality, maintainable, reusable software you better know more than the bare minimum.

This sounds good in theory, but is specious. The real way to get high quality software is to define what that means and stick to it ruthlessly. You don’t need to know a lot about programming. You need a management that is okay lengthening schedules to make room for testing. You need a CEO who is okay with saying no to customer demands for features that the competition has. You need programmers who know when to stop programming and when to avoid rationalizing their ‘general purpose’ solution that solves problems that don’t actually exist.

Good luck in your search for high quality software, it’s very very difficult to find a team who can commit to it, and there are good reasons why: it’s extremely expensive.


I would even say that good code is not always in a business's best interests. The business needs to change quickly and cheaply, so they rack up technical debt, often getting away with it for decades. The developers suffer through it, but the CEOs walk away with more money in their pockets.


> good code is not always in a business's best interests.

You nailed it exactly. High quality engineering requires costly business practices that most companies don’t need, so the tradeoff is easy to decide. This is why organizations that do high quality engineering are well funded and are very restrictive about what kinds of programming features are allowed. NASA published a famous essay about how to write safety critical code that disallows use of dynamic memory allocation. https://en.wikipedia.org/wiki/The_Power_of_10:_Rules_for_Dev...


How often do you check your surgeon's grades?

Or any of those things you mention? I normally just trust that the bus company has hired a guy with a license.


When people say driver license, they mean the one kids can get at age 16. Driving a bus requires a commercial driver license, which is referred to as a CDL - it's a much more rigorous test.

How often are you driven around by bus drivers who just finished their learner's permit?


> Just apply the same reasoning to other areas: Would you want to drive in a bus with a bus driver who just barely got his driver's license? Would you want to use a bridge designed by an architect who knows just enough to complete the job and does not care a bit for more?

I don't think this part is useful. If you're writing code directly for an application, you don't usually need to be the best, and your errors can be handled by layers of supervision/QA. If you're writing internal libraries and tools, you need to be a better programmer. If you're in, let's say, one of the Rust compiler groups, you will need to really know your stuff. But even the Rust compiler has issues for newcomers that aren't too hard.

I don't think comparing software engineering to bus drivers and bridge architects leads to useful insight. I do agree with the rest of your post, though. Part of my day job is to develop and maintain applications made by people who didn't/couldn't program "properly" (shadow IT). Everything more or less works and the end users are satisfied, but I often fear that one day something will depend on one of those applications. On the other hand, taking the time to do everything "properly" would lead to a lot less experimentation and/or use a lot more resources. It's hard to find a good balance.


I have always assumed many programmers are terrible at sports in general so they turn programming into this weird sport of who can be the Cristiano Ronaldo of programming.


That's a pretty unfair way to put it. People who do "too much" to try to impress their colleagues or themselves exist everywhere. You can see it in some people's code, you can see it in the way some people dress, you can see it in some powerpoint presentations, you can see it in sports.


> Most people do not want to learn programming to become experts, most people want to learn programming to get a job.

No, I think it is a little bit different.

"Most people think they will get a job and then will become experts after some time of performing the job"

And they are right. But they are mistaken about what kind of expert they become -- they are becoming experts at what they are doing, which means if they are mindlessly repeating the same things, they are becoming experts at mindlessly repeating the same things.


>they are becoming experts at what they are doing which means, if they are mindlessly repeating same things they are becoming experts at mindlessly repeating same things.

That's a great way to explain it. Totally stealing that!


> the number of jobs that require the mastery of the craft is not large

The number of jobs that benefit from mastery of the craft is essentially equal to the number of jobs.

I do not enjoy finding the work of a bunch of people that studied enough to get hired and no more.


> The number of jobs that benefit from mastery of the craft is essentially equal to the number of jobs.

I think this is true. Nobody disputes that.

Imagine you'd hire Torvalds to debug and fix why your Wordpress does not work. I truly think he'd do an excellent job figuring out that AWS has a firewall that's blocking the connection from your WP node to your MySQL node...

Somehow, the example above feels bizarre and yet these are the types of issues a lot of architects, consultants, and other highly-paid job positions deal with on a daily basis.

> I do not enjoy finding the work of a bunch of people that studied enough to get hired and no more.

This sentence makes no sense to me. I'd say you "find the work of people that studied enough to get hired and no more" possibly just slightly more often than you "find work of people who never wrote anything public until they mastered everything they could, from building computers from NAND gates up to AI/ML".


Torvalds had trouble installing Debian, IIRC. Your wordpress scenario sounds more like basic IT / sysadmin stuff, which he likely never does himself. Same deal with Stallman (he gets others to set up his computers). These guys seem ultra-specialized to the point you could argue they don't know how to use a computer that well, even if they're fine programmers.


They may benefit from it, but they won't pay more if it's not required. So if you expect more compensation for your mastery, you may get passed over for someone with less skill, but the required skill.


I was mostly trying to say that you shouldn’t stop trying to learn just because you got hired :)


I don't think the title is misleading, it's deliberately placing itself in contrast to the glut of content that is already out there geared towards people who want to become programmers as fast as possible to get a job. Programming may be a job, but it's also a craft, and I'm happy to see content in that direction.

Also as someone who has had to hire a team of programmers in the last 18 months, it's clear there's an industry of snake-oil salesmen willing to promise to teach you to be employable in 3 months or less, and they're not benefitting the industry in need of good programmers, or the people paying them thousands of dollars in hopes of improving their employment prospects.

A 4 year degree is not necessary to become an employable programmer, but imho what we need more of are 2-year programs which properly teach the fundamentals of computer science alongside practical skills, not 6 week or 3 month bootcamps which only teach the bare minimum.


I agree with everything you said.

Not sure about this though:

> I would say that the number of jobs that require the mastery of the craft is not large.

Perhaps the number of jobs that require mastery is less well known? i.e. Not advertised as often / not vacant as often.

I suppose I'm splitting hairs here over one's definition of "not large", but in my circles (~25 years of dev working experience) there are quite a lot of "jobs" requiring mastery, though they certainly become available less often (or rather, they're usually quietly filled without much public notice).


Also, the number of developers is increasing exponentially over time. There are maybe a hundred thousand people out there who have >25 years of dev experience, and millions who have <5. E.g. see [1] - 16 times more engineers filling the SO survey with <5 years of experience compared to >25.

The job market has to roughly follow this demographic trend, otherwise companies would fall behind competitors.

[1] https://insights.stackoverflow.com/survey/2020#developer-pro...


What are some of the jobs that require mastery in your mind? I suspect we'll diverge in that opinion, hence our different conclusions.


Everything you say is true, and leads to the oft-repeated "why would I want to learn this complicated programming concept, when my job doesn't require it?". That is, it encourages the idea that employment is the yardstick.

The rub is that this all depends on what you find personally rewarding. If what leads to a rewarding life to you is having a good job, then absolutely, you probably don't need more than a certain amount of depth and knowledge on the subject.

But if depth and knowledge on the subject is what you find personally rewarding, then suddenly putting in all this work makes sense. For many, lifelong learning is what is most rewarding and brings most joy, which one might argue is the most important thing as long as you have a sufficient income.

If you do want to frame it in terms of achievement, it's also obvious that many prolific people in their fields were driven by a passion for what they do — the playful nature of Feynman, Shannon and others comes to mind.


> 10 years is not to "teach yourself programming," it's to "become an expert in programming."

I've been programming for a lot more than 10 years, and would barely call myself an expert on anything. 10 years is a decent amount of time to learn to be a decent programmer. As a rule of thumb at least.


Agreed. I can teach my 12 year old cousin how to program in half an hour. This isn't about simply knowing the basic starting point. It's about becoming an expert.


In this case, what does it mean to know how to program? Assuming you succeed in teaching your cousin, what did they do, or what can they now do? I would not say "hello world" is enough, but I don't know where I'd draw the line. It's easy to go through a tutorial and still not be able to actually do anything, I think.


The power of compounding which comes from discipline is one of the most underrated concepts. There's one caveat to "keep at it for x years"; deliberate practice. You have to work towards being aware of gaps, identifying them and improving them. In addition, the amount of effort required to go from 0-40 is different from the one required to go from 50-70 or 80-95. The recognition and acceptance that a 100m sprint is different for Usain Bolt than it is for you could give you inner peace and focus.


Learn linear algebra properly and this will ring true. If you know the math, it's trivial to pick up control, computer vision, back propagation, optimization, I'm sure many others.
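The point generalizes nicely: once the matrix calculus is familiar, gradient descent (the core of back propagation and much of optimization) is only a few lines of matrix arithmetic. A minimal NumPy sketch, with all data and names invented purely for illustration:

```python
import numpy as np

# Hypothetical example: recover a known linear map by gradient descent.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features
true_W = np.array([[2.0], [-1.0], [0.5]])
y = X @ true_W                           # noiseless targets from a known map

W = np.zeros((3, 1))                     # parameters to learn
lr = 0.1
for _ in range(500):
    err = X @ W - y                      # forward pass: residuals
    grad = X.T @ err / len(X)            # d/dW of 0.5 * mean squared error
    W -= lr * grad                       # one gradient-descent step

print(np.round(W.ravel(), 2))           # converges to roughly [2, -1, 0.5]
```

The `X.T @ err` step is the same transpose-and-multiply pattern backprop chains through every layer of a network; if the linear algebra is second nature, the rest is bookkeeping.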


More like linear algebra, real analysis, and probability for the areas you listed.


Statistics for sure. But i don't really find myself using analysis on a day to day basis


Gradients and convex analysis pop up in optimization. My understanding is that controls uses analysis as well (at least for understanding the theory).


I agree with the article, ~10 yrs will give you a good start if you stay with it.

When asked by others how long it would take, I try to avoid giving a number like that because it might discourage them. Who knows, had I heard or read that number early in my involvement, it might have discouraged me. However, my innate curiosity and fascination for the field made the time fly by without me noticing it. If you're in doubt, you might take that as encouragement. Programming is a field where the learning never stops, and if that's your thing, the time required will become secondary at most.


The number means nothing anyway. What do you get at 10 years that you didn’t have at 9 or don’t yet have from 11.

You can have a well paying job as a programmer in only a few years.


> The number means nothing anyway…

Yep, that's why I put the “~” in front of it and called it a start, not the end of the journey.

> You can have a well paying job as a programmer in only a few years.

Absolutely, if that's the extent of what you want out of it, you could achieve that and call it a day. But programming is also a field that's ever evolving. If you catch yourself thinking, “I've learned everything I ever need to learn”, then chances are that you'll find yourself left behind at some point in the future.


I really like the advice in this.

At the same time, I often find it dispiriting to hear "it takes x amount of time", be that 10 years, 10,000 hours or a lifetime.

While I understand that it is important to have these metrics precisely to dispel the idea that anyone can learn c++ in any meaningful way in 24h, at the same time I find it harder to start something when I am constantly reminded of the fact that nothing I produce will be worth anything compared to those with a giant head start (yes, yes, never compare yourself to others, just to yourself from x time ago).

The 10 year thing means I can meaningfully learn, what, 7 things in life if I am not spending the vast majority of my time at a full time job? So, that makes it 3-4 things?

How depressing.

Admittedly, I feel I have almost never studied deliberately. I happen to have learned about 3 things to reasonable expertise in my 30 ish years. All three just happened. It was effortless to learn, as it was interesting to learn.

I simply cannot imagine what it would mean to have to dedicate 10 years or 10,000 hours of practice to something I am not interested in. In my own experience I would go so far as to say that I either love something enough to want to do it 8h a day, or not at all.

I wonder if I'll ever find a 4th thing, later in life. I really want to learn more things, but the rule of x hours is discouraging to think about when I have 30 something years less left on this planet than when I started.


Yes, our lives are limited so we cannot do it all. And coming to peace with that is a process that is different for everyone. Every choice rules out others; hence the paradox of it.

Your own struggle with learning new things does have some openings to look at it a bit lighter. Most things are already useful, fun, interesting with less than 10k hours. Way less. You don't need to be an expert to enrich your life and that of others with a new skill!

In addition, learning related things goes much quicker (e.g. I've been enjoying doing CAD recently in Fusion 360, and I find that many aspects of it fit my programmer mind very well; same for expanding into foreign languages, different instruments, etc.).


I think the 10 year, or 10,000 hours are for learning stuff you are not completely in love with.

It would take me 10,000 hours to become a proficient Programmer.

Now---stuff I am truly interested in took me less time to learn. I was going to say "master", but I haven't mastered anything in my life. I'm a good mechanic and watchmaker only because learning them was easy; I was interested.

It took me less than 4 years to become good at watchmaking. Why--because for some reason I became fascinated with watches and clocks.

I can strum exactly 12 guitar chords. I memorized the charts, and then memorized the finger positions on the fret. I can play a few songs. I sometimes wonder why I don't practice more. The reason is I don't love it. When playing, a mental picture of Bluto from Animal House pops up.

Pick something you love. You never know where it will take you. Then again--maybe I should have learned things that society valued, and paid highly for?


If you are 30ish now, you will be 40ish in 10 years. Might as well develop a new skill or two. Though I completely agree that it should be something you are truly interested in.


That just applies to the one thing; you can find things that fit well together, where some portion of your other skills still applies, but that will depend on the individual. You also shouldn't only learn things to become an expert in them: a very small initial investment makes a lot more of a difference than further honing those skills.


That's a tricky title, because a big red flag that someone is still learning programming is when they say that they're no longer learning programming.

Programming is like life, constant learning, constant mistakes, constant improvements. And that's why it's so beautiful.

A life long partner ^^


People learn at different speeds, and intelligence + natural talent is a huge factor when it comes to learning this stuff. The two most talented programmers I know have quite different backgrounds: One is your archetypical hacker - started coding as a young child, was in the demo-scene by the time he was 13, and got headhunted before he could enroll in college.

The other guy didn't write a line of code until he was 21, after switching majors from econ to engineering - three years later, he was getting offers from FAANG companies and quant/hedge funds.

But, I do think that if people consistently work and study programming for 10 years, then most will have a solid grasp afterwards. That means doing actual constructive work almost every day, for 10 years - doing work that either challenges you, or teaches you how to use the tools.

Being an expert (or highly competent) programmer is not only about knowing CS-theory at heart, and being able to translate that into working code - it's also being fluid with the tools. And, luckily, becoming good with tools doesn't take much more than time and effort.

You could be the worst programmer in the world, but given enough time, you could learn every nook and cranny of some language and its ecosystem.


How long does it take to learn programming is like asking "How long does it take to learn 'sport'?".

You could learn the basics of football in an afternoon or spend a solid few years getting good. If you started playing futsal, you'd do ok. Maybe you could hold your own as a sprinter. You'd probably have a terrible time with a javelin.

In the same way, programming experience in one area transfers to others but not completely.


I've learned programming as a kid in a department store watching other kids hack in BASIC.

Writing code for 40+ years I still make mistakes, don't know a lot of things, make the wrong decisions sometimes and learn every day - currently fighting Rust lifetimes.


I've always considered this the most reasonable method to "Teach Yourself C++ In 21 Days":

https://abstrusegoose.com/249

(note: is comic)


I don't think this is (1998) if Clojure (2007) is referenced in it.


It's originally from 1998 and has been updated since then.


30 years on, still learning


Programming is like the universe. There are parts of it expanding faster than I can ever learn, so they are unreachable even with unlimited time.


It does not take ten years to learn things. It might take 10 years to be good at something. I recently took up lawn bowls. I learned it in about 30 mins. It's quite simple. I'm not any good at it.

I used the Learn C++ in 24 Hours book as an undergraduate. It was excellent. It was the book the university recommended, and it was probably the only book I bought as an undergraduate engineer that was actually indispensable. I actually used it, page by page, and during that course I produced a really cool 'moon lander' game in 3D. Was I an accomplished programmer? Absolutely not. Had I learned the basics? Yeh. Good enough for an undergrad electrical engineer.

It feels like you are railing against one premise (a programming language can be learned in 24 Hours) only to propose another similarly bonkers premise (a programming language can't be learned in less than 10 years). What's the point?


We live in a world where businesses at large make money by overpromising and underdelivering; people buy these goods/services as an "alibi" of some sort, so that they can say, "it's not my fault, I even tried [insert here alibi product]".

That's what stupid people do.

When smart people need to solve a problem, they find a system, they make time to work long and hard, and they keep on going, each and every day, failing often, failing fast, and then, little by little, they incorporate the new knowledge, understanding, and experience into their life.

2-3 years often gives good skills bordering on mastery; 3-10, true mastery.

It's a system, not a goal.


One of the concepts in the Name That Tune, errr, Learn to Code in Fewer Days than the Other Brands of books is that each chapter is a lesson that should be worked on for a while before proceeding to the next.

So while it is 24 hours or 7 days or whatever, they are not meant to be a sequential 24 hours, but 24 hours spread over a period of time.

I could never interest a publisher in creating a series called "Learn in 23 Hours", I'm sure it would have a clear advantage over its slower 24 hour competition. Perhaps 23.99 hours?


I started to learn programming when I was 9 and it took about 10 years. About half of his suggestions came after that. Good tips if you're starting as an adult I suppose.


You can learn to program in a month or less. But it takes 10 years to actually put out quality code that is easy to maintain, readable and efficient in CPU and memory allocation. It took me a little less than that only because I was a code freak for several years and used to code more than go out.



This needs to be higher. This guide shows it's possible to learn C++ in 21 days.


I know that it's probably a joke but to take it seriously: no, I don't think it does. It shows how to circumvent the 21 days constraint but the whole point of that discussion is that time is limited.


Man, I remember reading this article while I was still getting my bachelors in Mechanical Engineering (a little over 10 years ago) but was tinkering with programming.

Made the switch to software and am now a senior software engineer. Didn’t know then that I would somewhat follow the trajectory outlined here.



> One of the best programmers I ever hired had only a High School degree; he's produced a lot of great software, has his own news group, and made enough in stock options to buy his own nightclub.

How did Norvig interview JWZ?

Did he use the whiteboarding rituals promoted by Norvig's company?


This was written before he joined Google, if that's what you mean.

I don't remember a lot about my interview around then, but one bit that stuck: when he outlined some problem he had at the time, and I was like "maybe bayesian networks would be good for this", I felt rather silly saying so since it'd been his book where I learned about bayesian networks.


Ten years later, I still struggle to get involved in language standardization efforts.


Anyone here from the 2008 graduating class?



