John Carmack discusses the art and science of software engineering (2012) (uw.edu)
123 points by hypr_geek on Aug 20, 2013 | 42 comments



"I would like to be able to enable even more restric­tive sub­sets of lan­guages and restrict pro­gram­mers even more because we make mis­takes con­stantly."

I like this very much. For all the talk of power in languages, we make lots of mistakes. One overlooked decision factor in choosing languages should be, "How hard is it to make accidental mistakes?" and "How easy is it to find and fix mistakes?"


"In terms of software, the languages Ada and C have very different attitudes to freedom. Ada introduces restrictions and checks, with the goal of providing freedom from errors. On the other hand C gives the programmer more freedom, making it easier to make errors.

One of the historical guidelines in C was "trust the programmer". This would be fine were it not for the fact that programmers, like all humans, are frail and fallible beings. Experience shows that whatever techniques are used it is hard to write "correct" software. It is good advice therefore to use tools that can help by finding bugs and preventing bugs. Ada was specifically designed to help in this respect."

https://www.adacore.com/adaanswers/gems/gem-30/

(And if even Ada is not restrictive enough, one uses Ada's subset SPARK, e.g., http://www.spark-2014.org/ )

Not to say that one should use Ada (though it would often help I guess), but this different approach to software design and maintenance ("Ada culture"[1]) is worth knowing.

[1] http://archive.adaic.com/intro/ada-vs-c/ada-vs-c.html

My personal rule of thumb:

C and Scheme for explorative hacking, Ada and Haskell for reliable implementation, Perl for teh recreational lulz.


> "Ada introduces restrictions and checks, with the goal of providing freedom from errors (...) Ada was specifically designed to help in this respect. [write correct software]"

And it is a huge failure. No amount of runtime checks can make your already compiled and deployed program less wrong.

> (Taken from the Ada vs C++ comparison article) "Stroustrup (...) presents a user-defined vector class (...) Unfortunately, the proper implementation of the class requires exceptions (...)"

A proper implementation of the class requires dependent types. Runtime index checks are effectively testing, which can show the presence of bugs, but never their absence.
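
For what it's worth, here is a minimal sketch of the idea (in Haskell, with GADTs standing in for full dependent types; all names are my own invention):

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    data Nat = Z | S Nat

    -- A vector carrying its length in its type.
    data Vec (n :: Nat) a where
      VNil  :: Vec 'Z a
      VCons :: a -> Vec n a -> Vec ('S n) a

    -- An index that is valid for vectors of length n by construction.
    data Fin (n :: Nat) where
      FZ :: Fin ('S n)
      FS :: Fin n -> Fin ('S n)

    -- Total lookup: no out-of-bounds case exists, so there is no
    -- runtime check and no exception to throw.
    index :: Vec n a -> Fin n -> a
    index (VCons x _)  FZ     = x
    index (VCons _ xs) (FS i) = index xs i

An out-of-bounds access simply fails to typecheck, instead of being caught (or missed) by a runtime check.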


Testing does show the absence of bugs that you test for though.


Couldn't you argue that Scheme is on the safer side of the spectrum? (Or maybe it's just how I was taught Scheme)


> should be, "How hard is it to make accidental mistakes?"
> and "How easy is it to find and fix mistakes?"

Sure, accidental mistakes are avoidable. But I don't think there is a way for the language to help us write proper abstractions. I mean, if one doesn't know how to take mutable state out of a bunch of functions and abstract its mutation away, i.e. allow it to mutate inside a single function only, then what can the language do? But what if you simply have inconsistent code, like handling errors differently in different places?
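
(For reference, Haskell's ST monad is one language mechanism for exactly this kind of confinement - mutation is free inside, but cannot leak out. A minimal sketch:)

    import Control.Monad.ST (runST)
    import Data.STRef (modifySTRef', newSTRef, readSTRef)

    -- Mutates an accumulator internally, but runST guarantees the
    -- mutable state cannot escape: callers see only a pure function.
    sumSquares :: [Int] -> Int
    sumSquares xs = runST $ do
      acc <- newSTRef 0
      mapM_ (\x -> modifySTRef' acc (+ x * x)) xs
      readSTRef acc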

Would you use a language that warns about a hypothetical bug if state changes are too far away from each other?


I think this is a ridiculous attitude. Mistakes are a given. People will always make them. Creating ever-expanding lists of arbitrarily prohibited language features just makes code more verbose and cumbersome to write and makes everybody's life more difficult. The idea that mistakes are caused by too-permissive languages is bankrupt.


We already see this in culture, in business and other public activities, even religion. The language is restricted in various ways to prevent errors that can arise in an unrestricted environment. Unfortunately, this leads to compartmentalization, which can hide rot and bloat of various flavors.

I'm not exactly sure what the solution is, but if history is any indication, these linguistic systems are often considered foolproof, to the point that those managing them stop managing them, or hand management off to someone and forget about it, letting any issue outside the scope of the language become a cancer of sorts that eventually kills the host.


> "The language is restricted in various ways to prevent error that can result in an unrestricted environment."

Sadly, this is not true for us programmers. By and large, we have preferred to devise mechanisms (such as exceptions and contracts) that protect us from the consequences of our mistakes, rather than attacking the root cause of those mistakes - our failure to enforce pre- and postconditions using static techniques. And this situation is not going to improve as long as we stick to a limited vision that embraces integers, strings, lists and trees, but not equality of terms as types.
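
(For the curious, here is a minimal Haskell sketch of "equality as a type" - the standard propositional-equality construction, with names of my own choosing:)

    {-# LANGUAGE GADTs, TypeOperators #-}

    -- A value of type a :~: b is a compile-time proof that the two
    -- types are equal; Refl is the only way to construct one.
    data a :~: b where
      Refl :: a :~: a

    -- Given such a proof, no runtime check is needed: matching on
    -- Refl teaches the compiler that a and b coincide.
    castWith :: a :~: b -> a -> b
    castWith Refl x = x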


I'm sorry, but this has nothing to do with "us programmers." Humans, by and large, don't have a good way to tell a desired story with all of the branch points accounted for. This is why cookbooks don't have recipes with "if you cooked it too long here, switch to this recipe..."

Hell, this goes as far as how we currently give people directions to cross a state. Consider the route at http://goo.gl/maps/4F89Z - it clearly doesn't work, because there is no car that can make that trip completely non-stop (well, none that I know of). Yet it would be silly to even try to account for all of the random stops that a person making that trip would want to make. How is this the fault of "embracing integers, strings, lists and trees?"

Having looked over all of the various ways that we give directions to kids lately, I'm becoming more and more convinced that the higher theories of programming are missing something (for a great presentation on this idea, see [1]). Look at how music is written. Look at how Lego directions are written. IKEA directions. Shampoo directions. Cooking recipes. All of these are relatively successful methods for communicating often complicated sets of directions. None of them necessarily encode things to the level of detail that our programs require.

Are you concerned that all of my examples are not necessarily life and death detail oriented? See the blueprints of rockets. Sail boat designs from ancient times. Heck, sailing routes.

If anything, I would say that instead of offering fancy ways to escape the consequences, our programs should offer ways to "get back on the desired path" of what was supposed to be accomplished. Take the canonical FileNotFound: don't put my program somewhere where I have to somehow set things back up to try again; resume right here once the file has been found (a toy sketch of this below). (Sadly, I don't have any good thoughts on this. Just late night ramblings.)

[1] http://www.infoq.com/presentations/We-Really-Dont-Know-How-T...
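
(To make the "resume right here" idea slightly less hand-wavy, here is a toy Haskell sketch - the file name and the press-Enter prompt are invented for illustration:)

    {-# LANGUAGE ScopedTypeVariables #-}
    import Control.Exception (IOException, catch)

    -- Instead of unwinding the whole program on failure, keep the
    -- failing step around and re-run it until it succeeds.
    withRetry :: IO a -> IO a
    withRetry act = act `catch` \(e :: IOException) -> do
      putStrLn ("failed (" ++ show e ++ "); press Enter to retry")
      _ <- getLine
      withRetry act

    main :: IO ()
    main = withRetry (readFile "settings.txt") >>= putStr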


Interesting, in that I was thinking in terms of Python versus C++, but it's also true of real-world (human) languages. One could argue that the precision of Latin helped preserve the message of Christianity through the ages, though I am not an expert on the matter.

I do think it's reasonable to have technical environments where language choices are made for support and debugging purposes, with the highest-powered tools only pulled out on occasion.


Of course, Chomsky has many books on the subject.


Previous discussion:

https://news.ycombinator.com/item?id=4423031

Here is the link to the transcribed portion of the talk (starting at 00h:30m:02s):

https://www.youtube.com/watch?v=wt-iVFxgFWk&t=1802


Carmack's aside about his 7-year-old son learning to program brings up an interesting idea: what would happen if you taught a 7-year-old to program with Haskell? I don't have any children, but I started programming at 7, so at the very least I can conduct a thought experiment about what it would mean if Haskell were the tool I had at my disposal, instead of opaque C manuals and a hex editor I used to hack the Mech Warrior 2 binary when I couldn't beat a scenario.

My thoughts:

I probably would have intuitively grasped algebraic datatypes. It often frustrated me that I could not define new things like True and False to suit the game worlds I wanted to create.

I would have had little to no interest in math or recursion for its own sake. I'm pretty sure I would not have done well with those.

I would have probably written almost all of my code in the IO monad, with game rules defined outside (a tiny sketch of the split at the end of this comment).

Graphics programming in Haskell would have been no better than with any of the other languages.

Overall, I don't think Haskell would have stuck. I think I would have been easily frustrated, despite being able to structure programs better than I was able to in Basic. For proof of this, I look to my early abandonment of C/C++ for game development (even if the manuals were pretty much terrible). On the other hand, if I had been presented with a game written in Haskell (with source), maybe the outcome would have been different. I certainly managed to use a hex editor at that age, so if I had some goal I wanted to accomplish (like modifying the AI of a game), I think maybe anything would have been possible.
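
To make a couple of those thoughts concrete, here is a tiny sketch of what I mean by game-world datatypes and pure rules under a thin IO shell (all names invented for illustration):

    -- Defining "new things like True and False" for a game world.
    data Element = Fire | Water | Earth
      deriving (Show, Eq)

    -- A pure game rule, defined outside of IO.
    beats :: Element -> Element -> Bool
    beats Water Fire  = True
    beats Fire  Earth = True
    beats Earth Water = True
    beats _     _     = False

    -- The thin IO shell around the pure rules.
    main :: IO ()
    main = print (Water `beats` Fire)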


I had tried to learn programming since I was 11. First I played around with graphical programming in Game Maker, Construct and RPG Maker. Later I tried out VBasic but quickly felt it was boring. After that I bought K&R and tried to learn C, which was way too hard for programming games beyond simple parsers and text adventures. After that I played around a little bit with Python and Scheme. It never got too interesting since I wasn't able to write any "real" programs.

Later, when I was 15, I got invited to a private IRC network of cool and friendly programmers. One guy did a Haskell course and sent me the slides from the lectures. I did some exercises with them, and everything I didn't understand I looked up in LYAH and RWH. After I was able to complete all the exercises (which, by the way, were very simple ones), I started reading Write Yourself a Scheme in 48 Hours. It taught me more about the practical use of monads, how to use external libraries (Parsec in this case), and basically how you design a program from scratch. Afterwards I tried to implement macros for it but failed; I wrote a simple IRC bot that used my Scheme implementation to evaluate expressions, and a simple blog powered by Scotty and Blaze-HTML. Now I'm contributing to a recently released game engine that uses FRP - I've written some really small games with it and contributed a little bit of code.

What I think made Haskell stick was that it was fun and easy to start writing "real" programs with. :-)


I'm curious, what makes a program 'real' and what about Python and Scheme got in the way?


I don't consider simple mathematical algorithms "real" programs, nor simple text adventure games. I dunno if anything got in the way with Python and Scheme; I just wasn't able to write anything useful or fun, so it got very boring. In fact, learning Racket using "How to Design Programs" might have gotten me started much quicker, but it was too basic; I needed something more challenging while not being too difficult.

When I think about it, Scheme is probably a great option for beginners. It's simple and functional, has a great development environment for beginners (DrRacket) and two great beginner books (SICP, HTDP2). Also, since it's not purely functional, those who learn it won't need to learn abstract concepts like monads to do useful stuff in it. :-)


It sounds like the docs you had focused on functions and console I/O and never got to graphics/mouse/web.

The lessons of How to Design Programs are really well done, in the sense of reliably taking people from beginner to real programmer, yet everyone who's offered an opinion to me found it kind of tedious. I get the impression it's great for classrooms, and there's a need for a resource like it aimed more at the self-motivated. Maybe that's http://realmofracket.com/


I don't think the tools available really limit a beginning programmer. They'll hack away in Visual Basic or use a hex editor on their favorite programs.

What they really need is a mentor, learning materials, and the drive to create something. The smartest kid in the world isn't going to build anything useful or interesting if he doesn't care about it. He'll be assisted big time if he has someone to give him a sense of direction. It's amazing how much a 'dumb' but determined kid can get done, even if underneath it's a horrible jumble of spaghetti code.


I'm far from being a professional programmer, and until the last year of my life I'd never written anything work-quality. However, you've pretty much nailed it on the tools needed for a beginning programmer: they'll learn whatever it takes, if it's for a game or something else they're equally passionate about.

That's exactly how I started programming, with QBASIC gorillas, savegame hex editing, and Robot Battle AI programming. I eventually muddled my way through QuakeC, because it was for fun. It was only really at this point that I was ready for proper training materials.


When you're that age you're entirely goal oriented. You don't care about structure, you don't care about speed. You wanted to hack Mech Warrior 2. You had a very specific goal, so whatever language helped you do that the easiest would be the right one for you. Whatever language and toolset is the easiest to get to the point where programming is fun is the right language for a kid.

ActionScript was actually great for me. I wanted to make Flash games, and back in the AS2 days you could add code to frame 0 of MovieClip objects. It was ugly, but it was the easiest way to get something to happen. It also tricked me into learning a form of object-oriented programming.


There exists at least one UK university which uses Haskell as a teaching language in its CS department. Not quite 7 year olds, but certainly a good percentage of fresh programmers.

Does it result in better programmers? Perhaps. It certainly forces you to understand recursion.


My university used Haskell as the intro language for the math department. People who were already good at programming and people who had never programmed before picked it up quite quickly, but people in the middle, who knew a bit of programming coming in, had a terrible time with it.

However, people who'd never programmed before the Haskell course had a very hard time transferring that knowledge to Java and similar languages, which were used in some other courses.


I remember learning Haskell at Imperial College London - a very nice language. I had never tried a functional language before and just loved how "natural" it felt after getting familiar with it. I would love to re-learn it, but I am thinking that from a career point of view I am better off learning Scala/Clojure than Haskell.


Haskell might actually be pretty useful in the long run. (Disclaimer: My whole career so far has been spent doing OCaml and Haskell.)


I wasn't a CS student but I remember all the CS students at Bristol moaning about Haskell giving them a headache.


I started coding when I was 10. I think what made the difference for me was that I started out in QBASIC and it had excellent documentation within its help menus.

I didn't have any books or anyone knowledgeable to learn from at the time, but I was able to constantly look through that documentation to try to figure out what I wanted to do.

I had to make meticulous notes on everything, and make sure I really understood something in order to use it to solve a problem I was getting stuck on.

I think that hard-won knowledge was excellent, and it really taught me the value of API documentation and being able to find the answers myself just by reading that documentation.


I don't know. That is kind of like asking: what if you try to teach a 7-year-old calculus? Should you teach them calculus with math, or try to just explain it through word problems? It makes no sense.

Sure, you can take shortcuts by skipping discrete math and teaching the "act" of programming without the understanding. But in the end the seven-year-old isn't really learning programming then, is he/she?


If you don't teach a seven-year-old calculus, are you really teaching them math?

I understand you can't teach everything upfront; that's not my point. I remember my 5th grade class, where several people said they wanted to be mathematicians when they grew up. The reality usually sets in by college, that all the 'math' we learned in the classrooms was just a tiny, tiny corner of a vast field of study. So similarly, I thought the 'programming' I was learning as a beginner was the real deal. When you're starting out, you never see the invisible, the unknown unknowns, the hidden complexity; you only see what's in front of you, and it has to make sense all on its own.

So there's this huge, intangible mess of complexity that is 'math' or that is 'programming', and you can't explain it all at once. They won't have any of the prerequisite knowledge, and you can't waste time covering all that. So you have to choose 1 concept out of the 7000, and you have to try to teach it.

There could be C programmers that know nothing about monads, so it stands to reason you could teach lots-of-C before you teach anything about monads. Is the opposite true? Are there Haskell programmers that know nothing about pointer arithmetic?


Good point regarding the math analogy, I feel it was a rather poor one now.

But I feel that the difference is between knowing and understanding. You can know how to get the browser to display a rectangle with the right copy/pasted HTML.

But without discrete math there really is no foundation for understanding the underlying algorithms. Without understanding, it is impossible to know how to extrapolate those examples to solve different but similar problems, probably the most important skill a programmer can have.

Teaching C before discrete math is like teaching English without a vocabulary. Sure, you can understand the syntax, but without understanding algorithms or data structures you are going to be lost very quickly. You might not need monads for C - since it is such a primitive language - but try writing C without iteration or recursion.


Sure. If Haskell is your first language (like they teach it at some universities), why would you necessarily know anything about pointers?


You're right. What I was trying to say by "Haskell programmer" was more like an industry veteran: 10+ years of experience in Haskell and the like, zero knowledge of pointer arithmetic.

There seems to be this ineffable force tilting the scales to create more C veterans that don't know monads (I could find you one) than there are Haskell veterans that don't know pointer arithmetic. Age, popularity, libraries, etc all tie into this, but I still think there's significance amidst the noise.


An argument based on mind share alone can explain the difference. (I.e. almost everyone will come into contact with C.)


Carmack is this living legend of programming, hacking, making magic happen with the computer. And yet his writing and the concepts that interest him are often simple and approachable.

How does he manage to look at this pile of hacks with beginner's eyes? Somehow he does, and he sees it for what it is without judging it too harshly or prematurely. I love reading his blog.


He's someone who knows his stuff so completely that he's able to explain it clearly to non-programmers. I think there's a famous quote by Einstein: “If you can't explain it to a six year old, you don't understand it yourself.”


> With the NASA style development process, they can deliver very very low bug rates, but it’s at a very very low productivity rate

There's some more, somewhat old and non-technical, information about NASA's software development efforts here [1].

[1] http://www.fastcompany.com/28121/they-write-right-stuff


Great article! As an aside, my, Fast Company has really gone down the tubes...


Go to YouTube and search for 'quakecon 2013 keynote part 1'. It's a great talk/keynote by John. One of his best.


I wonder when Carmack and Rob Pike will sit down and save the software engineering world...


I'm becoming increasingly convinced that the way forward in software engineering isn't "better" languages like Haskell or Clojure, but with better tooling.

I think Carmack's talk shed light on the fact that as humans we have very limited cognitive capacities when it comes to programming and "smarter" languages help us a bit, but we need automated, "smart" tools - like static analysis tools.

I think of Visual Studio and ReSharper, and how much ReSharper finds that I can learn from, and how it frees me to concentrate on business logic.

Years ago I remember reading an article or an interview with a Sun research scientist where she described programming of the future where languages and tools will be much more lenient of your mistakes and programming will be much more of a two-way conversation with your tooling.

That and tools like language workbenches are the only ways I see to go forward. Everybody using Haskell is just a modest (if that) step forward.


> programming will be much more of a two-way conversation with your tooling.

Do you have any experience programming in Haskell? I ask because this statement exactly describes my experience with it. The language goes extremely far down the path of "static analysis" with its type system. Couple this with simple tools such as ghci, flymake and haskell-mode for emacs and you have a very interactive system with an enormous amount of feedback.


Better languages _are_ better tooling. To make it more concrete: thanks to the difference in languages, GHC can do much more static analysis than gcc. More refactoring support would also be possible, even if that's not done in practice for Haskell yet. HLint is nice to toy around with, though.
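
For a flavour of what HLint catches (the hint wording below is approximate, but "Use concatMap" is one of its stock suggestions):

    -- Before: HLint flags the concat/map combination.
    allWords :: [String] -> [String]
    allWords xs = concat (map words xs)

    -- After, as HLint suggests ("Why not: concatMap words xs"):
    allWords' :: [String] -> [String]
    allWords' = concatMap words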



