Coding is not the new literacy (chris-granger.com)
323 points by oskarth on Jan 26, 2015 | 146 comments



In Zen and the Art of Motorcycle Maintenance, the author talks about the importance of being able to carve up ideas in different ways to understand them more thoroughly. I'd say knowing how to code, as in going through the process of understanding how to think through a logical sequence of steps, is a highly effective means of handing someone a proverbial scalpel.

In my experience, most people cannot break things down logically, nor do they appreciate what it takes to provide clear instructions. There is an emphasis in the world on communication, which is a fine start, but being able to convey an idea clearly only becomes helpful once you understand the idea clearly.

Besides programmers and mathematicians, the only people I've met with the faculties to properly break things down are lawyers*. Law school trains you in the ability to approach an idea from different angles in order to identify attack vectors. As such, lawyers tend to be highly capable of feats of logic but are equipped in such a way that they are almost entirely adversarial and deconstructive.

So while I completely agree with Chris's point and understand that this is a banner he raises constantly, I'm a little disappointed to see him attack this initiative to get more people into coding. I'm all for better modeling tools and better teaching practices, but I think teaching people to code is a wonderful solution in the interim. Framing an argument like this will only stir controversy in an otherwise worthy cause.

edit: I've met plenty of people who aren't programmers, mathematicians, or lawyers that are logically capable. But as a general rule of thumb, those are the only three professions I've interacted with personally that have consistently shown that aptitude. Either way, please don't let this personal anecdote distract from the overall message; I only left that comment in there because I wanted to emphasize how rare the ability is across most disciplines.


> I'm a little disappointed to see him attack this initiative to get more people into coding.

That's because I'd rather people have the freedom to be mathematicians, nurses, lawyers, physicists, writers, and accountants while still being able to leverage computers. Instead of forcing people to become professional programmers to get the computer to model things for them, we should focus on teaching people how to model in general and build more tools like Excel that let people do that.

Hell, I'd stand behind a movement to just teach people Excel. With Excel, we don't have to spend the next several years worrying about what flavor of MVC we have to use. Instead, we can focus on doing actually important things like curing cancer.

> Besides programmers and mathematicians, the only people I've met with the faculties to properly break things down are lawyers.

My point is we should fix that, instead of focusing on teaching people Python.


So much this...

The goal should be for programmers to provide tools with interfaces that one can use without actual knowledge of "the machine".

Imagine if driving a car required you to have intimate knowledge of combustion engines, mechanics and the electronic systems therein vs just pedals + wheel.


> Imagine if driving a car required you to have intimate knowledge of combustion engines, mechanics and the electronic systems

So, for this analogy to be fair, we'd need to have computers which you can't operate unless you know how to use transistors to build logic gates, how to build your CPU out of those, how this CPU executes instructions and so on. Of course, that's "vs just keyboard and mouse".

Of course there's room for improvement, but current computers are not orders of magnitude harder to use than cars. I'd say they never were, but right now they're not for sure: a 3-year-old kid won't drive a car, but he can have some meaningful and fun time with a tablet.

Learning programming is a secondary issue in my opinion. A "new literacy" is something else: it's the ability to stop and think about how the pieces fall together, how they work, how you can make them do what you want. Programming is certainly a way of learning this ability, but it's also full of pointless ritual and irrelevant details, and it operates on a completely wrong level of abstraction for the normal user. I don't know what the most efficient way of teaching this to people is, but I strongly believe that we need to teach it. If we have no better way, then let it even be via programming; it's still better than nothing.


Of course there is some amount of hyperbole in the analogy.

Computers ARE orders of magnitude harder to use than cars. A car has (at a simplified level) a single wheel that goes either left or right and 3 pedals. The hardest concept relating to the hardware is what a gear is and why you need to shift it (among a few others).

It is correct, though, that software CAN be simpler than that to use. That kind of software tends not to solve complex problems in our real world, though.

When it comes to 'coding as literacy', this doesn't have much to do with using a highly simplified UI. It has to do with the fact that if you want to use most of the features of the machine we call a PC, you HAVE to be able to program.

Sticking with the car analogy my kid can have great fun if I show her how to use the horn in the car. It will not allow her to do anything meaningful with the car (getting from A to B). All she can do is use the car as a toy.

I agree with you on the core problem though. Things are abstracted away from everyone in our daily lives in such a way that people sometimes are unable to be 'precise'. By that I mean the ability to fully describe a problem and formulate an executable solution. I personally don't think programming is the solution to that. Personally, I learned that concept in school in philosophy class.


Every computer program ever written constitutes such a tool.


To add to this, I'm a developer and I have relatively little knowledge of what is really happening with the machine. Where a layperson knows something like 0.1%, I know something like 1%.

The level of detail exhibited by some in the recent "what happens when you type google.com and hit enter" thread made me further realize just how little I understand.


That's most likely because you're 'late to the party'. I was lucky enough to be born just when all this stuff got underway so I saw it progress bit-by-bit in slow motion and over 3.5 decades I could absorb the changes. If you're dumped in on the deep end in 2015 or so then it can be a bit overwhelming, but keep the faith, it will all clear up in the longer term and you'll be so much better positioned to actually achieve something. Higher level stuff really is higher level, you get more done with less writing (even if that comes at a price in the form of a loss of contact with the lower layers).


This is a bit scary. I would not be able to write functional programs without knowing at least some of the infrastructure the code runs on (albeit not down to the NAND gate). If I were a dick I would guess web programming :P.


If I were a dick, I'd guess you haven't been in the field long enough to know everything that you don't know?

Even if you're, say, a competent assembly programmer, I'm sure there are plenty of areas you're unfamiliar with. Perhaps cryptography, the details of JIT compilers, or BGP? There's just a lot to know. Or perhaps you're just unusually talented.


Don't be dicks, either of you.


Word. Tried snarky sarcasm, forgot how people react differently, failed. Apologies!


> we should focus on teaching people how to model in general

Classes on logic and/or learning theory can accomplish that goal without any tools whatsoever. In fact, I'd argue we should teach people Visio instead of Excel if the only thing we're shooting for is mental modeling. I've always thought the world would be a much better place if people leveraged concept maps more frequently.

> My point is we should fix that, instead of focusing on teaching people Python.

Again, I really get your point about not wanting a generation full of programmers, but having gone through programming classes in HS I think it's a non-issue. I think I'm the only person from my HS programming class who went into programming as a trade. If anything, the thing worth attacking is the typical emphasis on the programming language as opposed to the process of programming. My HS 101 programming class was done with an overhead projector, mapping out what a loop was actually doing and visualizing the memory allocations being modified step by step. That's the sort of thing we need more of. Just shoving syntax down a kid's throat will do no good; on that we are absolutely on the same page.

P.S. I think you're doing amazing work. Thank you for all you're trying to contribute to the field and the world.


> That's because I'd rather people have the freedom to be mathematicians, nurses, lawyers, physicists, writers, and accountants while still being able to leverage computers

All of the mentioned professions make heavy use of computers. I do agree teaching basic skills with tools like spreadsheet programs, etc, is fantastic. But at some point in time, it will become a roadblock to not know how to write a basic script to automate some repetitive/remedial task.

That's not to say everyone should learn how to write massive enterprise software, or have deep understanding of how a kernel works. Rather, it's to say people need to learn how to make the computer work for them in the way that suits them best, not just how some Vendor thinks they should work. Being able to write some quick script simply requires problem solving skills.
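To make "quick script" concrete, here is a minimal sketch of the kind of repetitive task meant here, in Python. The CSV data and column names are invented for illustration; the point is only that a few lines replace a chore otherwise done by hand:

```python
# A sketch of the kind of small automation script a non-programmer
# might write: totaling a column from a CSV export. The data below
# is a hypothetical stand-in for an exported spreadsheet.
import csv
import io

EXPORT = """department,headcount
news,12
sports,7
opinion,4
"""

def total_headcount(csv_text):
    """Sum the headcount column across all rows of the export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return sum(int(row["headcount"]) for row in reader)

print(total_headcount(EXPORT))  # prints 23
```

Nothing here requires knowing how a kernel works; it's the "complete sentence with a slightly different rule" level of skill.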

There is a significant advantage to the journalist who can write their own SQL and query data out of a massive database by themselves without having to depend on someone else. There's a massive advantage to the lobbyist who can better understand statistical trends visually because they learned how to represent, consume and model the data with R.
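As a rough illustration of how small the journalist's query can be, here is a sketch using Python's built-in sqlite3 module with an in-memory database. The schema, table name, and figures are all invented for illustration:

```python
import sqlite3

# An in-memory database standing in for the newsroom's real one;
# the donations table and its contents are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donations (donor TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO donations VALUES (?, ?)",
    [("Acme PAC", 5000.0), ("Acme PAC", 2500.0), ("Beta Fund", 1000.0)],
)

# The kind of question a journalist might answer directly,
# without waiting on anyone else: who gave the most in total?
rows = conn.execute(
    "SELECT donor, SUM(amount) FROM donations "
    "GROUP BY donor ORDER BY SUM(amount) DESC"
).fetchall()
print(rows)  # [('Acme PAC', 7500.0), ('Beta Fund', 1000.0)]
```

The SQL itself is two clauses beyond a plain SELECT; that is the scale of skill being argued for, not enterprise software.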

There are fields where non-programmers are the only ones who fundamentally understand their domain well enough to write software for it. Take the various sciences. How can a non-chemist accurately write chemical modelling and simulation software without first having to learn how things work? Not very well... even if they designed the software based on some spec, it's unlikely all of the professional chemist's domain knowledge from 30+ years of in-field research will accurately translate into a hired gun's handiwork.

The main issue I see in practice is people are either afraid to begin, or get some glassy-eyed look when the word "programming" is brought up. To me, that is incredibly frustrating, since programming is nothing more than writing down some words in a grammatically correct structure... "you can form complete sentences, right? Good, so you can now write a complete sentence but follow a slightly different rule."


> The main issue I see in practice is people are either afraid to begin, or get some glassy-eyed look when the word "programming" is brought up.

In non-technical companies I see turf protection by IT departments as part of the root cause of this. They set themselves up as wizards who are the only ones able to make the computers do their magic and make the task seem much more difficult than it is. After all, if journalists, lobbyists, customer representatives, and such were able to write their own SQL queries, then Bob the SQL Guru™ would be out of a job. Plus, you would have to give these unwashed masses access to the systems.

I'm being a little harsh to IT departments in my caricature, but my experience has been that 90% of them make the other 10% look bad. IT at my wife's company is particularly bad; if she wants data out of their database she has to request a database report from a group in Kuala Lumpur (she is in Dallas). She has to basically write the SQL query in plain English in the request in order to get exactly what she wants, because from what I can tell this group is basically an English->SQL translator. If the data is not what she wanted she has to do it all over again. Thus it takes her days to get what she could get in minutes if she had her own MySQL client. On other things I have offered to show her how to use Python to automate things, but those conversations always end with "IT won't let us install anything". In that kind of environment I'm not surprised that people who have a natural reluctance to programming would have said reluctance amplified.


Wherever I've worked, the IT people protect their turf because they deal with end users who get MS Access or Excel, and who then get data from the company through force of management. Those users create their own computer systems on their desktops, and the department becomes dependent on them.

Then when that person leaves, or confidential data gets out, or an OS upgrade screws up the ad-hoc system they created, who's responsible? IT. IT has to now learn about, repair, and support this system they didn't know about or budget for. It's even worse when a non-IT area hires their own programmer who thinks that IT is "protecting their turf", and so that dev does some skunkworks thing without any consultation.

If you've ever managed corporate IT, you know how these little systems come up. And you learn why IT wants to control it. Because the average person has their job to do, and they're learning computers on the side, and only enough to make something that barely works.

So a little bit of training early on could indeed be a good thing. It would solve many issues. :)


My experience has been that these little systems come up because we can't get IT to do what we want and have to work around them to get our jobs done. That's not to say all requests or IT customers are reasonable; some ideas are stupid or infeasible. Labelling those ideas as such without addressing the underlying business need that birthed them does no one any good. I rarely see IT organizations try to understand that; they judge requests primarily on technical merits.

> So a little bit of training early on could indeed be a good thing. It would solve many issues. :)

What we are talking about in the context of the article is a lot of training. It would give non-technical people the ability to go beyond something that barely works. It would increase the number of independent micro-systems, unless IT departments are willing to start really listening to their customer base.


> Plus, you would have to give these unwashed masses access to the systems.

This is unfortunately the kind of thinking I see with people I know who are tasked with system administration. They take pride in their work, setting up infrastructure to run unnoticed in the background, but I've noticed a tendency to simplify the job to extreme by making sure said infrastructure can't really be used. After all, if no one uses your system, they won't break it and you won't have to fix it. Users, instead of being customers, become adversaries.

I don't know anyone who does that on purpose, but I see the tendency to go there unconsciously. Restrict this, limit that, block everything. Principle of least privilege. It all makes sense from a security POV, but when applied internally, users have to fight with support just to get their job done. I especially dislike this when it's happening in an educational context (schools, universities, hackerspaces) - computer systems there should serve as opportunities to tinker, learn and explore. Strict limitations significantly hinder the usefulness of the infrastructure while saving the admins only a little work.


Exactly. I have yet to deal with an IT organization[1] that does not have at least some level of adversarial relationship with its userbase. The level of adversarial behavior seems to be directly proportional to the computer knowledge of the user, as well. It is much better to be completely ignorant, or feign such, than to think you know what you are doing.

[1] IT groups that aren't large enough to be organizations, say 0.5 - 3 people, don't seem to have this issue as often or as badly.


>> "But at some point in time, it will become a roadblock to not know how to write a basic script to automate some repetitive/remedial task."

Isn't that the opposite of what's happening? 20 years ago it was necessary to know stuff like that to make full use of a computer. Now we have programs like IFTTT.com or the more powerful Automator on OS X which allow us to do those things using a GUI, without having to understand how the scripts work or are written. Why can't we continue building tools so that people don't have to waste time learning the nitty-gritty? Computing has been getting easier and easier, so much so that 1-year-olds and 90-year-olds can use computing devices. Why does everyone seem to think that in the future we will all be coding, when that has been becoming less necessary as time has passed?


Interesting discussion by Seymour Papert & Bret Victor: http://worrydream.com/#!/MeanwhileAtCodeOrg

The point is using computers to help people reason (because experimenting and seeing results in real time is what enables people to create new and better ideas), not teaching coding for coding's sake (to create more programmers).


I agree that people should learn the value of modeling and know how to do it.

But in order to model, people will need tools. Natural language, even when very carefully phrased, is insufficiently clear and precise. And predicate logic isn't sufficiently numerical to support quantification, which should be central to most models.

Some kind of formal notation is required, both to specify the model and to exercise it -- vary its parameters and evaluate the results.

I see no alternative to using some sort of programming language here. Personally I'd prefer it be a lot simpler and purposeful than Python. (Perhaps Z or UML?) But there's a lot of value in being able to manipulate everyday data as programmers do, now that data is omnipresent in everyone's personal and professional lives.

Possessing basic skills with which to fix your car or repair a shirt has sufficient value that I can't see why anyone might prefer not to know how to program. It's just a matter of matching the right cognitive impedance in choosing the tool.


If literacy is solidifying our thoughts such that they can be written, then the first way we teach children is by finger painting. Here the seed is planted, but literacy is not so disconnected from putting words on the page. Literacy demands more than just expressing thoughts; it requires that we do so with common tools.

Teaching people Excel can be a first step towards computer literacy (modeling), but the first way we commonly teach children today is by gaming. World building, puzzles, RPGs, and many more are genres that create a representation of a system that can be explored or used. Some games like Minecraft can do both literacy and computer literacy, where thoughts and models are both present.

This is the point where I disagree with the article's concept, since you then need to teach children the common tools with which composition, comprehension, and modeling are practiced.


Seriously! Below some people are worried that Excel, "while effective", will make people write unmaintainable VBA scripts. Who cares?

We aren't training people to one day invent NP-complete algorithms. We want their life to be better with a computational understanding.


A friend of mine is a lawyer; she's been talking about this lawyer-programmer similarity for a while. She tried to organize laws from the more textual representation into a more diagrammatic one, which in a way is a step towards representing laws like programming code: http://www.lexagraph.com/category/diagrams/


I am, however, sympathetic to Chris' point that just teaching people how to code doesn't necessarily accomplish anything.

I think this concern actually strikes at the core of the difference between programming as a "trade", versus the more engineering or scientific side of programming (under its various names). One may be perfectly proficient at working with, say, the entire iOS API, while still having little idea (or concern) about representing any sort of non-trivial problems.

Having taken both college-level and subsequently university-level engineering courses in programming/CS, I think the difference between the two roughly aligns with the trade vs modelling point-of-view.


I did a critical thinking A level when I was 17 and I remember thinking "why aren't we taught this from the beginning?"


Analytical philosophers are also fantastic at this. Though formal logic could also be argued to be a branch of mathematics.


Why do you think people have trouble breaking things down logically?

I think, one, emotions demand immediate action; when no immediate action is apparent, you get more emotion. Breaking things down is a detour - if it weren't, I don't think I'd hear "a journey begins with a single step" or "baby steps" nearly as often as I do. Two, most people don't have as strong an urge to analyze, which is the ability to break things down and look at the pieces individually.

For these two reasons I think it can be so helpful to talk things over with someone else - they're not feeling the emotion so they can be more objective and help break things down, or they have more experience to help fill in the gaps in your breakdown.

Also, when you say most people don't appreciate how to give clear instructions... that's a tricky line. On one hand, people should (and don't) think things through enough to shape their general idea. I'm a growth guy, and it's surprising how many people basically say "we need more users, can you help?" But why...? How is their retention, activation, monetization? If those are not good you're just throwing money away.

But I don't think you should expect people to spell things out line by line. At a previous job I served briefly as a temp product manager and I felt irritated that I had to write the algorithm for the engineering team. When you start crossing into "how", I think you're giving too much instruction.


>Besides programmers and mathematicians, the only people I've met with the faculties to properly break things down are lawyers. Law school trains you in the ability to approach an idea from different angles in order to identify attack vectors.

Well, reporters (good ones) are like that too. And people doing commissioned work based on customer specifications (graphic designers, jingle composers, etc) learn to attack a problem in many ways (at least as far as their domain is concerned).


> As such, lawyers [...] are equipped in such a way that they are almost entirely adversarial and deconstructive.

I wonder how many lawyers (out of the total) are involved in writing contracts or helping draft laws? Both of those require using that "what could go wrong" intuition in the service of building something strong.


Agreed on the rarity of the ability to logically break things into steps.

I'd like to add... I've met a few construction project managers that appear to have it.


There is a reason that the two top scoring majors on the LSAT are engineering majors and philosophy majors.


Your experience seems pretty shallow and narrow if the only people you've met who are capable of "properly" breaking things down are programmers, mathematicians, or lawyers. I've known plenty of people outside of those disciplines as well as within them who are quite capable of thinking about a problem logically and from many different angles.


> Besides programmers and mathematicians, the only people I've met with the faculties to properly break things down are lawyers.

Did you just forget the entire engineering profession or was that an intentional omission?


I think there's a class of engineer that absolutely fits the bill of being able to break a problem down to its atomic parts and then show how it all comes together. In practice, I've found many gloss over the details (a variant of the 80/20 rule). In my case, industrial process engineering, I've found many talented, smart engineers develop solutions that trip up from missing a detail that would have been obvious from a deep inspection of the problem.

Of course, you can find such individuals anywhere, like the parent noted. But I can say that in my experience, engineers tend not to have that level of logical rigor. (Edit - or: 'that level of pedantry')

Note: I don't work with Professional Engineers, which is literally a different class of engineer. Nor am I implying these engineers are bad at solving technical problems - just that such programmatic thought is not a typical tool in their (sizable) heuristic toolbox. At best, computing/programming is a tool of last resort.


I'm mainly attacking the fallacy that programming is something special with respect to rigorous problem solving and decomposing problems. All fundamental engineering rests on that ability. Those abilities are fundamental to circuit analysis, electromagnetics, finite element analysis, fluid flow calculation, load calculation, the list goes on. It is a fundamental skill in engineering. In fact, I will go so far as to say that a "real" engineer is one who needs to exercise this skill in order to do their job properly.


Or philosophers, or linguists, or arts majors. Honestly, the whole view that only programmers know how to break things down is incredibly myopic, and somewhat ironic given the topic at hand.

The new literacy is the same as the old literacy. Supposedly new paradigms of thought are not new, but centuries old. Those who forget history, I suppose...


OR basically any science/engineering discipline that is rooted in math + lawyers.

That should be the 2 populations.


There are also philosophers, who don't fit here: if anything, it's both math and law that are rooted in philosophy. And hell, philosophers are crazy good at this, too.


This is completely pedantic. If you want to argue that coding is different from modeling, true, but there's a great deal of modeling implied in coding. I don't think anyone would refer to someone who could only write if statements and for loops that didn't model anything as a coder.

And if we're going to be this pedantic: when you're saying "we don't need to teach coding, we need to teach modeling," you're saying the equivalent of "we don't need to teach reading and writing, we need to teach offloading data storage." How exactly do you plan to teach modeling in the void? There are other ways to model, but coding is by far the most expressive and widely applicable. The way you learn to offload data storage is by learning to read and write. The way you learn to model is by learning to code.


>This is completely pedantic. If you want to argue that coding is different from modeling, true, but there's a great deal of modeling implied in coding. I don't think anyone would refer to someone who could only write if statements and for loops that didn't model anything as a coder

You'd be surprised. That's what most "introduction to programming" courses do. And for online posts and tutorials it's usually even worse.

At best, they use the BS "cooking recipe" analogy for programming, but they totally fail at teaching people to model and to understand analysis.


> > This is completely pedantic. If you want to argue that coding is different from modeling, true, but there's a great deal of modeling implied in coding. I don't think anyone would refer to someone who could only write if statements and for loops that didn't model anything as a coder

> You'd be surprised. That's what most "introduction to programming" courses do. And for online posts and tutorials it's usually even worse.

I don't think we should refer to someone who only took an "Introduction to Programming" course as being able to code either.

I'm not sure why you think I would be surprised that "Introduction to Programming" courses teach only if statements and for loops. That's exactly what I would expect. You start off kids reading and writing with alphabets, phonics, spelling words like "cat". You aren't teaching anyone to offload complex information storage at that point, you're just teaching them the building blocks of reading and writing. You wouldn't hand Crime and Punishment to a first grader or ask them to write about Foucault's panopticon. Likewise, you don't teach people about modeling banking infrastructure or webpage layout in their first course. You start them off with the basic building blocks: input, output, choices, repetition.

There are some intro courses that do focus really heavily on complex modeling: SICP, for example, was used to teach CS 101 at MIT, which is somewhat incredible to me: either those kids are a lot smarter than me, or they really, really struggled with it. I just recently finished working through SICP, and I've been working in the industry for 7 years, and it was hard. That may work for the elite students at MIT, but to expect that from the average person starting programming is a bit unrealistic.


This is the essay of a man trying to sell a 'visual coding' tool.


>My comment was a statement of fact.

"He's trying to sell a visual coding tool" would be a statement of fact.

"This is the essay of a man trying to sell a 'visual coding' tool", is a snarky statement that insinuates of Chris' motives.

Just like "X is a Jew" is a statement of fact, but "X's essay is the essay of a Jew" is an insinuation of some (sinister?) plot.

>Maybe he was inspired to sell the visual coding tool BECAUSE of his beliefs, not the other way around.

Or maybe we should just discuss the ideas and beliefs, and leave what he wants to sell out of it? One can agree 100% without wanting to sell anything, and it's disrespectful to even hint at attributing that to marketing.


Yeah, because marketing explains everything.

I, for one agree 100%, and I'm NOT selling a visual coding tool.


My comment was a statement of fact. Maybe he was inspired to sell the visual coding tool BECAUSE of his beliefs, not the other way around.


I could show you some of my ex-colleagues' work. There was plenty of code, but not a lot of modelling. It was a pain to decipher.


To follow the reading/writing literacy analogy: If we were talking about reading/writing literacy, you'd be looking at people's ability to write novels as a measure of their literacy. That's a pretty high standard, one by which I doubt either one of us could be considered literate (I've tried to write novels--haven't finished one yet). There are some real masterpieces of code: I'd point to the Lua or SQLite codebases as examples. But the majority of novels are not at that level and the majority of code is not at that level. I certainly have no respect for, say, Stephenie Meyer (the author of Twilight) as an author, but I'm not going to look at her writing and say she's illiterate. In some sense, one could argue that it's a more important measure of code literacy for people to be able to read code than to write it.

One area where this is very important is in management of software development. A core problem I'm trying to solve right now is that I'm essentially self-managed: my managers aren't code literate, so they can't actually tell whether I'm doing a good job or not. I've written reams of shitty code in a short time because of time pressure and gotten better reviews than when I spent the time to do things right, even though from my perspective I could see that the bugs that came out of doing things quick and dirty cost us more time in the long run (or even in the next few days). It's something I believed in theory before, but now I'm in a much higher-pressure environment and I've actually made the mistake a few times. If my managers were code literate, they could read my check-ins and sign off on them. And since they're not, they're essentially glorified middle-men, and I'm not sure what function they actually serve.

The above problem would be solved if they were like your ex-colleagues. They don't have to know how to model, they just have to be able to understand how I modeled something.


The analogy to writing holds here; a lot of "literate" people produce barely comprehensible writing.


>There are other ways to model, but coding is by far the most expressive and widely applicable.

What about math?


Okay, you have me there. I think coding and math fulfill different roles within modeling, but I'll agree that speaking in very general terms, it would be hard to say one is more expressive and widely applicable than the other for modeling.


Not as approachable. Not interactive.


Weird syntax, frankly.


It is supposed to be both. In fact, you need far fewer resources to start doing math than you do to start programming.


Coding is certainly a good way. However, there's also physics, statistics and probability, geometry, and I'm sure there are more. All of them are heavily mathematical in nature, but when taught well they are certainly lessons in modeling.

Unfortunately for students in the US, they are rarely ever taught as lessons in modeling.


What Chris Granger is explaining is pedantry. It's a distinction without a difference.

I used to teach AP Computer Science. On the first day of class, I'd hand out paper and have the class make paper airplanes. I'd then ask them to write out the instructions they used and would play "human parser" as they turned them in to me, building lopsided airplanes by interpreting their instructions too literally. Sure, modeling will teach them the necessary skills as well as any programming language. I would argue that my students were coding. The difference between coding and algorithms is important, but it's mere semantics as far as this discussion is concerned.
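You can play "human parser" in code, too. Here's a toy sketch (the instruction vocabulary is invented for illustration): anything the instructions leave ambiguous simply fails, the way a too-literal reader builds a lopsided airplane.

```python
# Toy "human parser": a literal-minded interpreter for airplane instructions.
# The known vocabulary here is made up for the sake of the example.
known_steps = {"fold in half lengthwise", "fold corner to center crease", "flip over"}

def build(instructions):
    for i, step in enumerate(instructions, 1):
        if step.strip().lower() not in known_steps:
            return f"step {i}: can't interpret '{step}'"
    return "airplane built"

# The vague "fold it again" is exactly the kind of step a literal
# parser rejects and a human silently patches over.
print(build(["Fold in half lengthwise", "fold it again"]))
```

The students' instructions fail for the same reason: they assume a charitable reader.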

Yes, coding isn't literally "the new literacy," but his point derails the discussion the way pedantry usually does in political spheres. In politics, being right often gets the wrong result. Coding is an awesome, interactive way to explore algorithms and how technology works.


The simplest possible distinction: coding is writing code in a programming language[1][2][3]; modeling is breaking down systems without regard for computers or programming languages, and is not specific to any domain.

When we say "coding" the accepted definition has to do with giving instructions in a programming language to a computer. I'm saying that the fundamental skill for people has nothing to do with Python or computers at all.

[1]: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&e... - "write code for (a computer program)."

[2]: http://en.wikipedia.org/wiki/Computer_programming - "commonly referred to as coding"

[3]: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&e... - everything on the first page is talking about writing code as instructions to a computer.


Well then, by those definitions coding is to programming what typing is to writing a novel. Mostly uninteresting and barely taught at all in my UW CSE courses.


Just to be superclear, my problem isn't Chris's thesis, but that somehow learning programming is "learning how to code", which, given any decent computer science education, is definitely not the case (maybe a few weeks in an intro CS class, that's it).

So with that in mind, I don't get how they are going to shortcut 2 or 3 years of intensive specialized training with this realization, given that we already base our education on this anyways?

He also brings up something like Sherry Turkle's* software bricolage argument, which I also totally agree with. HCI was also founded on these premises (starting as a fork of the PL community focusing on VPLs). Obviously there is not much new under the sun, it's all in the execution, and I hope they succeed with that.

* not surprising since...MIT


> modeling is breaking down systems without regard for computer or programming language and is not specific to any domain

And what would you teach students to describe their models to others? Some modeling tool or technique? Yet the skill of modeling is not dependent on the particular tool used. Similarly, the skill of programming is not dependent on the specific programming language. Certainly when teaching programming, you're going to teach using a specific programming language—but if the student learns the language, and doesn't understand programming outside of that, then the teacher failed. Just as if you teach modeling, and the student learns the modeling tool used, but not modeling as a concept.


In a technical sense, you're 100% correct. If there's a better way to teach the same concepts without actually programming, that would be really useful. In a pragmatic sense, it's not helpful to the cause.

For engineers, it's easy to get concerned about the trees and miss the forest. If a time-strapped journalist comes across a highly-upvoted blog post on a tech site, the headline won't be "Engineer suggests strong alternatives to coding in school." It'll be closer to "Coder suggests kids shouldn't code."

Pedantry is a luxury our profession heaps on the world, often without regard for how our subtle nuance is completely missed.


> If there's a better way to teach the same concepts without actually programming, that would be really useful. In a pragmatic sense, it's not helpful to the cause.

These seem contradictory. We can either teach a narrow swath of people to "code" or we can search for other ways to impart the important bit: modeling. The entire argument of the essay is that we should be looking for those "really useful" ways.

> If a time-strapped journalist comes across a highly-upvoted blog post on a tech site

I'm not sure focusing on getting a message out to someone who won't read the entire post is a worthwhile pursuit.


Maybe I'm being too wordy. It's not what you say, but the way you say it. If you're suggesting an alternative means, don't sell it upfront as "don't teach coding."


Yes, coding in your very broad sense is necessary to modeling in Granger's sense. The bigger problem is that coding isn't sufficient for modeling. We can and do get lost in the details of programming languages and tools. We design "learn to code" programs that teach typing and data-entry as opposed to abstract thinking.


I'm normally very annoyed by pedantry, but in this case we face a very real risk that the pedantic difference will cause a huge difference in outcomes for a huge number of children. If it turns out that a large number of national governments simultaneously embark on a huge effort to teach kids "coding" without much if any understanding of how to teach them "modeling", then we are setting ourselves up for a generation of failure and frustration with regards to computing. (Think "computer literacy" classes that actually teach typing and maybe a little MS-Office.) Maybe a few extra kids will be nudged towards learning computation on their own. But the overall movement will miss its mark in a tremendously disappointing way compared to its potential for improving how people learn and work in the near future.


I would agree if you mean learning what an algorithm does but not how to build it. I program every day, and rarely do I have to build a specific algorithm, but knowing of them and what they do I need daily.


Just because a guitarist plays covers doesn't mean he isn't playing guitar. Every time you build a syntactically correct program, you are making algorithms. No one is suggesting you have to master linked lists to graduate--only that you should understand the very basics of how computers process information.

Just because your college course was called "Algorithms 101" does not mean those are the only algorithms or that your own code is not an algorithm.


> We build mental models of everything - from how to tie our shoes to the way macro-economic systems work. With these, we make decisions, predictions, and understand our experiences. If we want computers to be able to compute for us, then we have to accurately extract these models from our heads and record them. Writing Python isn't the fundamental skill we need to teach people. Modeling systems is.

I think he's missing something. Modeling is only a small part of what people learn when they learn to program. The real lesson of programming is to be able to create models that stand up against the cold asphalt of reality.

First of all, they have to be consistent enough to execute, and secondly, they have to be tested with data.

The hurdle everyone goes through when they learn to program is the unforgiving nature of the machine - it's not what you think you are saying that counts, it's what you are actually saying. And, that's an important lesson.


> The hurdle everyone goes through when they learn to program is the unforgiving nature of the machine - it's not what you think you are saying that counts, it's what you are actually saying. And, that's an important lesson.

Spot on, it's also a lesson everyone would benefit from as it makes you learn to think and communicate clearly.


It's a good lesson, but it also goes against the grain of how humans operate. A humane computer system will tolerate ambiguity and imprecision better than any modern day programming environment currently does. The trick is giving the human an environment to explore and learn, which can evolve into one that is robust and precise. The act of creation is a back-and-forth, but we are using tools whose roots lie in a model where if you literally sit down and type out the solution into a terminal, that is the idealized model of creation.

Imagine a way to create models that is context sensitive to the capabilities and knowledge of the user, both in general and specifically in light of the problem being modelled. Modern-day programming is one-size-fits-all and forces you to overspecify too many details correctly before you get a working program (some languages and paradigms more than others).


> It's a good lesson, but it also goes against the grain of how humans operate.

Which is why they need to learn to think better.

> A humane computer system will tolerate ambiguity and imprecision better than any modern day programming environment currently does.

I have no interest in such a thing, it's a fantasy.


A lot of junior programmers write shit code because they code and don't bother modelling anything beforehand.


s/junior//g


true


> The real lesson of programming is to be able to create models that stand up against the cold asphalt of reality.

I would argue that's just a matter of modeling enough to be good at it. I indirectly talk about this in a few places, while discussing how we iteratively craft models and in the section about kids (where I bring up debugging specifically). It's very important to work out assumptions and to be able to explore your models to find out where you've gone wrong. Programming is one way of doing that in terms of the computer, my hope is we'll find better ways of doing it in the future.


> It's very important to work out assumptions and to be able to explore your models to find out where you've gone wrong.

I agree, and that's what I was getting at. To me, it's 'programming' when we create something that has to be consistent enough for us to be able to explore it with real data. Thought experiments are something we can easily get lost in.

We can have interactive environments that we don't see as programming and they can be useful, but the discipline of being able to create those models and know that they are doing what we intend for the reasons we intend - that's programming, and people gain that discipline through environments that make them work for it to some degree.


I've been running a weekend programming club for grades 3-5 using Scratch (scratch.mit.edu) and notice some of Chris' points first-hand. For those unfamiliar, Scratch provides a LEGO-like programming environment--its roots go back to LOGO.

After much initial enthusiasm, the club is down to a core group of dedicated students. The drop-off started when I switched from directed instruction to self-guided projects. Although Scratch contains a relatively small set of core programming blocks, the amount of detail involved in building moderately complex programs (multi-level games, scrolling, interaction) is a lot to overcome for someone without experience. I think some of it relates to the "fundamental disconnect" discussed: a student might be thinking "I want the background to follow the goblin," but needs to translate that into code blocks not part of his/her everyday language. It takes practice and experience to learn the computer's vocabulary for describing a process.
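That translation gap is very concrete: "the background follows the goblin" has to become coordinate arithmetic. A rough sketch of what the student has to discover (the sprite position here is made up; 480x360 is Scratch's stage size):

```python
# "I want the background to follow the goblin" decomposes into:
# every frame, shift the background so the goblin stays centered on screen.
SCREEN_W, SCREEN_H = 480, 360        # Scratch's stage dimensions
goblin_x, goblin_y = 120, 45         # hypothetical sprite position this frame

# Move the background opposite to the goblin's offset from screen center.
background_x = -(goblin_x - SCREEN_W // 2)
background_y = -(goblin_y - SCREEN_H // 2)
```

None of that is in a kid's everyday language, which is exactly the hurdle.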


practice, experience AND intelligence


I would venture to say that what you focus on in your comment runs explicitly counter to what the GP was trying to emphasize in their post. For the people who teach kids of grades 3-5/ages 8-10, I doubt that they are in the business of dissuading children who are apparently "not intelligent" by whatever metric you imply. Kids at that age deserve every open door that can be given to them, and implicitly shutting them out of a computer science club because they can't immediately grok how to do the described task is doing the entire community a disservice. The lesson to be learned isn't that some kids can't (or won't ever) cut it, it's that the learned elders need to formulate better ways of passing along programming knowledge to the younger generation.


Great points. It's very important not to classify at such a young age. Kids in the club are encouraged to try new things, experiment, make mistakes and have fun without worrying about assessments or grades. There are no success rubrics and the computer, in this situation, is just a tool for creative expression--like LEGO or clay. Scratch works well since it's very approachable and is a fun introduction to computer programming as creative tool.

The drop-off effect is normal, just as some people choose to stop art and music classes after an initial introduction. For some, there is a "fundamental disconnect" when expressing themselves creatively with music or art and they will naturally gravitate away from those areas. However, a child should not be excluded from art class for not being able to draw a perfect portrait.

My hope is that being introduced to programming concepts in this club forms mental models that can be referenced later on when learning other subjects and, perhaps, motivate some students to further explore computer science. Either outcome is a success.


i didn't say anything about screening kids out at that age -- i'm not sure where that came from?! i was simply pointing out that there is another very important factor in explaining success at cognitively demanding tasks like programming.


Object permanence for children is interesting but not the best example. Far more interesting is the fact that children fundamentally don't understand geography at a young age because they can't see it and have to imagine it, and geography breaks something down into types (city, village, country etc.) and there's a hierarchy. Understanding geography is close to the sorts of abstract thinking required of programming.

Small children have a great deal of problems with "Paris is in France. We are in Paris." They also have almost no concept of distance as related to geography because they have no internal map of the world. So when you drive out of Paris you are still in France, but if you drive over the border into Belgium you are in another country.

So, teach kids geography.

I experienced this while teaching LOGO to small children (yes, this was years ago). Given a LOGO turtle and a set of instructions it's fascinating to watch them work out how to draw a square.

PS You might think showing small children a map or satellite view would help, but it doesn't help (much) because they have no internal way of breaking down the data presented into what they experience.
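What the kids eventually discover with the square is repetition. A sketch of that moment, recording the commands instead of drawing so the structure is visible (the function names stand in for the LOGO primitives):

```python
# REPEAT 4 [FORWARD 50 RIGHT 90] -- the classic LOGO square.
# We log each command rather than draw, to make the repetition explicit.
commands = []

def forward(steps):
    commands.append(("forward", steps))

def right(degrees):
    commands.append(("right", degrees))

for _ in range(4):
    forward(50)
    right(90)   # four right angles bring the turtle back to its heading
```

Going from eight written-out commands to that loop is the whole lesson.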


> geography breaks something down into types (city, village, country etc.) and there's a hierarchy. Understanding geography is close to the sorts of abstract thinking required of programming

It's one of the tools, but - probably because it's heavily promoted by books and articles on Object Oriented Programming - it seems to be abused to the point of making some programmers dumber than they were before.

What one really needs to grok here is that categories are arbitrary and rated only by their utility. With your example - yes, we divide countries into cities and villages, etc., but not only are there tons of exceptions breaking the perfect tree-like structure - the whole hierarchy exists only on the map, not in the territory.

So yes, I'm all for teaching people hierarchies - followed by explaining that they're totally arbitrary, that the real world is fuzzy, and that the best ways to categorize do not always yield tree-like structures.

(and maybe then people will stop listening to stories like "European Commission believes snails are fish" and then assigning stupidity to the authorities instead of themselves)


I never mentioned OOP and never would.


> So, teach kids geography.

Haha exactly! That's a really interesting example I hadn't heard before. There are lots of other fun ones: number permanence, volume conservation, etc.


It strikes me that zooming around on Google Earth might be a good way to show the relationship between what we can see around us and the fully zoomed out globe.


To paraphrase William S. Burroughs from his 1978 lectures, "Creative Reading,"

In teaching creative writing there is the implicit assumption that the student can write; that is, put words to paper.

He instead eschews the traditional method of teaching a student to write well and instead focuses on teaching them how to read well on a hunch that it will help the student to recognize good writing and why they appreciate it.

An idea I've been revisiting (as other luminaries such as Donald Knuth and Peter Siebel have gone before) is why code is hardly ever read creatively. Is it something about the procedural epistemology of certain languages? Are inductive or deductive styles easier to comprehend? Why is the artifact of our labor so divorced from its realization (source vs. execution)?

Siebel says we should approach the reading of code as the Natural Philosophers of their time approached a new specimen. Knuth, as far as I know, still advocates literate programming but admitted in Coders at Work that he doesn't read that much code.

And a new idea being popularized is data-oriented programming which suggests that the design of the program should be divorced from its implementation which should model the data flow as closely to the hardware as possible -- implying that readability is a secondary concern; further divorcing source code from intent and purpose.

Literacy when it comes to code... appears to me at this point to be an oxymoron.

It's not a novel idea to teach kids to code. I rode the tail end of a hysterical push to teach kids to code in the 80s. A difficulty for me is that the languages we have today are far more complicated. But we also have some better tools... Scratch has worked well when one curious niece and nephew asked what I was doing.

Programming need not be this intense mental task with textual interfaces and machine-oriented semantics. It's a useful skill to have: abstracting ideas to symbols and applying transformations to them. A skill worth picking up, some cynical types postulate, if you want to have a job in the next 10 years.


Thank you. Seriously, it'd be so much better (and easier) for people to master Excel than Python. Excel is interactive, undo-able, documented, shareable, graphical, etc. It's a much better tool for modeling. We don't need people to master mechanical engineering to drive a car.


Managing a spreadsheet does not equal programming by a long shot, and given large enough complexity, you'll run head first into VBA macros and abominable one-liners with so many branch points that they only make sense to their author, who could have written a program that separated the code from the presentation instead of munging up the two.

Sure, it'd get the job done, but the successor would have to rewrite it, as it can hardly be documented, plus, from personal experience, people who are versed in Excel trying to tackle data problems at companies have come up with rather brow-raising solutions. (What's this MySQL you speak of? Does it work with Excel, too?)

Programming is not just hammering away in [language], it's deconstructing a problem into smaller, less complicated problems, to generify and abstract away common properties to allow for repetition in similar contexts. It stimulates efficiency, speeds up rote administration tasks, and allows the programmer to learn something new each time he tackles a new problem.

More so, writing code should teach a person to write readable functionality, since source code is read by people while compiled code is executed. Using clear and descriptive terms to name the moving parts of a solution, using verbs in function names, and not abbreviating unless absolutely necessary are all habits that should allow a person to better state his problem domain and its intended solution.


> Programming is not just hammering away in [language], it's deconstructing a problem into smaller, less complicated problems, to generify and abstract away common properties to allow for repetition in similar contexts.

This is exactly what you learn in Excel. How to take some data, transform it slowly (column by column), check your work, save intermediate results, etc.

In minutes you can go from your idea to some crude version of it. Then you refine it.

Look, Excel is like LogoWriter. We're not asking people to design rockets with it. But it's a hell of a lot better than sitting them in front of a terminal and asking them to learn about variables, loops, print statements, etc. Excel is type and go.
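That column-by-column style also maps directly onto code, which is part of why it works as a gateway. A sketch with invented numbers: each derived "column" is a value you can inspect before moving on, just like checking intermediate columns in a sheet.

```python
# Excel-style thinking in Python: transform data column by column,
# keeping every intermediate result visible and checkable.
prices   = [10.00, 24.50, 7.25]               # column A: raw prices
with_tax = [p * 1.08 for p in prices]         # column B: =A1*1.08, filled down
rounded  = [round(v, 2) for v in with_tax]    # column C: =ROUND(B1, 2)
```

The formula-per-column habit transfers straight into "one transformation per step."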


Oh boy, we don't need any more Excel abuse than there already is...

You wouldn't believe the kind of behemoth Excel monstrosities there are out there. Because when your only tool is a hammer, everything looks like a nail...


We want people to learn computational thinking, and then complain when their amateur creations aren't up to our standard? Let it be a gateway.


Except that their amateur creations end up as being used for really important stuff. With hugely disastrous results. Excel misuse seems to be rampant in the business world, from what I've seen from my customers.

Perhaps "teaching them Excel" will lead to a deep understanding. Perhaps "teaching them Excel" will just be like how I "learned" Excel back in high school: a teacher standing in front of the class just saying "press this button, now press this button."

Excel abuse was cited as a part of the reason JPMorgan lost 6 billion dollars.

I don't know...


I don't understand your argument. Let's say someone learns a proper programming language like Python, creates some spaghetti code (as any beginner will do), and that gets used by the business. How is that any different than a spreadsheet that does the same?

Excel's strength is that people are actually productive enough to start using and make something useful. Whether people make substandard houses with a hammer isn't really the tool's problem. Being too difficult to pick up is.

You might argue there is some quality bar that should be met before a creation is put into production, which is fine, but that's independent of the tool used to make it.


Interactive, undo-able, graphical, sure, but what do you mean by "documented"?


Let's say you're a new programmer and want to do something simple, like take the average of some numbers.

In python you'd search the official docs (right? Why should I google around random forums) and get presented with this:

https://docs.python.org/2/search.html?q=average

Nothing helpful at all. A regular person thinks "take the average" and programmers think "build up a result from a set of primitives, like loops and arrays". Total mental mismatch. Way too low on the abstraction chain, I just want a friggin average.

There is literally no article or tutorial in the python docs explaining how to do this grade school operation. Imagine how frustrated you would be.

In Excel you type average and get inline help in the IDE, an example (you can copy-paste), related functions, etc.

http://imgur.com/ZZGHlH2

In 3 seconds you have contextual help extremely relevant, and you can see syntax errors as you type.

For a new programmer, Python is effectively undocumented.
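For what it's worth, the mismatch shows up in the code itself, not just the docs: the "build it from primitives" version a beginner gets pushed toward, versus the one-liner you'd only find if you already knew the statistics module existed (it was only added in Python 3.4):

```python
# What the docs steer a beginner toward: assemble an average from primitives.
numbers = [3, 5, 7, 9]
total = 0
for n in numbers:
    total += n
average = total / len(numbers)

# What they actually wanted -- if they somehow knew where to look:
import statistics
average_too = statistics.mean(numbers)
```

Excel's AVERAGE sits at the abstraction level the person is already thinking at.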


You're kidding, right?

Excel is awful, here's proof: If I open up the documentation for Windows, there's nothing about average. I hit the Start menu, I type "average(1,4,6,8)", and I get, nothing.

What you like about Excel is that it is an IDE. An IDE that makes simple things simple, and hard things back-asswards (VLOOKUP??? CONCATENATE?? filters)

http://docs.scipy.org/doc/numpy/reference/generated/numpy.me...


I must be missing something. Why should Windows OS (or Mac OS, in my case) handle help for an app? When you're in Excel, you get contextual Excel help, and only Excel help (which is a good thing). Do you want every mention of "average" in your email to show up?

Sure, SciPy can take an average (not Python proper). How is the user supposed to know to visit a non-standard Python module site and search there? How about the next commonly used function, like charting. If I search "bar chart" which Python module site should I magically know to visit?

VLOOKUP (Vertical lookup) isn't the best name, sure, but "lookup" is in the name. It's another keyword to learn, like "foreach".


Excel works for numerical modeling.

Try modeling a multi-actor process. You can represent it, but the result isn't really a model. Better to represent it in a swimlane diagram.


The problem with calling coding "Literacy" is that it implies it's necessary for everything else in life. It's not. Even 50 years from now, technology will evolve to empower those who can't program. (One can argue that the advance in technology has come from it being easier to use, not harder.)

That said, programming serves as a screen for IQ and rigor, much as perhaps a Philosophy degree did in the past. People who can program are used to dealing with right and wrong answers, difficult problems, and working around rules that might seem wrong. It's a great skillset, but by no means mandatory. (Neither is modeling.) This is one reason why CS majors are in demand for a lot of non-CS jobs.


I've thought about this just the other day. Most people here would probably agree that teaching kids to code in school would be pretty cool (never mind the usefulness of that for a second). I think one mistake we all make is to imagine a world in which everybody can code and enjoys writing code as a result. That makes no sense at all. A lot of people would probably hate it, and even more obviously, most people would just forget. I learned Latin in school (don't do that). Equus means horse; that's what I remember.

---

BTW: We need higher-level languages on top of maths (that are not general purpose programming languages) so that people can talk about complex processes and what not.


> I think one mistake we all make is to imagine a world in which everybody can code and enjoys writing code as a result. That makes no sense at all. A lot of people would probably hate it, and even more obviously, most people would just forget.

I thought the idea of teaching kids to code is to show them that it can be immensely rewarding and fun, thus encouraging those that "click" with it to explore it further on their own.

As a guy who gets bored often, I consider myself to be extremely lucky to have discovered a "passion" for computer-y things early on. I'm pretty sure I'd have been bored out of my mind if I had pursued any other discipline.

Coding/Computing/... is the best game there is: we play "Simon^H^H^H^H^H Coder Says" with machines all day long. We'll never completely win it, but the mini-bosses along the way make it totally worth it. I feel like a lot of my classmates would probably get it as well if they were guided into it, but instead "computing class" was basically an MS Word tutorial. Something like Logo (or Sketch, nowadays) would have been much more effective and fun.

If I had to teach high schoolers to code, I'd absolutely approach and present it as a game ("Imagine if you were the benevolent dictator of the universe..."). Because it is. It'll stick with some, it won't with others, and as always - the delivery method will affect the outcome more than the actual message (ever had a $SUBJECT_YOU_DISLIKED teacher that was so awesome that he actually made you perform well in $SUBJECT that year?)

EDIT: forgot this:

> BTW: We need higher-level languages on top of maths (that are not general purpose programming languages) so that people can talk about complex processes and what not.

If it isn't evaluated/compiled, then we'll classify it under the "Maths" nebula anyway. See mathematical logic/boolean algebra, etc.


I agree with every single paragraph.

Showing kids that coding is very rewarding and fun is great, as long as we don't waste the other kids time for no real reason (that's a means-end question and we're clearly biased, which is why I think this is so important to consider).

2D games are a great way to learn about trigonometry and these kind of things, no question about it.

It has to be an interpreted/compiled language, of course, otherwise it wouldn't be useful at all. But that doesn't mean it needs to be Python. I wonder what a modern-day Scheme would look like. Or maybe visual (think node-based) programming would be more suitable.


> Showing kids that coding is very rewarding and fun is great, as long as we don't waste the other kids time for no real reason (that's a means-end question and we're clearly biased, which is why I think this is so important to consider).

But a major percentage of any educational process will waste time for every student. Not the same parts for each student, but invariably there will be some subjects that offer you nothing, while they will trigger something in other kids, e.g. Art. Coding class would hardly be the first to do this, nor do I see any way around it in general.


Reading/writing and comprehension/composition are the same thing. The author is just attributing specific meanings to those terms to draw superficial semantic differences. Reading is just another word for comprehension.

Only a narrow view of reading would limit it to the physical act of moving one's eyes across a page.

Similarly, the word coding is often used to describe a plethora of activities beyond typing if statements and for loops.

The author does have some good points despite this problem, but the entire hypothesis is based on a weak argument of semantics.


Chris is correct that being able to model is a prerequisite for effectively doing a lot of things, coding is one of them. Sort of like saying that schools need to teach 'critical thinking.'

BTW, I especially like the section starting with the Alan Kay quote. I have recently started (yet one more time) hacking in Pharo Smalltalk which is one of the better programming environments for exploring code and understanding it. I am sure that Chris was strongly inspired by Smalltalk when he created Light Table.


I have a BA and a MA in English (don't get your pitchforks out yet) and I'm learning how to code.

I'm not doing this because I want to get a job as a coder. I do well enough for myself with my existing business.

I'm learning to code because I see all these wonderful apps and products and games and experiences, and I feel left out. I have tons of ideas. I just want to see them come alive.

Whatever business potential these ideas have is just added bonus.

At the same time, even though I have been around computers since I was 10 and know enough math, logic, and basic programming to pick up coding at a competitive level, I do recognize that this isn't something "everyone should do".

You need to learn how to write and read because otherwise you can't write a check or read a contract.

I doubt we'll come to a point where you will need to look at the source code to run Word on your computer.

So no, coding really is not the 'new literacy'.

Coding is hard work, and 30 hours of Python will equip you to build nothing and do nothing of note beyond looking at a GitHub repository and wondering "gee, what kind of modules are those? And what's that fancy trick with objects?"

If you are going to do this, you'll have to buckle in and put in the time. Three months of intensive, 40-hour-a-week work is the bare minimum dosage to build a decent web app.

Multiply that by 10x to actually become an engineer and not just a programmer.


There are all sorts of analogies we could use to explore this problem space, but I'd say music is an ideal one. In music, the "material" for consumption is sound and the tools for creation vary wildly, from the intuitive to the complex, but the act of creation and consumption of music is open to us at every level.

One detail that perhaps is overlooked with the popularity of Excel is that it is visual. Not just by using a GUI to control its logic, but that the results of changes are always made apparent. As an Excel beginner you might not know why something is broken, but you can figure out that something you did was wrong (logic errors can creep in, but can be minimised through breaking down concepts to contain fewer points where assumptions are encoded). Knowledge is then shared with other users in order to get Excel to perform as desired.

I'm rambling a little bit here, as I'm thinking out loud, but wouldn't an ideal starting point for a better Excel be Excel with cell types? If you think about it, a type system for Excel cells solves so many of the problems you get from trying to use Excel at scale. I could elaborate if the potential improvements are not obvious.
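To make the cell-types idea concrete, here's a toy Python sketch (entirely hypothetical; real typed cells would live inside the spreadsheet engine) of the kind of mistake a typed cell could catch before it silently spreads through a sheet:

```python
# Purely illustrative: a toy "typed cell" that rejects values of the
# wrong kind, the way a typed Excel might flag =A1+B1 when B1 holds text.
class Cell:
    def __init__(self, cell_type):
        self.cell_type = cell_type
        self.value = None

    def set(self, value):
        if not isinstance(value, self.cell_type):
            raise TypeError(
                f"expected {self.cell_type.__name__}, got {type(value).__name__}")
        self.value = value

a = Cell(float)
a.set(3.5)          # fine: a float in a float-typed cell
try:
    a.set("oops")   # a typed sheet would flag this immediately
except TypeError as e:
    print("caught:", e)
```

The point isn't this particular mechanism, just that declared cell types turn silent logic errors into visible ones at the moment they're introduced.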


I've been teaching programming to non-CS students at Stanford for the past month...they're all clearly smarter than me. But what they lack, as does virtually every non-programmer I've worked with, is the ability to (initially) understand a for-loop.

It's not that they don't get that there is such a thing as a "loop", or that a task can be broken up into iterations...they just don't know what it means to design something that can iterate across a collection and, for each member of that collection, perform a task on it. It's not merely not understanding the syntax, or the overall result...it's not comprehending that you can design and control such a thing.

I've worked with non-programming professionals in which I've taken a repetitive task, such as extracting a bit of text from each page of thousands of pages of documents, and boiled it down to a program that saves them days of work. The effect of such a program is greatly appreciated...but time and time again, these non-programmers are delighted/astounded when I perform the same task in another scenario...what bothers me is all the times when I'm not there to recognize how such a problem can be abstracted, and they dive head first into a meaningless, repetitive chore.
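For what it's worth, a sketch of that kind of program in Python (the filenames and the "Invoice #" pattern here are hypothetical, just stand-ins for whatever bit of text needs extracting):

```python
# Hypothetical sketch: pull an invoice number out of each of thousands of
# text files -- the kind of days-long manual chore a small loop collapses.
import re
from pathlib import Path

def extract_all(folder, pattern=r"Invoice #(\d+)"):
    results = {}
    for path in Path(folder).glob("*.txt"):      # iterate across the collection
        match = re.search(pattern, path.read_text())
        if match:
            results[path.name] = match.group(1)  # perform the task per member
    return results
```

Ten files or ten thousand, the program is the same; that's the abstraction non-programmers so often don't realize is available to them.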

The abstraction of a task, so that it can be performed in a loop...it's so fundamental that once you've done it, it's hard not to think in those terms. And I wish it were the case that very intelligent people could just get it, in the same way that those who've never played an instrument can at least appreciate Mozart. But I just haven't seen it...we programmers take for granted the ability to think in this profound way, and even though it's fairly basic as far as CS education goes, it really is a game changer in all areas of life and work. This is why I think programming deserves consideration as a kind of literacy.

Edit: To focus on something from the OP:

> We don't want a generation of people forced to care about Unicode and UI toolkits. We want a generation of writers, biologists, and accountants that can leverage computers.

Well, sure, if you boil coding down to "Unicode" and front-end web development, then yes, it's not "literacy". In the same way that if I boiled down traditional literacy to iambic pentameter and the debate between prescriptive and descriptive linguistics, I could make a compelling case for not learning to read and write.

Edit 2: I'm willing to consider programming as not a new literacy, but "just" a branch of math...but math practiced in a pragmatic, visceral way, not in the tepid, abstract way that it is forced upon us in primary school. I do think that programming is math, but with programming, you are actively building and testing something using the principles of math. If there's a way to teach that kind of math without programming, I'm game for it.


I'm the only dev (or person with any IT-related responsibilities) at a workplace full of very smart people, and I've noticed this repeatedly. They see a task as being monumental ("We have to do the same task for 3000 people/articles/etc.!") and are surprised when I point out that a computer would take just a second or two to do all of the work. These are for tasks that are obviously repetitive.

This is more the kind of "literacy" that I want: the knowledge to know what is possible, even if the skills of actually accomplishing the task aren't taught. It astounds me how much time people waste because they don't know how to recognize when something is a good candidate for automation.

Of course, part of it could be that some people are afraid of automating themselves out of a job...


> Of course, part of it could be that some people are afraid of automating themselves out of a job...

Actually, the real tragedy is these people don't even know how much of their work is "loopable"...i.e. at risk of being replaced by automation.

The way I think about teaching programming is: the goal is not to become more like a machine, but to recognize what is human, i.e. to understand what is not easily computable. And such comprehension, from my pessimistic view point, is nearly impossible to have without a level of programming literacy...and even many programmers may not stop and think about it (just as many of us who know how to read and write underappreciate the role of literacy and communication).

Edit: I think the idea of automation can be scary -- and for good reason -- but mostly because the people who have the ability to institute automation (or rather, their bosses) may not be able to properly judge the consequences. However, if society as a whole better understood what automation actually is...I'd feel more confident that we could take the long view on it, and use robots and computers to lift us all up, rather than leave the non-lucky parts of society in the dust.


I encounter programmers after they learn loops, but I have noticed that many programmers have trouble with recursion, and with something there's no word for: the ability to imagine oneself on both sides of an API. There are real cognitive hurdles there for many people.

Re loops, I wonder whether we're better off just going directly to collection operations - maps, folds, etc. We're doing a lot of work in the industry to lift ourselves up from mutability. It makes me wonder why then we feel that we have to start there in education.
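For comparison, here's the same computation two ways in Python (the numbers are made up): an explicit mutable loop, then the collection-operation style being suggested:

```python
# Same computation twice: an explicit mutable loop, then the
# map/filter/fold style with no mutation to reason about.
from functools import reduce

prices = [3, 10, 7, 24]

# Mutable-loop version: state changes on every iteration.
total = 0
for p in prices:
    if p > 5:
        total += p * 2

# Collection-operation version: filter, map, then fold the result.
total2 = reduce(lambda acc, x: acc + x,
                map(lambda p: p * 2, filter(lambda p: p > 5, prices)),
                0)

assert total == total2 == 82
```

The second version names the operations directly (keep these, transform each, combine), which is arguably the concept we want learners to internalise, loop or no loop.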


>I encounter programmers after they learn loops, but I have noticed that many programmers have trouble with recursion and something there's no word for, but it's the ability to imagine oneself on both sides of an API.

Do you mean dealing with multiple levels of abstraction? Like thinking of employees as intelligent tools that complete tasks for you, and then thinking of them as bundles of motivations and beliefs that you must aim towards the task that you need done, using the power of incentives and values.


It's not multiple levels of abstraction, it's more like technical empathy - putting yourself in the position of being on the other side of an API.


Perl has "unless" as a keyword, and a lot of people who are not inexperienced with programming have trouble using it at first. Some even rebel against its use and claim they hate it.

So it is interesting that even similar-seeming things can still require some time to internalise and comprehend.


Am I missing something about this `unless` keyword? From what it looks like, it's just syntactic sugar for `if not`, and it makes much more sense…

    (unless (string= a b)
      (fail))
    ;; much more readable than
    (when (string/= a b)
      (fail))


I notice the same thing with Ruby's 'unless'. I think it has to do with the mapping of English to programming. Sort of like the fact that 'or' can be inclusive or exclusive.


> If there's a way to teach that kind of math without programming, I'm game for it.

That's largely what pure math or abstract math is. This isn't really encountered unless someone is getting a degree in math.


While the overall tone was a bit antagonistic for me, I tend to agree with the premise of the title of this article. As someone who runs coding training programs for adults, I don't think you can teach someone coding in a matter of weeks, any more than you can teach them Spanish in a matter of weeks. However, if you're going to make coding your career, a course that covers the basics and puts you on a good footing, especially if linked with work experience, is the best way to embark on the learning and exploration that the author discusses.

Nevertheless, I don't see why we couldn't put, say, HTML in the same category as Excel: why shouldn't everyone in the world be able to make a webpage? We teach reading and writing not to make everyone a journalist or a novelist, but so that we can express our thoughts and communicate our ideas. HTML (and even CSS) is basic enough for any professional to pick up, and an empowering skill to learn, freeing you from relying on third-party means to publish your own information online.

One last thing - in the UK, Excel is one of the skills listed under I.T. apprenticeships, if you can find a company who offers that.


This article is deeply wrong. Literacy and universal schooling got masses of people out of poverty because they were able to get more "advanced" jobs. Most of them didn't have composition skills, etc., yet we had rows of typists in every office for the better part of the 20th century (remember, that job required just very basic skills centred around literacy).

Writing was an art for a long stretch of human history, but it's not anymore. The same is going to be true for programming.

Nowadays programmers like to think of themselves as special, but in reality the ability to write simple programs to process input and give output will be just as normal a skill in the future as writing a report is now.

Programmers will only be as special as book writers are nowadays - just a little better in putting words-code together so their work has bigger impact.


Programming and literacy are fundamentally different.

Programming is about building machines. It doesn't have to include modelling, although some of the most useful machines do. (How much modelling was needed for the HN code?)

Reading and writing are about sharing experiences, learning from them, making moral judgements about them, and persuading, or being persuaded by, fellow humans.

Only people who don't understand either believe they're equivalent.

At best, programming makes sense if you teach it as a practical add-on to math and strict logic. It has its uses there, although I'm sure that by the time one of today's five-year-olds is old enough to enter the job market, knowing school-level Python and jQuery is going to be as useful as knowing BASIC.

Programming teaches you nothing about verbal or written comprehension, allusion, metaphor, political and social history, or relationships of all kinds.

Learning to read and write is the only way to start exploring that world. (Video-making and photography give a taste of it, but they're still distanced from it.)

Programming - which is carpentry and architecture for binary machines - lives somewhere else entirely.


I think the title was perhaps a misnomer, and it's distracting from the message. The author is arguing that we are misframing the literacy involved in programming, and that the real literacy is modeling. He's specifically comparing 'writing', the act of putting words on paper, to 'coding', the act of typing special words into a computer. So we're in agreement that there is an important literacy here to teach the masses.

Importantly, the author highlights 'composition' and 'comprehension' as the real literacies behind the mechanical front of 'reading' and 'writing'. He argues the parallel is 'modeling' as the real underlying literacy here, and 'coding' as the mechanical facade.

The article focuses on the parts that will be challenging for a long time: tackling the problem, getting ideas out of your head. This is much preferable to 'cargo cult' teaching of language syntax and keywords alone; although those are learned incidentally in the process, they aren't the goal.

Paul Lockhart's 'A Mathematician's Lament' shares many similarities in the field of math, and is also, I think, applicable to software engineering. [PDF] https://www.maa.org/external_archive/devlin/LockhartsLament....


> Writing was art for a long part of human history but it's not anymore. Same is going to be true for programming.

While I want to believe this is true, at least here in the US, I have trouble accepting it. Primarily because so few people you talk to on a day-to-day basis outside of tech are even capable of understanding the most basic concepts. I've even met plenty of people within tech that can't. Introducing people to programming at an earlier age may help sure but we do that with writing and literacy and yet the list of people generally considered "truly great" at writing is still a very small percentage of the population. The big difference being that when I open my old, 1950's typewriter, and decide to type up a little short story that would probably make Stephen King weep for his art, a billion dollar company doesn't crash, a few million credit card numbers don't get leaked, or any of the other countless things that go terribly wrong when programmers screw up.

I really do hope we get past the point where stuff like that is a regular occurrence, where computers and networks really are secure, and programming is taught at a young age. I just have trouble seeing any part of it becoming a reality any time soon.


> Nowadays programmers like to think about themselves as special, but in reality the ability to write simple programs to process input and give output will be just as a normal skill in the future as writing a report is now.

Where input and output respectively can have inordinate amounts of meanings and manifestations.

Now, sure, someone writing a quick 4-line script using a hugely abstracted Python library to obtain a value from some document format will probably be a normal skill.

Writing just about anything substantial requires large amounts of domain-specific knowledge, specialized skillsets and lots of patience.

It's something that actually requires an individual to have legitimate interest in their field - the same way a writer would probably be far more invested in linguistics than a layman, and make it an aspect of their expertise and life skills.

In contrast, the average person would use code as a means to an end, likely under the confines of a system that makes the act of writing code be as separated from the workings of the machine as much as possible. Someone parsing CSV doesn't mean they can now write a distributed file system, for instance.

The fact of the matter is that we simply do not have the tooling to make programming as easy as 1-2-3. Comparing it to writing isn't even adequate. The analogy would only make sense if the average person needed to know dozens of grammars (protocols) to express themselves in language, among other constraints. They don't. There's a world of difference between practical computing and some sugary ideal of everyone coding like it's their second nature.

There's nothing wrong with people picking up programming if they need it. I'm far more content with letting people discover it themselves rather than diluting it by making it a compulsory school subject.

> Programmers will only be as special as book writers are nowadays - just a little better in putting words-code together so their work has bigger impact.

I'd wager this devaluing mindset of computation as an end goal where you simply piece together this magical substance known as "code" (or rockstar/ninja it, whatever) is a major reason why our software is so entangled, brittle and unable to communicate as it is today. An end result being that it makes programming more and more esoteric and inaccessible - something the "learn to code" evangelists should be concerned about.


It's the difference between a code monkey and an engineer.


Besides the title being pedantic, I feel this article is alienated from the average person I know, who is constantly modeling and dealing with symbols and representations in the world, but less often takes a step back to examine the practice of their modeling and judge its efficacy.

To me, modeling seems equivalent to consciousness. To me cave paintings of animals seem significant because they illustrate 41k yo humans creating representations of their world (for mental practice, entertainment, ceremony, who knows...), whether they realized their practice is another question.

But today, there are many people who model and recognize it, think of manipulative salespeople/lawyers/cads, great fiction writers, constipated philosophers and art/film/lit critics, scientists, artists/architects/designers...

You don't __need__ any technology built after 1980 (or whatever) to explore the world... Even thought experiments can shake an academic field (Maxwell's demon, Schrodinger's cat, EPR paradox...)

I agree that there may be better tools to convince people to contemplate environments and unpack/model/break down confusing things into collections of simple theories than to send them to a liberal arts college (where this and goofing off is really all there is to do in my experience)...

But programming is still the new literacy IMO. I feel like all the white-collar, underpaid people I know could work 2-4 hours a day of their 10-hour-a-day jobs if they wrote the most limited, banal, imperative script, one that abstracts no concepts and is completely un-generalized (and therefore fails at modelling). And I feel that just this fact, that there would be a great liberation of first-world mankind's time if every non-programmer who uses a computer for their job up and wrote a severely shitty program that automated some aspect of their responsibilities, is enough to say that coding is the new literacy.

Think of what kind of intellectual shift a machine-literate public that just cut their work week in half would result in...
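For what it's worth, the kind of "severely shitty" script I mean might be nothing more than this (a hypothetical Python sketch; the filename and template text are made up):

```python
# Deliberately banal, hard-coded, and un-generalized -- it abstracts
# nothing, models nothing, and still saves someone an hour every week.
# Hypothetical example: stamp today's date into a status-report template.
import datetime

TEMPLATE = "Weekly status for {date}:\n- TODO: fill in\n"

def make_report():
    today = datetime.date.today().isoformat()
    text = TEMPLATE.format(date=today)
    with open(f"status-{today}.txt", "w") as f:
        f.write(text)
    return text

make_report()
```

No loops worth mentioning, no design, and yet it's exactly the sort of thing that would cut dead hours out of a white-collar week.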


For modelling to work generally (in most or all domains), you need some underlying paradigm all these models can be translated into.

For Excel, it's grid-based dataflow (for example; you could call it something else).

I saw you tried dataflow already but stepped back from it.

So what paradigm are you trying to use for Eve?


Let's assume reading and writing relate to literacy as suggested at the start of the article. I think that's not too contentious. Given that, there's an odd assertion in this article around Excel.

To quote the article: "We create models by crafting them out of material. Sometimes our material is wood, metal, or plastic. Other times it's data or information from our senses. Either way, we start our models with a medium that we mold."

With literacy, that material is words.

When we start "modelling" with the written word, we first have to be able to understand a vocabulary, sentence structure, and ultimately a paragraph and narrative structure (beginning, middle, end). Normally this happens in the spoken word way before we need to express this through writing. We are also exposed to many, many, examples of paragraph and sentence construction in the written word before we are expected to write anything more than simple sentences. We first start writing by learning to express our vocabulary (spelling), with simple grammar, then putting our ideas together into sentences, paragraphs, and with narratives, and refine this modelling skill over a lifetime.

With programming, that material is a programming language. When teaching, the ideas of programming should ideally be discussed first - loops, conditions, functions, variables, etc. Then move on to reading code, a function at a time, long before a line of code is written. Something like Scratch seems highly valuable for this, much in the same way that children have always had games where they are expected to put words into a sentence in the correct order, usually before they are expected to write the sentence themselves.
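To illustrate, the vocabulary being described (a variable, a condition, a loop, a function) might first be met in reading a snippet like this Python sketch, before ever writing one:

```python
# The basic vocabulary of the medium, all in one small function:
# a function, a variable, a loop, and a condition.
def count_long_words(words, minimum=5):   # function
    count = 0                             # variable
    for word in words:                    # loop
        if len(word) >= minimum:          # condition
            count += 1
    return count

print(count_long_words(["cat", "rhinoceros", "model"]))  # -> 2
```

Reading and predicting the output of something this small is the analogue of word-ordering games: comprehension before composition.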

For me the real disconnect, is that the material of modelling, the medium, for programming, is a programming language and code. Excel is NOT the medium of modelling for programming. Excel is a medium for modelling Excel. The constraints in Excel help people understand... Excel. It won't really help you understand data modelling, because the modelling you do is free-form. The constraints of Excel are unique to Excel.


You could call this article 'UML editor vs. text editor'. What it may point out is the lack of modeling tools. UML is old-fashioned and Java/enterprisey, and newer tools like OmniGraffle aren't specifically targeted at modeling domains, whereas text editors are evolved, bright, and shiny.

On another level, I agree that what's truly fascinating about programming is not mastering the quirks of a specific language but translating an external model into rules expressed in a language with its own internal logic. Not just to produce a static interpretation, as painting or literature would, but a dynamic one that evolves with user and data interaction.

This is really unique in human history.


My advice has always been: if you're not willing to commit three years full time to coding, don't do it. It just takes that long to be useful on a team. Instead, focus on whatever area you want to excel at and leave the engineering to the engineers.


In the late 1970s and early 1980s, domain-specific programming languages sprang up: languages for simulating economic systems, for a company's financials, for modeling biological systems, and many more.

Now, in the twenty-teens, functional programming is gaining new traction as developers learn that, at its core, their work starts with defining a domain-specific language.

The tools for creating domain-specific models are orders of magnitude too abstruse, complex, arcane, and difficult. If Eve can make creating and playing with models even one order of magnitude more accessible, it will help shift much of programming up a layer of abstraction.


I seem to remember some project (I think it might be a Microsoft Research project led by someone famous) which has been going on for 10 years or something like that which has the ambitious goal of doing something similar to what I understand Granger to be doing, namely some kind of non-scripting programming. Taking blocks of knowledge or computation or whatever and arranging them in way intuitive to non-programmers. Anyone know the project I'm talking about? FWIW, I think it's been mocked at various times for being vaporware after even 10 years or whatever.


Good points from Chris, but of course this depends on where one is. Growing up in a country that has shocking levels of illiteracy, I assure you that being literate, even in its most basic form, is already a tremendous step forward.

If you see it in this light, the goal of the movement is clearer. The movement is seeking that crucial, qualitative initial step and the rest is depending on the person, the social env, and so on.


The author spends a considerable portion of the piece attacking a straw man. As he mentions in the article, people who stress the importance of reading and writing skills aren't merely advocating for the ability to transcribe speech to paper and pronounce written text; they are also advocating for the communication skills that are taught as part of a good literacy education. Similarly, people who stress the importance of programming are not merely advocating that people gain the ability to transcribe a perfectly detailed spec into executable code, they're also advocating for the modeling skills that give you the skills to write the spec.

His real point is that he believes teaching people modeling and computing skills using Excel would be more useful than teaching them modeling and computing skills using Python.

Depending on the audience and the amount of time they have to dedicate to the topic, I might agree with him. Many undergrad business programs, for instance, require a course in Excel/VBA, which seems sensible. However, I think many people who say things like "computing is the new literacy" think that computing and modeling might be important enough skills that most people should dedicate more than the equivalent of one semester in college to them. They might instead suggest three semesters, or maybe even that high school students take a computing course every year. And if he wants to replace all THAT with Excel, I think I would argue that you've got enough time to add in some Python too.


Programming should be seen as a medium in which one can solve a problem. I agree that people have been approaching the idea of "teaching code" wrongly. At the end of the day, aren't the professionals trying to model their problems and then apply their knowledge to make sense of these immensely difficult problems?


Granger makes a strawman of the coding-as-literacy concept by focusing on mechanical coding (as in "coding technician": someone who receives the requirements for a subroutine and implements it without any imagination).

Someone exploring topics in computing in an interactive environment isn't doing that.


I actually mention the magic of LISP and Smalltalk environments; they do greatly aid the exploratory creation process. But my "coding as literacy" strawman is borne out by the industry: http://code.org/ and http://www.codecademy.com/ are great examples. By saying you can learn to code by following some online lessons that teach you how to write some Javascript, we're doing people a great disservice.

Moreover, my real argument is that I don't think coding should be literacy. From the fundamental disconnect section:

> We don't want a generation of people forced to care about Unicode and UI toolkits. We want a generation of writers, biologists, and accountants that can leverage computers.

I don't think people should have to worry about exactly what the computer is doing if their problem isn't related to it. Specifically teaching general purpose programming as we think of it now is largely teaching people to build apps or command line tools. Instead of a generation that thinks of that as computing, we need a generation of people curing cancer with it.


Chris, as someone who respects you and your work, I'm interested to know how you would describe the "movement" aspect of coding-as-literacy. That's where I began to smell a strawman (even if not intentional).

Follow the money: what is coding-as-literacy but a government-supported effort to bring down the price of software developers, disguised as a public service campaign? That in itself is a strawman, yet I suspect that in talking about a "movement" we've taken for granted that leaders within the field (such as Alan Kay), and not just celebrities are advocating for this. Do you know of any?


There is a strong 'movement' towards pushing this concept for sure. See code.org pushed by Zuckerberg, and even used and promoted by the President: http://www.whitehouse.gov/blog/2014/12/10/president-obama-fi...


Coding is not the new literacy; it is the new masonry, and we will all need a place to live eventually.


Being able to read and write is the minimum skill you have to have to be called literate. I think that the ability to read and write code is the minimum skill needed to truly benefit from our new information society.


What are some good resources (i.e. books) to read on software modeling?


I really hope this isn't going to devolve into an argument between top-down and bottom-up methods.


No, Vim is the best.


most people are not smart enough to be even mediocre coders



