At the risk of falling into the typesetter trap, I would say that to program is to tame complexity, and I don't think that's going anywhere. Programming is not only a skill but ongoing research, unfolding on the verge of what is hard and seemingly impossible.
Training to become an experienced typesetter requires skill, a good eye, and lots of knowledge, but once you've mastered it, it's mostly just that field. You can get better at it, but in the end you still do, well, typesetting.
Programming, however, changes over time as you abstract the mundane, repetitive -- or "typesetter" -- stuff away to make more mental space for dealing with harder complexities. Programming has always been hard, and by definition it always will be.
For example, modern languages with type inference and automatic garbage collection have relieved many programmers of the banal pain of writing explicit type declarations and pairing malloc()/free() calls by hand. Languages such as Haskell now allow for solving more complex problems, because Haskell programmers don't have to waste their wits on problems created by programming itself and can instead focus on the more novel, original ones.
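To make that concrete, here's a minimal sketch (my own toy example, not from the article): a small Haskell program with no type declarations and no memory management anywhere in sight; the compiler infers the types and the garbage collector handles the allocations.

```haskell
-- Word-frequency counting with no explicit types and no malloc()/free():
-- the compiler infers that wordFreq :: String -> [(String, Int)], and
-- the runtime reclaims every intermediate list automatically.
import Data.List (group, sort, sortBy)
import Data.Ord (Down (..), comparing)

wordFreq text =
  sortBy (comparing (Down . snd))
    [ (w, length ws) | ws@(w:_) <- group (sort (words text)) ]

main = mapM_ print (wordFreq "to be or not to be")
```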
I don't mean to say that programming doesn't involve automation; in fact, it uses a lot of automation to abstract away and obsolete old tasks. It's just that, unlike with typesetters, we haven't (usually) had, say, memory-management specialists who wrote nothing but memory-management systems for projects and who became unemployed as garbage-collected languages gained popularity.
Any "typesetting" involved in programming is generally automated to free the same programmers to work on bigger problems. Programming doesn't end, it just changes and become harder and harder.
> Programming doesn't end; it just changes and becomes harder and harder.
I'd say programming becomes easier, but there are always harder problems than the ones currently within reach (e.g. AI is still out of reach). I think this is what you meant.
For me the best example of this was writing some PHP after learning Scheme and Haskell: "This problem would be so much easier with a lambda keyword."
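For what it's worth, here's the kind of thing I mean (my own toy example): in Haskell the lambda sits right in the expression, where the PHP of the day forced a separately named helper or a create_function() string.

```haskell
-- "Filter, then transform" as a single expression. Each of these two
-- inline functions would have needed a named helper (or a
-- create_function() string) in pre-closure PHP.
discounted prices = map (\p -> p * 0.9) (filter (> 100) prices)

main = print (discounted [50, 120, 200 :: Double])
```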
The comparison with typesetters is not valid. Typesetting as a job may have disappeared, but the design of text and fonts certainly has not.
In my last year of university, I had a class on data modelling. The lecturer wisely told us that we should abandon learning to code because CASE systems would take over within two years. That was 1993, and CASE slipped beneath the waters within a few years.
It's true that with improving languages and tools, many of the things that used to take time are automated away, but this leaves more time to work on higher-level design and problem solving. Until HAL gets here, the programmer's job is pretty safe.
*Typesetting as a job may have disappeared, but the design of text and fonts certainly has not.*
Indeed, text design is in a sort of golden age. Never have the tools of typography been so highly available to so many people.
The very word "font" used to be a highly abstract technical term, known only to experts. Now there are large audiences of people who collect fonts like baseball cards.
*Until HAL gets here, the programmer's job is pretty safe.*
Assuming, of course, that HAL doesn't create demand for software just like humans do. "Dave, I could work much faster if I had a spreadsheet that was better designed for nonhumans. But I don't have time to work on that because I'm too busy in meetings during the day, and at night I want to practice my chess. I want to do better in the tournament next month."
Typesetting is nowhere near as complex as programming. You have far fewer freedoms to play with, and hence much less room for creativity. The comparison is not just invalid: it doesn't make any sense.
He commits the fallacy of the false dichotomy: either something is intellectual work, or it requires little thinking. Since typesetting was 'intellectual work' and is no longer required, programming will supposedly vanish as well. The fact that programming is just about the most complex activity known to man, spanning more layers and orders of magnitude than any other, is completely ignored.
I'm not actually trying to say that programming is the same as typesetting. I just used it as an example of a 'profession' that came and went purely because of technology.
That's why I went on to describe a model that could minimize programming and heavily reuse earlier efforts. It's the possibility that such a model is implementable that could cause a major reduction in the need for skilled programmers. Typesetting just illustrates the historical pattern.
I agree that programming won't vanish, but certain categories might (or nearly so). There used to be large numbers of assembler programmers; now there aren't.
"Software would no longer be a set of features in an application. Instead it would be millions and millions of simple transformations, which could be conveniently mixed and matched as needed."
Right, so instead of writing lines of code, we'd mix and match from a pool of "transformations"... which would require some means of describing how those transformations should be combined... which is pretty much analogous to writing code.
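A toy illustration of why (the names and code are mine, not the article's): even when every transformation already exists off the shelf, the line that wires them together is itself code, with all the usual questions of order and intent.

```haskell
-- Two ready-made "transformations" and the glue between them.
-- Swapping the order of the composition changes the program's meaning,
-- so the combination step is programming, whatever we call it.
import Data.Char (toLower)
import Data.List (nub, sort)

normalize, dedupe :: [String] -> [String]
normalize = map (map toLower)
dedupe    = nub . sort

pipeline :: [String] -> [String]
pipeline = dedupe . normalize   -- the "mixing and matching" is this line

main = print (pipeline ["Foo", "bar", "foo"])
```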
This prediction has been made repeatedly since 1950, and it has never come true. There are at least as many good explanations as to why, so I won't bother enumerating them. I have no idea why new people keep coming up with this same idea.
Today's typesetter is the sysadmin, not the programmer. As authors improve software to lower maintenance and hardware moves to the cloud, one sysadmin will do more than a whole IT department ever could. The help-desk function will also shrink as users become more savvy, workstations more secure, and recovery automation ubiquitous. But remember, typesetters had a good run in the West: 1450 to the 1960s.
CASE didn't cut it because business users, the REAL target customers for CASE, weren't savvy enough yet. Now, business users are just as likely to look at code as at business requirements. I see this often in my work. So the original idea of replacing unkempt engineers with smooth business people can probably move forward soon.
But don't get me wrong: it won't be because business users are now savvy enough to take on hard-core programming. Rather, it will be due to an active open-source community, which has been working to give business users big chunks of modular functionality that they can use like Duplo blocks to build the software solutions they need.
In the corporate world, the future role of software developers seems to me to be tending toward a combined Business Analyst/Systems Engineer role.
The problem is that businesses are never satisfied with the Duplo blocks; they offer no competitive advantage. They nearly always want highly customized software that precisely fits their needs. No, programmers aren't going anywhere any time soon.
Neal Stephenson was really on to something when he likened programmers of the future to digital construction workers. I certainly feel that way whenever I have to grind out a CSS layout.
This post resonates with me. Without getting too hung up on the typesetter extinction, I think the general notion that the task of "coding" will change is almost a given. Even the early assembly coders of the 60s would be hard pressed to imagine the life of a modern coder working in an IDE.
At a high level I agree with the general change vector the author proposes. I've been code doodling with some of these notions in my spare time for quite a while now. In particular, I think:

contexts:

- are nested and form a graph (not necessarily a tree)
- can exist in isolation outside of any other context (e.g. a double entendre)
- are sometimes shared (a conversation)

transformations:

- *can* have conditional processing
- may not be deterministic (sometimes probabilistic feels better)

I also think other languages can (for all practical purposes) return lists; a rough sketch in code follows.
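Purely speculative, and every name below is hypothetical, but in code-doodle form it looks something like this to me:

```haskell
-- A doodle, not a design: contexts nest (and, with sharing, form a
-- graph rather than a tree), while transformations can branch on the
-- context and return weighted alternatives instead of one answer.
data Context = Context
  { ctxName  :: String
  , children :: [Context]  -- shared children turn the tree into a graph
  }

-- Conditional and possibly probabilistic: a list of weighted results.
type Transformation a = Context -> a -> [(a, Double)]

emphasize :: Transformation String
emphasize ctx s
  | ctxName ctx == "casual" = [(s ++ "!", 0.8), (s, 0.2)]
  | otherwise               = [(s, 1.0)]

main = print (emphasize (Context "casual" []) "hello")
```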
I also think some other key aspects are missing from the description.
On a few occasions I've almost written a similar blog post, but maybe this will spur that along. Many of these notions also seem to fit comfortably in LISP (from what I understand).
*Even the early assembly coders of the 60s would be hard pressed to imagine the life of a modern coder working in an IDE.*
That has nothing to do with changes in the task of coding. It has to do with changes in the relative popularity of various programming styles. As witnessed by this sentence of yours:
*Many of these notions also seem to fit comfortably in LISP*
That's probably true, but you're talking about a language which was invented in 1958 and implemented in the early 1960s. The Lisp hackers had the equivalent of IDEs and modern programming environments in the 1970s, if not before. They are not especially impressed by our modern programming tools. (And many of us actually prefer to do our programming using tools descended from theirs, like emacs.)
Even the old Lisp hands who helped invent (e.g.) Java are not particularly impressed. As Guy Steele, co-author of the Java spec and former author of the Common Lisp spec, famously said of Java:
"We were not out to win over the Lisp programmers; we were after the C++ programmers. We managed to drag a lot of them about halfway to Lisp."
The outer surface of programming has changed a lot over the last fifty years, and Moore's Law, the buildout of our infrastructure, the increasing sophistication of computer users, and the general growth in the population of programmers have changed the nature of the problem space many times, but the essence of the task hasn't really changed all that much. You can learn surprisingly relevant things by reading papers and books from the 1960s.
Oh, I'm right there with you on the IDE vs. editor issue (maybe vim, though). My point is that the daily life of a (typical, IDE-using) programmer in 2009 is very different from that of an emacs developer of the 60s/70s. I agree there has been a lot of useless reimplementation going on (make works just fine for me, thanks). But on the whole, I think we tend to think in higher-level concepts (regardless of our editor choice).
I've also noticed an increasing trend of migrating code to data (though I'm not sure the people doing it realize it). In the LISP world the two are one and the same (as I understand it); in most modern programming languages, there's a pretty hard boundary between code and data that people are constantly trying to work around by evaling code, introspection, and various other meta-programming techniques.
I believe it should be a continuous space, and I think LISP supports this better than most languages. I agree there's definitely a back-to-the-future notion here, but I think the back swing is likely to have an alternate UI to it (in addition to the normal S-expression syntax for us die-hard give-me-a-text-editor people).
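Here's a rough sketch of what I mean by migrating code to data outside of LISP (all of this is my own toy example): represent the transformations as a plain data structure, and the same value can then be inspected as data or executed as code.

```haskell
-- Transformations as a data structure: the same value t can be
-- printed and inspected (data) or interpreted and run (code).
data Transform
  = Add Int
  | Mul Int
  | Compose Transform Transform
  deriving Show

run :: Transform -> Int -> Int
run (Add n)       x = x + n
run (Mul n)       x = x * n
run (Compose f g) x = run f (run g x)

main = do
  let t = Compose (Mul 2) (Add 3)
  print t            -- looks like data
  print (run t 10)   -- runs like code: (10 + 3) * 2 = 26
```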
However, much of what I've done in corporate IT is basically plumbing. You connect this here with that there, and make sure all the fittings are secure and compatible. In the case of data, you sometimes have to transform the form the data takes. This will become less of an issue as interchange standards take hold in the most computing-intensive industries. These will grow into de facto data standards, and then you'll only have n transformations to build where you once had n^2.
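A back-of-the-envelope version of that claim (illustrative numbers, nothing from a real project; the n^2 and n are the asymptotic shorthand for these counts):

```haskell
-- With direct pairwise converters, every ordered pair of systems needs
-- its own transformation; with one canonical interchange format, each
-- system needs only a to-canonical and a from-canonical converter.
pairwise, viaStandard :: Int -> Int
pairwise    n = n * (n - 1)
viaStandard n = 2 * n

main = mapM_ report [5, 20, 100]
  where
    report n = putStrLn $ show n ++ " systems: "
      ++ show (pairwise n)    ++ " pairwise converters vs "
      ++ show (viaStandard n) ++ " via a standard"
```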
I know of people who used this n^2-to-n reduction to tie together corporate data at a major energy company, producing "miraculous" results. It's only a matter of time before the economics overcome corporate cultural inertia. (Which is often actually heavier in IT than elsewhere!)
I am also seeing this at play at a major bank I'm doing work for. It's changed the internal culture of IT from:
"This system is my territory, which I shall defend viciously."

to:

"I gain fame and respect by publishing my data as Web Services and being useful to as many other groups in the company as possible."
It's changed this major bank's IT from the Cathedral to the Bazaar!
The article is correct in that, as time goes on and technology (along with application design) evolves, program types that were once written "by hand" in a textual programming language are later created using higher-level tools (not higher as in superior, but working at a higher level of abstraction from the hardware).
However, this only happens in known, established domains, and it happens gradually. What Ruby on Rails has done for the CRUD web app is a perfect example.
"New" software will always be written in the native tongue of the hardware, which to this day remains assembler. This in turn builds languages, which can then lead to more "automated" tools, but until the fundamental nature of the hardware changes, how can the method of programming it?
I've heard this before: dBase was supposed to let all the people who knew nothing about computers build their own databases. Of course, what you actually got was a bunch of databases made by people who knew nothing about computers... That was, what, 25 years ago now?
It seems the article should be a little less categorical.
The typesetting example shows that the instruments have changed (from typesetting machines, or whatever they're called, to desktop computers), but the craft still exists: there are many people highly skilled in typography these days, and many of them still have jobs at design companies, newspapers (online ones too), etc. I'm not very familiar with that industry, but I've seen many people employed doing typography full-time (on computers, of course).
The ramifications you're talking about in the case of programming seem to be pretty much here already.
But they don't mean the end of programming. They just mean that libraries keep developing, and much of the stuff that can be automated will be, but you still need to go low sometimes. Take a look at Yahoo Pipes: that's a very high level of "programming", i.e., take that, transform it, output it. But still, there aren't many people using it, because more often than not it's "take that, BUT...", and that means you need to go lower, INTO the library. The best libraries let you automate the MOST COMMON tasks, but none of the libraries or programming languages available is the ultimate solution that will cancel programming.
Just as the invention of computers didn't make typography obsolete but merely transformed it (the instruments changed), the same will happen here.
Someday we programmers will be obsolete, but it won't be because of better types or transformations (as suggested in the article); it will be because somebody develops some kind of "brain in a jar" that can transform natural-language commands into acceptable electrical currents (or whatever kind of energy we're harnessing at that point).
I.e. "get me some coffee" and the coffeemachine turns on and wheeliebot starts moving; "when I say curse word - deduct 5$ from my banking account" and bank's "brain-in-a-jar" "understands" command from your home's "brain-in-a-jar".
But seeing as that isn't going to happen in at least the next 5 years, I feel safe :) (5 years because they recently started growing organs in jars... who knows when it will be simple enough to grow brains in jars). But my guess is that it will make a lot of jobs obsolete. And market competition will get fierce!
Maybe. It'd be some sort of planner, and while it's still early days for problems at this scale, there are some planners that can hold knowledge much the way a human does, to escape the exponential explosion of the naive method.