Roots of 'Program' Revisited (acm.org)
24 points by rbanffy on May 27, 2021 | hide | past | favorite | 17 comments



It never occurred to me that the "pro" in "program" referred to the future!

So, if "program" (πρόγραμμα) means "pre-written", like a royal edict that is posted publicly before it is to be obeyed, should we rename the field of "programming by demonstration" (for example, editor keyboard macros, spreadsheet formulas, or GeoGebra constructions) to "postgramming" or "epigramming"? (I suppose "epigram" already has a conflicting meaning.)

Because, in programming by example, the "program" is a log of the sequence of operations that were carried out, and perhaps why, so that we can repeat them, rather than being a prospective future plan for a sequence of operations to carry out in the future.


And then should "live programming", where the program is running while edited, become "gramming"?


Oh hi! What a pleasant surprise to see you on this thread :)

I guess what I was thinking was that when you're doing that, you're largely "postgramming". Some of the recent versions of Subtext that cater to test-first development are more "programming" in the etymological sense, but the "program" in that case is what we'd normally call a test suite—although other recent versions of Subtext turn the execution trace, complete with recorded results, into the test suite, which is more a question of "postgramming".


Of course computers, computer programs, and computer science all evolved from previous machines, techniques, and concepts. But that's almost meaningless, a truism: (almost?) everything can be traced back historically to something else.

The abstractions that underlie computers, computer programs, and computer science were discovered by analyzing and playing with concrete objects that had a historical lineage, but the abstraction exists independent of the objects. It is the abstraction that's important, not the objects.

When we talk about the first computer and the first program we are talking about the first physical realization of an abstract idea.


I read the "Why This Matters" section twice, but I still don't get why the etymology of the word "program" rather than the history of the concept of a "program" is of any importance. Does somebody think they understood the argument, and can explain it in a less convoluted manner?


The idea is that the etymology points to a previously unrecognized bit of history. The word comes from some parts of engineering that aren't usually thought of as programming history.

It's not that we should study that instead of the concept of program, but that it adds a new thing to study in addition.

Is that actually important? Only in the sense that history is important at all. They trot out the "doomed to repeat it" cliché, which is true enough if a little scattershot. I mean, sure, all history is good history to study and we never really know what's going to inspire somebody to think in a particular helpful way.

So rather than try to justify it as "No, seriously, this is important", I'd say that it's better as "Hey, here's this cool thing we just realized." Which is probably more like finding an easter egg than anything of really cosmic importance.


I am very confused by this article.

When we use the word 'program' in computing, we pretty much always mean 'a program for a (previously or implicitly defined) computational model'. I don't see in what sense this is comparable to 'program clocks', which are basically timers/alarm clocks, if I understand that paragraph correctly.

Similarly, when we say 'code' we mean 'textual representation of the concrete syntax', clearly distinct from 'code' as in 'morse code', which we would also call 'encoding'.

> This strengthens a computing discipline where one often cares more about formalism than about actual programming

This is (perhaps) a (somewhat) fair criticism of (parts of) computer science as an academic field of study, but I don't get how the pieces fit together.


> Similarly, when we say 'code' we mean 'textual representation of the concrete syntax', clearly distinct from 'code' as in 'morse code', which we would also call 'encoding'.

You're thinking in modern terms. When you had to program in octal, "encoding" was exactly what you were doing. Assemblers and, later, compilers made that less true, but originally that was in fact the meaning.
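As a hedged illustration of what that hand-encoding looked like (the opcode values and word layout here are invented for the example, not taken from any real machine): the "coder" translated each operation into a numeric code before keying it in.

```python
# Invented instruction format: a 3-bit opcode and a 6-bit address
# packed into one word, written in octal by hand.
OPCODES = {"LOAD": 0o01, "ADD": 0o02, "STORE": 0o03}

def encode(mnemonic, address):
    """Pack the (invented) opcode and address into a single octal word."""
    return (OPCODES[mnemonic] << 6) | address

# A three-instruction "program", as the coder would have written it out.
program = [("LOAD", 0o10), ("ADD", 0o11), ("STORE", 0o12)]
for op, addr in program:
    print(oct(encode(op, addr)))
```

In that world, "coding" really was encoding: the same word-for-symbol substitution one does with Morse code, just aimed at a machine's instruction set.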


> When we use the word 'program' in computing, we pretty much always mean 'a program for a (previously or implicitly defined) computational model', I don't see in what sense this is comparable to 'program clocks', which are basically timers/alarm clocks, if I understand that paragraph correctly.

It seems that a "program clock" was a clock that you could, more or less in the modern sense, program; you could put a plan into it, and then it would carry out the plan. Like those electromechanical outlet timers that you can set to turn on your floor lamps at certain times of day when you're not home. The "program" for those is the little slidy things around the dial that define when to turn the light on or off.

In the same way, a computer program is a plan that you can put into a computer, thus programming the computer to carry out the program. That concept isn't just comparable to that of the program you put into a program clock; it's identical, though there are some things the program clock can't do.
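To make the identity concrete, here is a minimal sketch (the data layout is made up for illustration) of an outlet timer's "program" as pure data that a trivial machine carries out:

```python
# The "slidy things" around the dial, as (hour, action) pairs.
timer_program = [(7, "on"), (9, "off"), (18, "on"), (23, "off")]

def lamp_state(hour, program):
    """Replay the plan up to the given hour to find the lamp's state."""
    state = "off"
    for set_hour, action in sorted(program):
        if set_hour <= hour:
            state = action
    return state

print(lamp_state(8, timer_program))   # between the 7:00 "on" and 9:00 "off"
print(lamp_state(12, timer_program))  # after the 9:00 "off"
```

The plan is written down in advance and then executed mechanically, which is exactly the πρόγραμμα sense discussed above.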


it's worth teasing apart where words come from. the term "dynamic programming" does not make a lot of sense until you understand the historical context.


Even knowing the context, it still doesn't really add anything to my understanding. :( It's a case of a name that is intentionally not useful for describing what it names. So frustrating.


I wonder what terminology was used to talk about planning the work of human computers, for large-scale calculations that required lots of people (such as drawing up tables of logarithms etc.): was that a program too?


At least they acknowledge that the British spelling for a computer program is "program". A decade or so ago I had a boss who insisted on "programme"...


Someone non-technical once told me they "programmed as well" and proceeded to talk about planning and scheduling things.


Isn't this the connection to linear programming?


Jacquard was the first to build a stored program machine. It didn't compute, but it did produce valuable programmed output. Thus the people who punched the cards for looms were programmers.

When Henry Maudslay designed, built and set up his 16 machines to make pulleys for the Royal Navy, he was also doing programming.

I think the main point of the article is that we need to greatly broaden the idea of what programming means. It doesn't have to involve a program counter iterating through a list of instructions.

Spreadsheets don't have a program counter (though of course program counters are used under the hood to make them work). Field Programmable Gate Arrays also don't have a program counter, and they do everything in parallel, yet they are programmed.
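The spreadsheet point can be sketched in a few lines (the cell names and formulas here are invented): the "program" is a dependency graph of formulas, and evaluation recurses through dependencies rather than stepping a counter through an instruction list.

```python
# Cells defined by formulas over other cells; `get` resolves a reference.
cells = {
    "A1": lambda get: 2,
    "A2": lambda get: 3,
    "B1": lambda get: get("A1") + get("A2"),  # =A1+A2
    "C1": lambda get: get("B1") * 10,         # =B1*10
}

def evaluate(cells):
    """Evaluate every cell, recursing through dependencies and caching."""
    values = {}
    def get(name):
        if name not in values:
            values[name] = cells[name](get)
        return values[name]
    for name in cells:
        get(name)
    return values

print(evaluate(cells))  # {'A1': 2, 'A2': 3, 'B1': 5, 'C1': 50}
```

Nothing in the cell definitions says what order to compute things in; the order falls out of the data dependencies, much as it does on an FPGA.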

Programming is the process of breaking a problem down to a series of steps that are then mechanized.


I also thought of Jacquard as I was reading this. But that's not the point of the article at all. I'm with you 100%: it is useful and constructive to consider other programming-like disciplines and potentially to name them 'programs' retroactively. (For example, as a child first learning to code I was fascinated by my grandmother's knitting patterns; they seemed very similar to the programs I was learning to write.) But that's not what is under discussion here. The article is about actual uses of the word 'program(me)' in computing contexts and their precursors, and specifically about whether credit for inventing the term belongs to the theoreticians or the engineers, which has broader implications for how we approach the field of computing today.




