The Essence of Programming (2021) (gingerbill.org)
96 points by jwdunne on Aug 17, 2022 | 64 comments



The essence of programming is designing a way to transform data. You ask yourself, "what is the nature of the input data and how does it arrive?" and "what is the nature of the desired output data?"

When I started learning programming from books, the internet, and later formal study, this wasn't stressed nearly enough. A program transforms data. Once you keep that thought front and center, so much else starts falling into place about the various abstraction techniques used to perform that kind of transformation (procedural, OO, functional, etc.) and the technologies used to assist it (testing, debuggers, etc.). At least, it really did it for me.

Reading code is the same. Where are the places the data comes in? Where does it go out? Sometimes that is a better place to start than "main".
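To make that concrete, here is a minimal Python sketch (the names and the JSON shape are made up, purely to show where the data comes in and where it goes out):

  import json
  import sys

  def transform(records):
      # the "essence": reshape the input records into the desired output
      return [{"name": r["name"], "total": sum(r["scores"])} for r in records]

  if __name__ == "__main__":
      records = json.load(sys.stdin)              # where the data comes in
      json.dump(transform(records), sys.stdout)   # where it goes out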

I'm sure that's really obvious to many, but I'm not sure it is obvious to the author in a way that is being successfully communicated:

”programming is a tool to solve problems that you have in the domain of computers”

Taking a baseball bat to a computer can, on some level, meet the suggested definition of programming, but I don't like that it does. I also don't much care for the use of "problems" there either.

The above is especially opinionated because I'm particularly interested in whether people disagree!

edit: I'm definitely still learning programming here.


Author of the article here:

I completely agree that the essence of a "program" is transforming data. In my previous article which I linked to in the main article, I specifically state "The purpose of a program is, and ought to be, something that transforms data into other forms of data".

But the essence of "programming" is different, and I stand by my statement that "Programming is a tool to solve problems that you have in the domain of computers". This article is an extension of the previous one, so taking it out of context has caused this confusion. The term "programming" already embeds the concept of "a program": programming produces a program. Programming does not start with hitting the computer with a baseball bat.


I've argued for a long time that everything in programming comes down to:

1. Data. It can be in many different forms and structures, more or less strictly typed, and a big part of writing a program is determining what it looks like, and what it should look like.

2. Procedures to work on that data. They can transform the data, remove it, reshape it, convert it, or do any number of other operations on it, simple or complex.

That's all there is to programming.
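A minimal sketch of that split in Python (hypothetical types and numbers, just to illustrate the two halves):

  from dataclasses import dataclass

  # 1. Data: decide what it looks like.
  @dataclass
  class Order:
      item: str
      quantity: int
      unit_price: float

  # 2. Procedures: transform, reshape, or otherwise operate on that data.
  def apply_discount(order: Order, pct: float) -> Order:
      return Order(order.item, order.quantity, order.unit_price * (1 - pct))

  def total(order: Order) -> float:
      return order.quantity * order.unit_price

  print(total(apply_discount(Order("widget", 3, 10.0), 0.10)))   # 27.0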


Maybe we should differentiate the essence of programming from the essence of software development:

Software development, in my understanding, is about designing and implementing tools to solve particular (classes of) problems. Exactly what is being described in the article.

I think Programming in general is about writing instructions that (maybe not only) a computer can execute to produce (the desired) effects. These instructions may or may not involve input data and may produce side-effects along the way and may not even produce output data (although one could say that “nothing” itself is data similar to how we use 0 as a number).

I don’t think that we need a purpose for writing programs. We can program just about any nonsense and we don’t even have to ever execute it. I think “problem solving” is already too specific.

Even though it makes a lot of sense to think about programs as mathematical functions that transform data, underneath our abstractions we are building instructions for the computer to execute, and they tend to produce the desired output if we are doing it right and there are no interfering environmental circumstances.

A simple substitution of input data with output data would be the most primitive example, although it would be trivial. To produce non-trivial outputs, we need to replace the substitution with a (more complex) algorithm. Algorithms are tools that solve particular (classes of) problems. So here we are, doing software development, because we actually need a tool.


Pedantry: the only exception I could think of is a timer (delay, alarm clock), where the output is when, not what. sleep 10

Yes, information technology is for processing information.

Any transformation can be decomposed into lossless and lossy components (aka bijective and non-bijective; reversible and non-reversible). The nice thing is that the lossless component can be automatically checked for that property.
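For instance, a round-trip test is enough to check the lossless part mechanically (toy transformation in Python, purely illustrative):

  # Lossless means decode(encode(x)) == x for every input.
  def encode(xs):
      return list(reversed(xs))

  def decode(ys):
      return list(reversed(ys))

  def is_lossless(samples):
      return all(decode(encode(x)) == x for x in samples)

  assert is_lossless([[1, 2, 3], [], ["a", "b"]])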


Reminds me of this comment from antirez back in 2012: https://news.ycombinator.com/item?id=4560591

> This is one of the few programming quotes that is not just abstract crap, but one thing you can use to improve your programming skills 10x IMHO.

> Something like 10 years ago I was lucky enough that a guy told me and explained me this stuff, that I was starting to understand myself btw, and programming suddenly changed for me. If you get the data structures right, the first effect is that the code becomes much simpler to write. It's not hard that you drop 50% of the whole code needed just because you created better data structures (here better means: more suitable to describe and operate on the problem). Or something that looked super-hard to do suddenly starts to be trivial because the representation is the right one.


A true and useful way to think. It's such a matter of course to me that I would also forget to mention it.

Output can also be an action performed through an actuator (robotic arm, welder, coffee pot, ...). Could be considered as data (signal coming to the actuator) but the "problem solved" is the action itself (and usually involves a feedback loop).


> The essence of programming is designing a way to transform data

I don’t agree here - in my experience data-oriented programming is only a subset of all useful programs.

For example, a game is in my opinion not like that. Sure, very pedantically it may be considered a “mathematical function”, but then so is every human interaction, and that's not useful pedantry.


Why is a game not like that? It transforms a stream of input events into a stream of output events (video and sound hardware commands). Yes, there is lots of state, but that can also be thought of as data that is transformed, step by step.
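A minimal runnable sketch of that view in Python (everything here is a stand-in for real input handling and rendering):

  # A game as data transformation: a stream of input events in,
  # a stream of frames out, with state threaded through each step.
  def update(state, events):
      return {"ticks": state["ticks"] + 1,
              "presses": state["presses"] + len(events)}

  def render(state):
      # stand-in for video/sound commands: just a string
      return f"frame {state['ticks']}: {state['presses']} presses so far"

  state = {"ticks": 0, "presses": 0}
  for events in [["W"], [], ["W", "Space"]]:   # pretend event stream
      state = update(state, events)
      print(render(state))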


I'm confused. What do you think games do with those tens of gigabytes of data?


I would agree to some extent, but I would add to this particular sentiment that it's also about the representation of data.


Ugh. Programming is not a craft. It becomes a "craft" when journeymen who haven't yet matured into being able to understand the whole system see a holistic approach applied with success and think they've reached some sort of enlightenment.

Approaching the issue of solving problems with incomplete information, they end up with a circular conclusion, like the essence of programming being operating the computer.

At some point, some realize that programming is modeling a domain and its transformations into a format that is computable, and then engineering starts to happen.


Author of the article here:

"Programming is not a craft." It is a craft and ought to be treated as such. Not doing so leads to the mess in software we have today. Software that is no more complicated than software from 20 years ago but runs 100x-1000x slower than its older counter part. This is the direct result of not understanding what art and craft of programming is and not treating it as such.

I highly recommend reading the previous article to this one posted (Pragmatism in Programming Proverbs) to get a better understanding of what I am expressing, since "The Essence of Programming" is a sequel to that article.


I actually agree with GP.

Counterpoint: “the mess in software that we have today” (we at least agree on this) is precisely generated by the fact that people treat programming as a “craft”, where every time an (apparently) new problem arises, people feel the need to “craft” together some new framework or way of doing things, only looking at the small problem at hand and without considering the bigger picture.

In other words, people treat programming as an artisanal activity instead of what it should be: an engineering discipline and possibly still a scientific pursuit.

Let’s say this: Programming could be approached as a craft by the people who apply it to real world problems. BUT, there is a very different job, which is the one of the Computer Scientist (the name has a meaning), which should instead treat programming as a science. This is currently not happening, or not fast enough, exactly because so-called computer scientists treat programming as a craft, without truly exploring the fundamentals.

I have expressed this view several times in past comments here on HN, but we are still far from having a proper, theoretically grounded framework to do coding, and crafting more short-sighted solutions to pile up on top of each other won’t help…


I think the contention here is the meaning of "craft". I don't think we are actually disagreeing about programming itself but rather how people use that term.


Neh. Computational problems can be, and are being, solved with engineering solutions. Just because a crafty cousin can program something that almost works often enough to build a business on top of it doesn't mean that engineering programs is impossible.

"Programming is a tool to solve problems that you have in the domain of computers"

Again, this is wrong. Computers are tools with a well-defined set of operational constraints. You use such a tool to model problems _outside_ of the domain of computers, and while it's true that a programmer's primary effort is in understanding the model and writing transformations that produce useful results, that process is a craft only if the practitioner is an artisan.


> Computational problem can and are being solved with engineering solutions.

I don't even understand what you are trying to say here. What are you trying to tell me I am wrong about? Please tell me exactly what I wrote that you disagree with.

> Again, this is wrong.

Tell me something on which I can do programming (a program, not a programme) which isn't a form of computer. (Be that a digital or analogue computer, i.e. something that _computes_.)


In the approach. Programs, whether the actual instruction set, the ux, the workspace they implement, what have you, are not essential to programming.

They're the output of model building, insofar as, with a precise enough model, you get tooling to create programs for you. Hence the program can't be the fundamental building block, the essence of computing.

We are limited to expressing our ideas in the language the computer speaks when tooling doesn't help with higher abstractions, and this happens layer after layer after layer, from microcode all the way to BPEL.

But the essence of using a tool is not using the tool, it's achieving the goal the tool was built for; using the tool is, at best, incidental to the tool not being automated enough.


> In the approach. Programs, whether the actual instruction set, the ux, the workspace they implement, what have you, are not essential to programming.

And I ask again, where do I ever make any of these claims? I've specifically asked you to tell me exactly what I wrote that you disagree with, and you just reply with a strawman.

Regardless, I hope you have a lovely day and thank you for reading the article!


Again, it's the circular thinking that programming is to solve problems in the domain of computers. I even quoted you the passage and everything. I understand you getting fully defensive, but the idea is quite specific.


This seems on the right track. On the notion of ”programming is a tool to solve problems that you have in the domain of computers” though:

Writing a computer program serves two separate purposes: to communicate a set of functions to the machine, and to communicate the purpose of those functions to a human. The former is for execution and the latter is for code review and, later, debugging.

Any old clown can get a computer to do something approximating a solution to the problem at hand. If it has bugs then the author — or more likely some other poor sap down the line — will be on the hook to first understand the intent, and then make the code match the intent.

In a professional shop you will have code review where this “debugging” step happens first. Your reviewer will read your code to infer your intent and upon agreeing with that, check your style and composition for inconsistencies or lack of clarity.

To that extent, a large codebase is akin to a mathematical proof where each new module and function represents a lemma on which the final proof depends. No serious scientist would present their thesis without a progression of isolated and carefully laid out lemmas and corollaries, and no serious programmer throws up a thousand line function with 15 arguments, global state, and — if in a dynamic language like Python — inconsistent return types.

Your code is a proof to your reviewer that you are on the right track. The fact that it happens to execute without bugs and produces the correct output is far less important than the code being readable, comprehensible, and consistent as part of the larger system.


> The fact that it happens to execute without bugs and produces the correct output is far less important than the code being readable, comprehensible, and consistent as part of the larger system.

It's more important that a function looks good, than that it actually works?

I'm sorry, but that sounds stupid. A computer is a functioning machine first.

If it worked from the start, it wouldn't need to be fixed later.

You can always study someone's work to figure out how it works. There's no point studying a system that doesn't.


> If it worked from the start, it wouldn't need to be fixed later.

True. But you ignore the fact that NO SOFTWARE IS EVER DONE. Software always has bugs, and even if it didn't, it will bitrot as the business needs change.

In theory, it's better to have 100% working software. In practice, that never happens (or only happens for a few weeks at best). Eventually the software needs to be changed. In that case, software that is "written for humans" will always be easier to change than "software that used to work, but now we need to change it, but nobody understands it".


SOFTWARE CAN AND SHOULD BE DONE. Complicated and bloated systems are never done, but not every system (program/tool/etc) needs to be big, bloated, ugly and complex. It does not matter if the software is 'written for humans'; if it's too complicated, it basically won't ever be changed and/or fixed.

On that note, Zawinski's Law:

“Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.”

This is mostly true, but it does not have to be. :(


  "Software is never rewritten. Projects last longer than expected; programmers get bored or burned out; management moves on the newer challenges. The attitude of ‘good enough’ reflects reality.

  Instead of being rewritten, software has features added. And becomes more complex. So complex that no one dares change it, or improve it, for fear of unintended consequences. But adding to it seems relatively safe. We need dedicated programmers who commit their careers to single applications. Rewriting them over and over until they’re perfect. Such people will never exist. The world is too full of more interesting things to do.

  The only hope is to abandon complex software. Embrace simple. Forget backward compatibility." - Chuck Moore


> We need dedicated programmers who commit their careers to single applications. Rewriting them over and over until they’re perfect.

I love this.


Pseudocode is the easiest to change, but also literally does nothing.

Software in practice needs to run to qualify. Without that it's just a .txt file that won't compile.

Otherwise you're talking about a standard, a protocol, a document with no program. Instructions for humans on how to program computers.


> True. But you ignore the fact that NO SOFTWARE IS EVER DONE.

That is just false. I've seen plenty of one-use software, made specifically for trade shows, exhibitions, etc., which was never reused again because the entire software was the logic specific to that particular exhibition.

There are also a few thousand gigabytes of old game ROMs and abandonware on the internet which are all perfectly done software.


All things being equal in code correctness, the preference is cleaner, human-parseable code, as opposed to, for example, a while(true) loop with no visible exit condition and no obvious way to get out of said loop, where the code nevertheless exits the loop perfectly due to esoteric features that take weeks to figure out and that are brittle to system changes (no code is static).
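A miniature of that contrast in Python (toy example; the opaque version is only sketched in a comment):

  # Opaque: the loop exits only via something raised deep inside the helpers.
  #   while True:
  #       process(next_item())

  # Readable: the exit condition is right in the loop header.
  def drain(queue):
      while queue:                   # the reader can see when this stops
          item = queue.pop(0)
          print("processing", item)

  drain(["a", "b", "c"])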

So yeah: code readability becomes the deciding factor both for what to accept and (most importantly) for what will be the most efficient use of development hours when fixing…

Or a better way of putting it… We are monkeys that bribed rocks and copper to think for us with electricity: design code and systems that even a monkey could maintain.


> I'm sorry, but that sounds stupid. A computer is a functioning machine first.

It is not stupid - just counter-intuitive, which should give us pause to think. You say, "A computer is a functioning machine first" - yes - but consider why SICP says, "Programs are meant to be read by humans and only incidentally for computers to execute." (see a discussion of the quote https://news.ycombinator.com/item?id=16431701).


> Programs are meant to be read by humans and only incidentally for computers to execute.

A silly thing to say while sitting at the top of the tower of abstraction, ignoring the fact that programs that actually work provide the foundation for the entire system.

Computer programs were originally intended for humans. Entire office buildings of humans with the job title "computer" would examine paper documents and carry out the instructions therein.

Your linked quote talks about efficiency and style of code, which is a far cry from "execute without bugs and produce correct output" - an unavoidable necessity for a functioning computer.


Software spends a significant amount of its lifecycle in a state of "not done" and/or "not working". The possibility of getting software to "done" or "working" hinges critically on the ability of a human to read, understand, predict, and change that software.

Once the software is done, sure, throw away the source code entirely (if you dare).


Author of the article here:

I highly recommend reading the previous article to this one (Pragmatism in Programming Proverbs) to get a better understanding of what I am expressing, since "The Essence of Programming" is a sequel to that article. I originally wanted to write the previous article as a normal prose article on the topic of "Pragmatism in Programming"; however, I thought I'd experiment by writing in a proverbial style.

> The fact that it happens to execute without bugs and produces the correct output is far less important than the code being readable, comprehensible, and consistent as part of the larger system.

I partially disagree with this. If the code executes without bugs (very rare) and produces the correct output, then that code solves the problem. The next task is to make it readable, comprehensible, and easy to maintain whilst trying not to introduce other bugs.

I recommend reading Casey Muratori's article on "Semantic Compression" https://caseymuratori.com/blog_0015

This article is a good overview of the idea I am trying to express in this reply. It's very important to get things working BUT then actually improve it, don't just stop there!


sorry, going to have to go with a joe armstrong truism here -- make it work, then make it beautiful, then if you really must, make it fast. note the order of the first two

a large codebase is akin to a mathematical proof iff you are using something like an lcf/λc environment; the curry-howard isomorphism does not yield meaningful programmer-level insights about the average monad transformer lens abomination implementing a rest api endpoint

mathematicians do not have to worry about dependency hell in their proofs, or whether they should be importing wiles's proof of fermat's last theorem at version 3.8.21-rc2 or -rc3

if i write a window manager, its source does not primarily function as some indication that i am on the right track to competently wrangling the x11 protocol, unless i am trying to impress a headhunter from red hat. it is, in fact, primarily meant to produce a program that competently wrangles the x11 protocol. making the code clean and sensible is akin to the courtesy demonstrated by not rendering the blueprints to an air conditioner in chicken scratch with a crayon, and ensuring that part sheets are easy to follow to the assembly

framing code as in the service of the reviewer or debugger takes what is a self-evident prerequisite for getting anywhere, and promotes it to a counterproductive mind game

“oh i'll break this function out into multiple that are called exactly once, even though all that does is make the reader scroll a bunch to follow the logic, because i know jeff is gonna get this one, and jeff is a cargo cultist who thinks robert martin is the second coming of christ”

> Any old clown can get a computer to do something approximating a solution to the problem at hand.

in my experience, far more clowns are capable foremost of endless bikeshedding -- they adore giving off the impression that they are contributing something to the effort, and in the absence of insight, they substitute a performance. “hah, bet you're glad you had me to tell you to write every item in this array on its own line” (the items in question are numbers averaging two digits)

yes, code is for human beings to read, but experience has made me wary of anyone who shows up to passionately insist to that effect


I prefer Kent Beck's version:

> Make it work, Make it right, Make it fast


Very nice. For the jobbing engineer, no one should ever see your code until after step two.

Make it work, [take a break], Make it right, [push it for code review, gather feedback, make changes, push to master], Make it fast.


I disagree. A program is meant to produce useful output. Yes it should be written well (code clarity is a very strong correlator to the validity of a program), but the primary goal is that it should work.


You're channeling Dijkstra (who famously didn't even have a computer until his colleagues forced him to get a Mac so they could send him email.)


certainly an intriguing case, Dijkstra. I smell pedantry...but I could be wrong. A person so deeply involved in CS and not having a computer??? That's like being an expert cook but not wanting to taste food.


> A person so deeply involved in CS and not having a computer???

I looked for a reference just now and couldn't find one. This mentions it:

> Dijkstra was famous for his general rejection of personal computers. Instead of typing papers out using a word processor, he printed everything in longhand.

https://www.mentalfloss.com/article/49520/retrobituaries-eds...

> That's like saying being an expert cook but not wanting to taste food.

Oh ho! Don't let him hear you say that, eh? You'd get a scolding. He's the one who said, "Computer science is no more about computers than astronomy is about telescopes" and "Calling it computer science is like calling surgery knife science."

The analogy would be more like "an expert chef who refused to eat frozen dinners" maybe? :)


The difference as I see it: computer science (sorry, "informatics") is a mathematical discipline, and hence tends to concern itself with, out of a given class, the minimal (and maximal, when existent) object(s). Programming is an engineering discipline, and hence tends to concern itself with, out of a given class, the intervals within that lattice that are optimal by some suitability function.

In principle, the suitability function would be evaluated over the entire lattice; in practice, that function, whether explicitly or implicitly, includes a strong weight for "distance from existing solutions". In either case, this split in focus between the interior and the boundaries of the solution space means that programmers are often highly concerned with specific details that do not even appear (because they have been abstracted away) in the objects with which informaticians work.

As an example: theory people love to use 1-ary trees (induction steps cost nothing in proofs, but cases are expensive) and they will use 2-ary trees (sometimes even without pressure to sympathize with the machine) but systems people and programmers use k-ary trees (where, if it's been determined by measurement and not by compatibility, k depends upon "the" bandwidth-delay product between the storage hierarchy levels for which the tree is optimized ... or at least what the bandwidth-delay product had been at the time of writing).
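A toy sketch of the systems-side version in Python (K is an assumed, measured-elsewhere value, not a recommendation):

  import bisect

  K = 64  # branching factor; ideally chosen by measuring the storage hierarchy

  class Node:
      def __init__(self):
          self.keys = []        # up to K - 1 sorted keys
          self.children = []    # up to K children; empty for a leaf

  def contains(node, key):
      while True:
          i = bisect.bisect_left(node.keys, key)
          if i < len(node.keys) and node.keys[i] == key:
              return True
          if not node.children:     # reached a leaf without finding the key
              return False
          node = node.children[i]

  leaf = Node()
  leaf.keys = [3, 14, 15]
  print(contains(leaf, 14))   # True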


So well put! I bow.

- - - -

FWIW, when I was poking around last night looking for references for the "forced to get a Mac" story I found this.

Tony Hoare:

> The first time I visited Edsger in Eindhoven was in the early Seventies. My purpose was to find out more about the THE operating system, [Dijkstra, May 1968.] which Edsger had designed. In the computing center at which the system was running I asked whether there was really no possibility of deadlock. "Let's see" was the answer. They then input a program with an infinite recursion. After a while, a request appeared at the operator's console for more storage to be allocated to the program, and this was granted. At the same time they put a circular paper tape loop into one of the tape readers, and this was immediately read into buffer file by the spooling demon. After a while the reader stopped; but the operator typed a message forcing the spooler to continue reading. At the same time even more storage was allocated to the recursive program. After an interval in which the operator repeatedly forced further foolish storage allocations, the system finally ground to a complete halt, and a brief message explained that storage was exhausted and requested the operator to restart operations.

> So the answer was YES; the system did have a possibility of deadlock. But what interested me was that the restart message and the program that printed it were permanently resident in expensive core storage, so that it would be available even when the paging store and input/output utilities were inoperative. And secondly, that this was the very first time it had happened. I concluded that the THE operating system had been designed by a practical engineer of high genius. Having conducted the most fundamental and far-reaching research into deadlock and its avoidance, he nevertheless allocated scarce resources to ensure that if anything went wrong, it would be recognized and rectified. And finally, of course, nothing actually ever did go wrong, except as a demonstration to an inquisitive visitor.

https://history.computer.org/pioneers/dijkstra.html


People don't do computer science for the mere theory of it. In the end, CS should yield utility--solve problems in the real world and those problems are solved through the medium of computers.

There is a reason it is called COMPUTER science while astronomy is not "telescope science". The analogy is faulty as well:

>> "Computer science is no more about computers than astronomy is about telescopes":

You can't compare astronomy to CS--astronomy needs both computers and telescopes and spaceships--but in CS, the computer is the central figure. If not, can you think of ANY other tool that represents CS?

>> "Calling it computer science is like calling surgery knife science." is another faulty analogy. The knife does not have the same amount of critical importance both as a mean and an end in surgery, as does the computer for CS; in other words, a knife is merely a means to an end in surgery, but in CS, the computer is both the mean and the end: in the latter case, the computer is a sort of representative of all the knowledge and practice in CS, in a given time. The sophistication of the computer and what it can do, represent, to a large extent, what we have achieved in CS--but you can't say the same about a knife vs. surgery. Capisch?


First of all, you're a brave man going after Dijkstra. Even two decades in the ground he's still going to win this argument.

The domain of astronomy is the starry sky and the Universe it reveals. The domain of surgery is anatomy, physiology, metabolism. In Informatics (not everyone calls it "Computer science", eh?) the domain is formal systems.

In each case the instruments (telescope, scalpel, digital computer) are not the main focus of investigation, they are tools, not the domain of study.

> the computer is the central figure

This is precisely the misunderstanding that Dijkstra tilted against.

> can you think of ANY other tool that represents CS?

Yes. The human brain.

I'll leave you with another joke, one of my favorites, although I don't know who said it: "Computer science could be called the post-Turing decline in the study of formal systems."


Considering how it was the Greeks who prevented calculus from being discovered for 2000+ years, I would rather err on the side of Descartes and Leibniz and still ask questions like these.

I think you mistake my using the term "computer" for the machine that everybody is using nowadays--but that is just an instance of the Class of computers. The ultimate goal of formal systems is making a better Class of computers that should solve real-world problems more efficiently (any other formal-systems digression into logic and linguistics always boomerangs back to machines).

Consider how Bayesian probability was looked down upon for decades before computers became powerful enough to reveal how the academic world was wrong about dismissing it--big names from the Frequentist school, just like EWD is in CS....

Even if you still disagree--which you will--there is no denying the fact that not using technology when you ARE an expert in the said technologies is rather odd, and perhaps a bit silly. Have you seen astronomers shunning mathematics? Math is a tool that simplifies a great deal of issues not ordinarily possible with a "naked" mind. So does the computer (as an instance of the computer Class); that someone did not even want to use a typewriter let alone a computer is bewildering to me.


> People don't do computer science for the mere theory of it.

Are you sure? If I were to pick any arbitrary computer scientist (even stipulating it won't be EWD himself, this would still be "demonic choice", from your point of view), are you prepared to argue that whomever I pick does/did not do cs for the "mere" theory?

Exercise N: Was Euclid's GCD doing computer science?

Exercise S: Is watching TikTok doing computer science?

Hint: Gurfr dhrfgvbaf ner zrnag gb vyyhfgengr gur fhssvpvrapl naq/be arprffvgl bs pbzchgref gb qbvat pbzchgre fpvrapr.


Yes, very very sure, but let me clarify.

When I say "don't", I mean "should not", but if they do, hey, that is what pedantry is for, isn't it?

Exercise Q: Is doing theory for the sake of theory not ultimately about better theories that, in the end, should always yield utility in the real, applied world of computers solving hard problems?

Answer N & S: Yes, in a way, but it still, ultimately has sth to do with computers. Re TikTok I am not sure how you classify "watching". It can have something to do with CS and therefore computers in the sense that the original Tiktok source-code is written ON A COMPUTER--and when users watch Tiktok, the real data is analyzed ON A COMPUTER.


I don't think you can do astronomy without telescopes: they are the basic ingredient. :)


I don't mean to be lame, I see your smiley, but c'mon, it's like you're not even trying... People were doing astronomy before the invention of the telescope.

And historically most of what we call computer science was developed before the advent of the computer. Turing, Church, Boole, Quine, Haskell Curry, etc. Wittgenstein, Russell & Whitehead (Principia Mathematica), etc. I could list names all day, none of whom used a mechanical computer.


I don't think programmers, let alone beginners, worry that much about the best way to solve a problem. Usually they are struggling with any way to solve it at all.

You may get the impression from reading some online Q&A sites that everyone wants "the best" solution because of the way a lot of questions get asked. But in fact, phrases like "a good way" or "the best way" are more often than not purely rhetorical devices whose purpose is to appear competent.

The person is trying to ask the question in a way which suggests that he or she does know how to solve the problem in ways that are not so great, and is just looking for a better way. In fact, the reason they are asking is that they don't know even a bad way to solve the problem. They may have a partial solution, which has hit some roadblock or whatever.

People in this type of field sometimes have a hard time admitting they don't know how to do something. Particularly if they are not such beginners and have a track record of solving problems on their own, which has become part of their self-image.


I worked with this guy who repeatedly said "There's lots of ways to do it that don't work."

I try an approach, but I run into X. I try a different approach, but I can't make it work because of Y. I try a third approach, but Z. So I ask around about a good approach - one that won't run into these issues. When I do, I'm really asking for an approach that avoids X, Y, Z, W, and however many more are out there. I'm asking for something close to a best practice.

If you buy that approaches that are free from major pitfalls are rare, then I think you and I are in agreement.


Author of the article here:

All I ever get from beginners are questions about the "best tool" by which they usually mean "a set of rules and procedures so that I don't have to think about the problem".

> The person is trying to ask the question in a way which suggests that he or she does know how to solve the problem in ways that are not so great, and is just looking for a better way.

It's pretty much always the exact opposite since they rarely understand the problem that they are trying to solve to begin with. And that's the issue! They want a "tool" to just do it for them, even if that tool is not even appropriate for the task at hand. And the questions asked are usually in the form of an XY problem: "how can I use X to do Y?". You might be able to bodge the tool X until problem Y is solved but that doesn't even mean it is appropriate to begin with.

For example: I get a lot of questions at the moment about Entity Component Systems (ECS), asking what the best way to make one is (or which to use) because they want to make a game. But their games don't even require one in the first place, since they have such a low number of entities and components that an ECS would probably be detrimental to them (in both performance and productivity). To clarify, if the individual just wants to learn about the topic rather than apply it, then I usually recommend that they first learn about relational databases and how they are implemented.
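A toy illustration of why that analogy fits, in Python (entirely made-up data, just to show the shape): each component type is like a table keyed by entity id, and a system is a join over those tables.

  positions  = {1: (0.0, 0.0), 2: (5.0, 1.0)}   # "table": entity id -> Position
  velocities = {1: (1.0, 0.5)}                  # "table": entity id -> Velocity

  def movement_system(dt):
      # roughly: SELECT ... FROM positions JOIN velocities USING (entity)
      for eid in positions.keys() & velocities.keys():
          px, py = positions[eid]
          vx, vy = velocities[eid]
          positions[eid] = (px + vx * dt, py + vy * dt)

  movement_system(1.0)
  print(positions)   # entity 1 moved; entity 2 (no velocity) did not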

Many people want to use a tool because they've heard of someone else (usually a semi-famous programmer or a big company) using it, and then follow it as a "flash fad" rather than asking whether or not it is even appropriate to use. I will state that this is perfectly _rational_ to do, by the way, since you are trying to outsource wisdom to others who appear to have it. However, programming as a profession is at best 70 years old, and as a result there has not been enough time for evolutionary selection pressure to reveal that wisdom, so many people (especially beginners) are effectively walking around following whatever the latest fad is, hoping for wisdom and a set of rules to follow.

> People in this type of field sometimes have a hard time admitting they don't know how to do something. Particularly if they are not such beginners and have a track record of solving problems on their own, which has become part of their self-image.

Humility is a virtue but also a very difficult one to possess. For beginners reading: don't be afraid to make mistakes and look foolish. It is a heck of a lot better to look ignorant but honest than to look ignorant but arrogant.

I highly recommend reading the previous article to this one posted (Pragmatism in Programming Proverbs) to get a better understanding of what I am expressing, since "The Essence of Programming" is a sequel to that article.


> the questions asked are usually in the form of an XY problem

My usual approach to an XY problem is to treat it as an AZ problem, asking myself if there are any Ms for which I think I might be able to solve AM or MZ ... but it just occurred to me that this process may not be as intuitively obvious for anyone who has grown up with GPS navigation!


The author said he found that most beginners spend exorbitant amounts of time finding the best way. I don't have the same experience; it goes mostly like this: imagine wanting to play a game of chess without knowing the rules. Of course, you won't be playing much chess. Maybe you will be moving some pieces, but that would not be chess; you don't know what moves/games are possible, and it will feel daunting and impossibly difficult. Now, if you have ever watched a chess master play, it looks super easy, mostly because he has an idea/intuition about the game; he doesn't have to think about every single possible move, only the portion of them that he knows are good (through pattern matching, having played tens of thousands of games, and probably more). Now replace 'game' with programming: beginners are handicapped by the abstraction and complexity that is today's world of programming, not by 'perfectionism'. Once they understand the scope of the problem they are trying to solve, and the possible ways to go about solving it, the implementation is trivial, even for beginners. Take junior web devs, for example. How can you not expect them to waste time finding the 'right choice' (not even the best one), when there are hundreds of new web frameworks, CSS toolkits, HTML preprocessors, etc.?

Also, about the 'best way': given the 'right' set of constraints, there is a best way to do almost anything. Unfortunately, most of the time, there are almost no constraints.


Author of the article here:

To clarify what I mean (which I also wrote here: https://news.ycombinator.com/item?id=32507580)

All I ever get from beginners are questions about the "best tool" by which they usually mean "a set of rules and procedures so that I don't have to think about the problem". It's rarely actually finding the "best way" to solve the problem but the "best way" to not have to understand the problem at hand.

Beginners want the shortcuts to solving the problem whilst at the exact same time not even knowing nor understanding what their problem is in the first place.

For more clarification, I highly recommend reading the previous article to this one posted (Pragmatism in Programming Proverbs) to get a better understanding of what I am expressing, since "The Essence of Programming" is a sequel to that article.


I just got my revelation on this when I got to writing scripts to manage servers.

Since I am a developer, I was always starting from the position where I need a "reusable and flexible" script. This approach left me stranded doing stuff like updating TLS certificates by clicking through IIS settings, because I did not even want to start if I did not have the "perfect script".

At some point I went, "OK, I am just going to write a script for this single server." Well, the script is reusable because it is simple, and for another server I just change the variables inside the script. It saves me a lot of time already.

Lots of line-of-business software does not need flexibility, but developers tend to feel the need to build a framework because otherwise they somehow feel like they are not real developers.


Any claimed "essence of programming" has as much validity as an "essence of art". It is each generation's responsibility to overturn their spiritual parents' essence and invent a new one.

Of course nothing is lost, and we are all richer for the adoption of new ways of seeing and thinking.

The experience of the '90s and the then insistence on primacy of "paradigms" should stand as a red warning to us all. Programming is nothing if not a creative endeavor, with as much scope for original thought and insight as any other. If you are not learning new "essences" on a regular basis, you are wasting your life.


Anytime you are learning new X on a regular basis, X is (by definition: these have been technical terms for ~2.5 millennia already) "accident", not "essence".


"Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming." - Rob Pike, "Rules of Programming"

The essence of programming is data.


Author of the article here:

I quote Rob Pike along with many others in the previous article I wrote (which is linked in the article): "Pragmatism in Programming Proverbs" https://www.gingerbill.org/article/2020/05/31/progamming-pra...

And to be clear, I am making a distinction between the essence of "programming" and the essence of a "program". I agree with Rob Pike that (and to quote myself) "The purpose of a program is, and ought to be, something that transforms data into other forms of data".


If you've chosen the right data structures and organized things well, the interfaces will be self-evident too.


or, from ~15 years earlier, with vocabulary to match: "Show me your flowcharts, and conceal your tables, and I shall continue to be mystified; show me your tables and I won’t usually need your flowcharts: they’ll be obvious."


aka "Representation is the essence of programming" - Fred Brooks, Mythical Man Month


Unless your tables and column names are constrained to 8 characters. =/



