Software development: should we stop? Maybe we should (blog.spencermounta.in)
363 points by enz on Nov 13, 2020 | 272 comments



There are a lot of false premises and a lot of things that don't follow in this.

> Maybe the most marvellous, utopian idea for software was Unix program design, in the mid-1970s. This is the idea that programs will all talk together via unstructured streams of text.

It was a great idea in the 1970s. But we've had OLE and COM and CORBA and GNOME and KDE, and now PowerShell where you can load .NET assemblies and pass around structured objects in the same process. We've had Cocoa, which has exposed Smalltalk-style control of objects in applications since the 1980s.

This isn't a problem of technology. It's a problem of incentives. It's the same incentives that drive programs to try to look different from each other instead of fitting smoothly into a set of user interface guidelines. It requires a producer of software to find it beneficial to them to fit into the ecosystem around them.

> The promise of a computer is to reveal truth, through a seamless flow of information.

Seamless flow of information requires connecting the underlying semantic domains. Semantic domains rarely exactly match. Humans use enormous amounts of context and negotiation to construct language games in these situations.

> music was around forever but in the 16th century, if you made music, you made it for god, or the king.

This is empirically false. We have records of dance music, of broadside ballads that were sung in the streets, of the music people played at home. And there were lots of student drinking songs about being young and free.

I think the real solution to the author's angst is to go study the field more broadly and get out of whatever ghetto they are living in.


> This is the idea that programs will all talk together via unstructured streams of text.

Curious. To me this is the worst thing you could ever do. Talking via streams to say `cat file.txt | grep ERROR | wc -l` is cool. But you could do SOOO much more, if programs would actually output structured data streams. You could connect standalone applications much in the same way as Visual Scripting, where you plug inputs and outputs together and mix them with operators (think of Unreal Engine's Blueprint, just for command line tooling).

It's a true shame that Linux did not develop a well-defined CLI metaformat that specifies exactly which parameters exist, how they are documented, how they are completed, what outputs a program produces for the parameters you provide, and so on. You could do true magic with all this information. Right now you kinda still can, but it is very brittle, a lot of work, and potentially breaks with each version increment.
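As a sketch of what such a metaformat might enable (everything here is hypothetical: the tool `mytool`, its `--describe` flag, and the shape of the JSON it emits):

    # a self-describing tool emits a machine-readable description of its interface
    mytool --describe | jq '.parameters[] | {name, type, doc, completions}'

A shell, or a visual node editor, could then generate completion, validation, and "plug this output into that input" wiring from that description alone, instead of scraping --help text.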

I think it stems from the design failure to build your app around a CLI. Instead, you should build your app around an API and generate the CLI for that API. Then all properties of structured data streams and auto-explore CLI shells come for free.


> Curious. To me this is the worst thing you could ever do. Talking via streams to say `cat file.txt | grep ERROR | wc -l` is cool. But you could do SOOO much more, if programs would actually output structured data streams.

A lot of people have had this thought over the decades, but it hasn't really happened -- powershell exists for linux, but who's using it? The genius of the primitive representation (stringly typed tables) is that it has just enough structure to do interesting processing but not enough to cause significant mental overhead in trying to understand, memorize and reference the structure.
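For a flavour of how far that minimal structure already goes, a routine example -- whitespace-separated columns are enough to count processes per user:

    # skip the header row, take column 1 (user), count occurrences
    ps aux | awk 'NR>1 {print $1}' | sort | uniq -c | sort -rn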

A case in point of the difficulty of adding more structure without wrecking the immediacy of manipulation is JSON.

For anything with more than 1 level of nesting, I do stuff like

    blah | jq . | grep -C3 ERROR
probably a lot more than I do

    blah | jq $SOME_EXPRESSION
because it's just so much less mental overhead -- I don't have to think about indexing into some complex hierarchy and pulling out parts of it.
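To make that concrete with a hypothetical shape -- say the output looks like {"items": [{"level": ..., "msg": ...}, ...]} -- the 'precise' version would be something like

    # requires remembering the document's shape plus jq's select/filter syntax
    blah | jq '.items[] | select(.level == "ERROR") | .msg'

whereas the pretty-print-and-grep version needs none of that knowledge up front.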

I'm not saying it's not possible to get out of this local optimum, but it appears to be a lot more subtle than many people seem to think. There may be a simple and elegant solution, but it seems it has so far escaped discovery. Almost five decades later, composing pipelines of weakly structured and typed bytes (which, by convention, are often line-separated tables, possibly with tab- or space-separated columns) is still the only high-level software-reuse-via-composition success story of the whole computing field.


Very few use PowerShell for Linux because it doesn't come pre-installed on a Linux box. Otherwise you can bet that people would be using it in large numbers. And yes I would prefer your second "mental overhead" way as it involves less typing. Unfortunately PowerShell is more verbose than bash, not less.

Powershell is unfortunately not the shining example of a shell that best leverages structured/typed input/output succinctly.

But on Windows, sysadmins use powershell heavily. Nearly every IT department that manages windows machines uses Powershell.


> Very few use PowerShell for Linux because it doesn't come pre-installed on a Linux box

I don't buy that. On a GNU/Linux box, there are few things that are easier than installing a new shell; if you prefer a different shell than bash, it's two commands away. Bash does the job people expect it to do, and they would probably be _very_ alienated if they had to start messing around with .NET gubbins.

>And yes I would prefer your second "mental overhead" way as it involves less typing

Maybe for the first time you would. Maybe if you were to accomplish this specific thing. Anything else? Have fun diving into the manpage of your shell _and_ the programs you want to use, and you'd better hope they share a somewhat common approach to the implemented (object) datatype, or, well, good luck trying to get them to talk to each other.

>Powershell is unfortunately not the shining example of a shell that best leverages structured/typed input/output succinctly

I would just remove the last part, then agree with you: ">Powershell is unfortunately not the shining example of a shell"

> Nearly every IT department that manages windows machines uses Powershell

I mean, what other choice do they have there? cmd? Yeah right, if you want to lose your will to live, go for it.


>I don't buy that. On a GNU/Linux box, there are few things that are easier than installing a new shell; if you prefer a different shell than bash, it's two commands away.

When you are SSH'ing into one of 10k containers for a few commands, you will only use what is already there. Bash is there and works and that is what one will use 100% of the time. No one is going to permit Powershell to be bundled to satisfy personal preferences.


You're both moving the goal posts (if powershell were superior I and countless other people would absolutely chsh it for our accounts, since we're already not using bash anyway) and not making much sense. Many sysadmins tend to spend a fair amount of time doing command line stuff and/or writing shell scripts. If powershell offered significant enough benefits for either, of course at least some companies would standardize on it, just like your hypothetical company presumably standardized on using containers to run their 10k services rather than installing some custom k8s cluster to satisfy the whims of one individual infra guy.


When one doesn't have control over the repositories used on build and service machines (they are locked down), nor over what goes into Docker images (only secured images are allowed, and good luck trying to get your custom tools in), one will use the stuff that is already present.

This is far more common than you think in enterprise corporations. I work at the hypothetical one, which doesn't use k8s (we have yet to upgrade from our native data center to cloud infrastructure).

If PowerShell were bundled by default in Linux distro LTS releases, a lot of sysadmins I know would start using it, since they are already familiar with it from Windows and already write all their scripts in it.


Meh argument. It's not like 100% of people are "SSHing into 10k containers to run a few commands". It's probably less than 0.00001%.


VBScript is still used by Windows admins


> And yes I would prefer your second "mental overhead" way as it involves less typing.

1. It doesn't, just use zsh and piping into grep becomes a single character, like so:

    alias -g G='|grep -P'
2. Even apart from that I'm a bit sceptical that you can conjure up and type the necessary jq invocation in less time than you can type the fully spelled out grep line.
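For illustration, the global alias from point 1 expands anywhere on the line, so

    blah | jq . G -C3 ERROR

is expanded by zsh to

    blah | jq . |grep -P -C3 ERROR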


Yes, jq is probably the wrong example to use here. I've never seen another tool with such un-rememberable CLI syntax.


Not only that but a fixed object format also "forces" me to parse the data in a particular way. Think of representing a table in JSON. The developer of the producer will have to pick either row-major or column-major representation and then that is how all consumers will see it. If that's the wrong representation for my task I will need to do gymnastics to fix that. (Or there needs to be a transposition utility command.)
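For instance, with some made-up data (jq's `transpose` being one such transposition utility):

    # row-major: an array of records; pulling a column is easy
    echo '[{"name":"alice","age":30},{"name":"bob","age":25}]' | jq 'map(.age)'

    # column-major: parallel arrays per field; rebuilding rows takes a transpose
    echo '{"name":["alice","bob"],"age":[30,25]}' | jq '[.name, .age] | transpose'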

Text gets around that problem, in a sense.


Obviously JSON is not suited for tabular data, but perhaps another format could be used. Ultimately, the user shouldn't care about JSONs or tabular objects.


Text just forces you into row-major, I don't see how that's getting around the problem.


IMHO, I need both text streams and metaformat. YMMV.

Just like GUIs should be, but usually are not, gracefully and responsively scaled to user expertise, the developer experience should be, but usually is not, gracefully and responsively scaled to the appropriate level of scaffolding to be fit for purpose for the requirements defining the problem space at hand. I need more representations and abstractions, not fewer.

Metaformats drag in their own logistical long tail that in many use cases is wildly heavyweight for small problems. Demanding metaformats or APIs everywhere as The Only Option trades off against the REPL-like accessibility of lesser scaffolding. API-first comes with its own non-trivial balls of string: versioning between caller and callee, argument parsing between versions, impedance mismatch with the kind of generative CLIs you envision and with other API interfaces, etc.

The current unstructured primitives on the CLI, composable into structured primitives presenting as microservices or similar functions landing in a more DevOps-style landscape, etc., represent a pretty flexible toolbox that, in my experience, helps mitigate some of the risks of the Big Design Up Front efforts that structure tends to require. I think of it as REPL-in-the-large.

As I've gained experience I've come to appreciate and tolerate the ragged-edge uncouthness of real-world solutions, and to lose a lot of my fanatical puritanism that veered into astronaut architecture.


> But you could do SOOO much more, if programs would actually output structured data streams.

Like even treating code and data the same, and minimizing the syntax required so you're left with clean, parseable code and data. Maybe in some sort of tree, that is abstract. Where have I heard this idea before . . .

> I think it stems from the design failure to build your app around a CLI. Instead, you should build your app around an API and generate the CLI for that API.

Now this I am fully in favor of, and IMHO, it leads to much better code all around: you can then test via the API, build a GUI via the API, etc, etc, etc.


As my sibling dead comment says, you should try PowerShell.


Even the original post said so in the third paragraph:

> and now PowerShell where you can load .NET assemblies and pass around structured objects


> I think the real solution to the author's angst is to go study the field more broadly and get out of whatever ghetto they are living in.

He has another post about VS Code where he expresses amazement about its basic refactoring ability. So yeah, fully agree.


That post is about a lot more than just "basic refactoring ability". Why are you trying to discredit the author?


renames a file

> yeah, it renames the require statements, when you rename a file.

??? IntelliJ has probably been doing this since day 1 (in 2001); others have probably done so for far longer. Granted, this applies to Java, not JS, but still. Speaking of IntelliJ, it is completely absent from the author's pathetic history of IDEs, adding to the fact that he ought to get out of the ghetto he's living in.


That post literally expresses amazement about "basic refactoring ability". It would blow the author's mind to see what a proper IDE can do, if they got a chance to use one.


You'd be surprised how many professional developers don't have experience using a proper IDE.

One of the biggest unsolved mysteries in our field.


Could you please share some examples of proper IDEs and what mind-blowing things they can do?


Maybe not mind blowing, but just a few examples from my (very normal) day:

I wrote a statement in Rider that used a class that didn't exist. So I hit Alt+Enter with my cursor there, and had it create me a class. Then I hit Alt+Enter on that class and had it move it to a separate file. Then I added the base class it should inherit from, hit Alt+Enter, and had it scaffold out all the methods I need to override. About fifteen or twenty seconds with a modern IDE and didn't require any of my intellectual capacity to actually execute.

I realized that another class in this multi-GB codebase had a typo in its name, and hit Shift+F6 to rename it. Typed in the correct name, and twiddled my thumb for two or three minutes while it renamed every instance in the codebase.

Found a file that used a declaration style that's against our coding style. Hit Alt+Enter on one example, told Rider to configure the project to always prefer the other style and replace all examples in the file.

None of those are particularly magic, but having so many of them that are completely reliable a context menu away makes an enormous difference. Also with a recent file list popup and really excellent code navigation, I find that I don't keep a file list or tabs open at all. I just jump to symbols and toggle back and forth between the last couple of files.


Do you know how many times each line of code is read relative to how many times it is written over the lifetime of a product?

So much auto-generated boiler-plate code reminds me of nightmarish Java codebases


Quick, someone raise his blub level by showing him Lisp Machines and Genera in their heyday.


Text streams are a wonderful idea in 2020 too. Exchanging opaque blobs (objects) requires too fine a fit. It is like a Kalashnikov (not very precise, but works everywhere) vs. some finicky contraption that is more efficient by some metric but can only be used in a sterile environment without frustration (e.g. all software versions must be just so for it to work satisfactorily).


Or using an electronics analogy, text is wire and objects are connectors.

Sometimes all you need is to temporarily connect things together, the environment is benign, and your requirements are not demanding.

But Mouser.com shows over 2 000 000 entries (over 390 000 datasheets) in "Connectors". High current, high voltage, high frequency, environmental extremes, safety requirements, ...

There are all sorts of situations where crimping wires together won't do the job. Same with text. By the same token, there are all sorts of situations where text will do the job, but people are tempted to over-engineer things.


It is high cohesion vs. loose coupling (old concepts that are as useful in 2020 as ever). If something is too entangled, it probably should be inside the same program/system, where you can guarantee a tight fit between components.


I have to say, having started using PowerShell recently, it's better. It's frustrating because I have decades of muscle memory in ksh and bash, but that's not enough to prevent me from recognizing that the CLR for loading components into the same process space and being able to work with and pass objects in that process space in your shell is clearly the right way forward.


It looks like you just need a proper general-purpose programming language such as Python, unless your environment prescribes a different language.


Unix uses byte streams, not text streams for stdout/stdin. It can contain embedded zero bytes, so it's not even like a C string.
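Easy to demonstrate -- the NUL byte passes through the pipe untouched:

    # od -c shows the embedded zero byte survives the pipeline
    printf 'foo\0bar\n' | od -c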


Your comment was a very interesting read up until that last paragraph. Seems a bit harsh despite your many valid* points.

Edit: Made typo on mobile.


Perhaps it was too harsh. I could have expressed more kindly that they are projecting the problems of their local milieu too broadly.


He's not wrong. Look further down thread, you can find people pointing out examples of domains where there is a dearth of software, and a definite need to keep writing it. All due respect to the expertise shown in articles posted on HN (it's part of the reason I keep coming back), but the Dunning-Kruger effect hits hard. As Einstein put it, "the horizon of many people is a circle with a radius of zero. They call this their point of view."


This isn't about whether @madhadron's point in that paragraph was wrong, it's about civility. The usage of "ghetto" was gratuitous.


I'm really sorry, but have you looked at all the pictures in the author's story? I came up with the same association and took madhadron's remark to be about those pictures, not the author.

If your first reflex is to be 2nd hand offended, maybe you should relax a little and try to not to see hostility everywhere ¯\_(ツ)_/¯


Yes, I've looked at all the pictures in the author's story. And it never occurred to me to assume those are pictures of a ghetto the author lives in, as implied by the words "the real solution to the author's angst is to go study the field more broadly and get out of whatever ghetto they are living in" (emphasis mine).

But after I read your comment, I went back to the story and did a Google image search for a few (but not all) of those pictures. And guess what? They all come from other sources.

¯\_(ツ)_/¯

It's not like I was looking for hostility. I was reading the comment and nodding along and then the hostility came out of nowhere and smacked me in the face.


Obviously OP was speaking of a figurative coding ghetto, not their literal physical surroundings.


"Code ghetto" is actually a term I've heard bandied about before, usually used to denigrate an ecosystem someone doesn't like, but in this case I think it's wholly appropriate terminology[0], and I don't think the offense taken at it is justified.

[0] This comes up top of Google for me: https://www.laurencegellert.com/2012/12/software-ghettos-a-f...

I feel that "code ghetto" conveys the myopic POV of the article perfectly.


God stop it with the Dunning–Kruger bollocks already. In my experience, people who refer to it are themselves rather shallow in their analysis of _any_ phenomenon at hand. It seems the thought form here is that anything even remotely falling out of your (very broad, apparently, huh) horizon is bound to be ridiculed and discarded by you as unintelligent nonsense. You're reading a poem, man. It's very clear from the reading that the author does not imply we have to stop writing _all_ software completely, might as well go back to the woods, etc. The poem deals with a very particular brand of software, or rather, software development in its own right; and it dissects the subject beautifully. Don't ever attribute anything you simply don't understand to stupidity. Now having read your comment I must say it's you who comes off as obnoxious, especially with them Kruger and Einstein references.

Watch it.


I'll stop using Dunning-Kruger when it becomes irrelevant. And no, I don't consider myself any less prone to it than other people, that's when I've found it most useful. And Dunning-Kruger is not "attributing stupidity" (talk about a shallow take), it's actually most insidious among those who are experts in one field, hence they've spent so much time there they have zero experience in other specialized domains (again, something I've recognized as one of my biggest weaknesses; thanks Dunning-Kruger!)

In my experience, people who get offended by Dunning-Kruger don't even realize they are prime examples of it.


I assumed it was just a tongue-in-cheek joke because of all the pictures with trash in them.


"Code ghetto" is actually a term I've heard bandied about before, usually used to denigrate an ecosystem someone doesn't like, but in this case I think it's wholly appropriate terminology[0], and I don't think the offense taken at it is justified.

[0] This comes up top of Google for me: https://www.laurencegellert.com/2012/12/software-ghettos-a-f... and I feel that it conveys the meaning, even if you disagree with the particular ecosystems being denigrated.


>> This is empirically false. We have records of dance music, of broadside ballads that were sung in the streets, of the music people played at home. And there were lots of student drinking songs about being young and free.

We have recordings from the 16th century? Do you mean "recordings" as in sheet music for automatic pianos or organs etc?


They said "records", not "recordings". I took this to mean references in historical records.


Ah, my bad. I misread the comment above. Thanks for the correction.


I enjoy writing software, but I approach it more as an art form, or a craft, than a trade.

I love that there are so few limits in "software world." No pesky laws of physics. No enforced values by one or another interest group.

Like any craft, it takes years to master, and efficiency comes from doing it so often, that it becomes inculcated. Nowadays, I make massive architectural decisions every day, without writing down a thing, and, here's the "hit": these decisions turn out to be correct, even though they often mean a lot of work (for example, this entire week has been spent re-testing -and fixing- an app I'm working on, in response to an internal refactoring job). It's like a wood carver, encountering an unexpected knot in their material, and having to work around it, or maybe even incorporate it into the design; treating it as a nice surprise and advantage.

I enjoy writing software. I think I'm fairly good at it. I'd not want to stop.

But the rant reminds me of this old gem: https://www.internetisshit.org


> No enforced values by one or another interest group.

I wish. Try working at almost any company, and you'll have to betray your users on an unrealistic arbitrary deadline.

The industry doesn't appreciate people who think, or talented people, or good engineers. It appreciates those who can haphazardly slap together a subpar something out of ready-made libraries to meet "business goals". And this is depressing af.


I've been using the phrase "Dung-beetle programming" for that dominant mode for some time.


I love that!

I am a "dependency skeptic." A lot of folks think I'm against them, but nothing could be further from the truth. Most of my software is composed of dependent modules.

Mostly written by me.


I don't oppose the whole concept of using libraries. I just look closely at each library I'm about to include in my project to understand what exactly it's doing, what it's abstracting away, what are the tradeoffs I'm making, and whether its scope is too broad for whatever I'm trying to achieve. It's sincerely perplexing to me that not everyone does this.


The other biggie is legal and fiduciary exposure. That's usually neglected by techies.

You can't ask a lawyer, because they will always say "no," but I have seen some really bad things happen, because people didn't take this stuff seriously.


I never take this stuff seriously, but then I never really worked, or ever had the desire to work, on anything this proprietary, or in a company this bureaucratic. When my first employer was acquired by a giant corporation and I started seeing it transform into this monstrosity, I quit.


The problem can be when something untoward happens with a dependency, or a buried dependency, like a security breach, or a licensing issue.

Lawyers will generally chew up the food chain, and applications that use dependencies can get caught in the blast radius.

That can happen with big shops, or small shops.


Anyway, I've never in my life consulted a lawyer about my or someone else's code. The whole idea feels weird.


I will bet you don’t live in the US.

Around here, lawyers are a dime a dozen.


I don't. I'm Russian.


> It appreciates those who can haphazardly slap together a subpar something out of ready-made libraries to meet "business goals". And this is depressing af.

OTOH, those who can do this while still maintaining some integrity get a lot of respect, at least in communities like HN. I'm reminded of Spolsky's "smart and gets things done."

And just to push back a bit: reuse is a good thing. Indeed, by looking first to make sure you aren't re-implementing the wheel, much of the spurious software that this article bemoans can be avoided. I know it sounds like an imperfect compromise, but that's because it is. Sometimes you just need to get shit done.


Dependencies are great. It's how small teams do big stuff.

Dependencies are awful, it's where jurassic-scale disasters are born.

Whether we like it or not, dependencies are here to stay. It's like calculators are now an integral part of every student's book bag. I'm old enough to remember when you would get thrown out of school for bringing a calculator.

It's just that when we develop and publish infrastructure, it needs to be held to a much higher standard than the apps that are built on top of it, and I'm not sure we're at the point where we can implicitly trust infrastructure. Being good at vetting and choosing dependencies is still an extremely valuable skill.


It's using dependencies without understanding them which is apparently the norm these days, and this really grinds my gears.


> I love that there are so few limits in "software world." No pesky laws of physics.

Actually...

Laws of physics are very important to software engineering. Stuff like the speed of light and the ability to dissipate heat has a tremendous influence on how computation can be and is done, and on how we design software systems.

Then there are the laws of entropy and complexity, mathematical laws & cryptography, social laws (how people cooperate), game theory, economics and the legal system. And a bunch of other stuff that I forgot about just now. All greatly limiting / shaping the way we can create software at scale.

Software engineering is one of the most (if not the most) complex forms of engineering out there. A lot of people don't realize it, since they routinely work with tiny parts of the whole picture. Writing a typical REST API backend, or making an RPi blink an LED, is like a civil engineer building a popsicle stick bridge. Sure it's fun, but that's not "the whole field".


> Actually...

> Laws of physics are very important to software engineering.

practical concerns abound, especially if you're focused, as you are, on industrialized software production.

but to me the defining nature of computing & the digital is that it is very near to unlimited, that it is about how we think, what abstractions we create, out of vast vast vast potential & endless capabilities.

yes there are some limits. but growing up, I wanted to build boats & houses & submarines & planes. much of the attraction of digital technology was a near-complete freedom from physical constraint. the 486 I had was a machine with infinite potential, something I could put myself into endlessly, ever becoming more, ever imagineering further frontiers to build & advance into, & everyone else could do the same, & we might still never cross paths. computers are an "unmaterial" wonder, something wholly unlike all else.

I implore us to think not just of the limits, laws, but of the endless spaces the mind might be free to travel & explore amid the digital.


I suspect that most engineers do not regularly come into contact with the laws of physics, or it's dressed up as "xyz is too slow".


Software development is a field with a lot of breadth and depth, and a lot of engineers do not regularly come into contact with things that are "far away" from them.

The other day I saw a comment on a forum about a videogame saying something along the lines of "an experienced programmer would wonder why there are two different compilers at work here".

The industry has expanded enough that we can have people with a decent amount of time spent programming without ever encountering things that other people with the same amount of time consider normal.


Importantly, it's vital people don't look down on engineers who don't work at the very lowest levels of abstraction. There is quite a lot of value gleaned from abstracting the electrons away.


I've never worried about the state of software because I've never believed it will solve all of our problems, and I've developed hobbies and an identity outside of my career. If software causes you existential angst, it's a sign that you're putting too many eggs in one basket.

Software isn't perfect, there are perverse incentives, and everything gets messed up -- because it's a human endeavor, just like everything else.

But I enjoy it and it pays well. I try to point the ship in the right direction when I can.


What prevents one from using this argument for everything? In a sense, I don't disagree, but what sets apart active recognition and embrace of the limitations of an endeavor from passive resignation to the shortcomings of circumstance, which smells more of indifference born of a fatal desire for sleep and forgetfulness?


I believe the answer to that question is simpler than most would think. To untangle competing desires and figure out what you really want, you have to cultivate more awareness. Listen to your feelings and your soul.

You might determine that you've been working too much, fighting battles unfruitfully, while neglecting the things in life you actually care about, like family. Or you could decide that software really is your way to contribute to the world, and find a way to optimize for that.

Either answer is fine. Either way, if you figure out what you ACTUALLY want instead of what society tells you, you'll be happier and find more purpose.

There are no answers in life. Just strategies. Find out what you actually want, and find the most effective strategy.


I like that.

In my case, I love designing and writing software. It's my hobby. In particular, architecting software, in a fashion that is "organic." My architectures morph throughout a project.

If you look at my GH repo[0], you'll see that it's solid green. I don't really take weekends off. In fact, I often get more done, over the weekend, than I do during the week.

What I can't stand, is the "baggage" attached by folks that don't love developing software as much as I do. Team dynamics add overhead, but that is not necessarily a problem. Insecure managers or team leaders, on the other hand...

[0] https://github.com/ChrisMarshallNY#github-stuff


“Put all your eggs in the one basket and — WATCH THAT BASKET.”

- Mark Twain, from Pudd’nhead Wilson


That's the teaser quote I use for this story: https://littlegreenviper.com/miscellany/testing-harness-vs-u...


> there are so few limits in "software world." No pesky laws of physics.

Computability theory has a similar flavour, though. There are hard rules about certain things not being possible.


Thermodynamics and computability are pretty directly related, actually


Can you say more? Is there a thermodynamic interpretation of, say, the Halting problem? (Or P vs. NP?)


This is in relation to the ‘say more’ part, not specifically about the halting problem or P/NP.

Since computation is physical, there are theoretical bounds on the energy consumption of computation. (Landauer’s principle [0]).

Physical limits on computation are at least one way to think about the relationship between thermodynamics and software.

[0] https://en.m.wikipedia.org/wiki/Landauer%27s_principle
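For a rough sense of scale, taking room temperature (T ≈ 300 K), that bound works out to

    E_min = k_B * T * ln 2
          ≈ (1.38e-23 J/K) * (300 K) * 0.693
          ≈ 2.9e-21 J per bit erased

which is tiny compared to what today's hardware dissipates per operation, but it is not zero.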


The answer to your question is not straightforward, but it is not a definite "no". Here is an interesting (albeit amateur) discussion on the subject on Stack Exchange: https://cs.stackexchange.com/questions/26049/is-there-a-conn...


What you wrote reminded me of this ('Why is programming fun?' from The Mythical Man-Month, by Fred Brooks): http://grok2.com/progfun.html


> no pesky laws of physics

To be fair, we are still bound by mathematics - Boolean algebra, graph theory, information theory, etc. P vs. NP comes to mind.


Pretty sure at this point code also obeys the laws of thermodynamics.


That’s right, the entropy grows for sure. (Code also takes up any volume given to it.)


I used to have an employee that is a physicist. He would get bent out of shape, when I said that. :)


P vs. NP doesn't matter much in practice. Most likely O(n^100) is too large for any sizeable n anyway.


Let's not forget the speed of light (latency) either.


> No pesky laws of physics.

Unfortunately many software projects seem to be limited more by pesky psychology, and while natural laws are reasonably universal and consistent, the diversity of human insanity/stupidity is incredible.


I wish the industry were art/craft oriented. I cannot count the number of wounds I've picked up due to the business mindset.


Ever get an out of memory error? There go those “pesky laws of physics”! Otherwise great points about the craft of writing software — what we need is a better apprenticeship system.


> No pesky laws of physics.

Developing on a single host (especially in traditional imperative and object-oriented languages) papers over a lot of fundamental physics. The nonexistence of absolute simultaneity and the derivative concept of data gravity are very much a thing in even the most basic contemporary software systems... The most ubiquitous system that deals with it is, I would say, the "Virtual DOM".


What's data gravity?


"Data gravity is a metaphor introduced into the IT lexicon by a software engineer named Dave McCrory in a 2010 blog post. The idea is that data and applications are attracted to each other, similar to the attraction between objects that is explained by the Law of Gravity. In the current Enterprise Data Analytics context, as datasets grow larger and larger, they become harder and harder to move. So, the data stays put. It’s the gravity — and other things that are attracted to the data, like applications and processing power — that moves to where the data resides."

https://www.cio.com/article/3331604/data-gravity-and-what-it...


It's the concept that data is difficult to move and attracts code to it. Generally used in the large to describe things like AWS S3 attracting AWS to build services on top of it. But I think it's true in the small too, like "frameworks on top of each person's browser data".


What's a virtual dom? Some sort of socially-distanced BDSM scene? What does it have to do with imperative programming and object-oriented languages?


Probably taken from: https://reactjs.org/docs/faq-internals.html#:~:text=The%20vi....

Tl;dr It's a representation of the Document Object Model in JavaScript/application land. Changes are made to it before updating the actual DOM, because real updates will cause reflows, repaints and other expensive operations that can slow down the page.

No idea what that has to do with anything else GP said though.


Can you explain how a virtual DOM solves what you're talking about? Seems to be a non sequitur to me.


There are all sorts of separate sources of truth in that system: data that exists on the server, data that exists in your JavaScript frontend, data that exists as pixel intensities on the screen. At a high level, the virtual DOM exists to arbitrate the non-simultaneity of updates to all of these moving parts. It lets React model its system as a "single source of truth" with a functional "prop-drilling system". It gives React components gravity to a data source they can consider authoritative (separate from the actual DOM, which attracts the browser's own sets of sub-applications), which is also separate from the applications sitting on top of your data on the server, etc. etc. etc.


>I love that there are so few limits in "software world."

I'd love to visit that world you live in. Software development is nothing but a huge heap of trade-offs based on limitations of all kinds. Money, time, quality, reusability, maintainability, and many more. I'm ready to bet there is no other job out there that is as complex and bound by as many parameters as software development. Maybe hardware development, but I put them into the same category personally.

>No pesky laws of physics.

Try low level programming or high performance software. Your worldview will change very quickly. Not everything is fluffy javascript ;)

>No enforced values by one or another interest group.

I'm at a loss for words. Software is what makes the modern world move. You literally cannot avoid interest groups that want to influence you once you have popular software that is used by millions.

Sure you can write some hobby projects and avoid all of the above mentioned, but once you do something with impact that is used extensively to help people (something everyone should strive for!), all of those issues WILL come to haunt you. No exception!


Now, be nice.

I live in a very, very real world. I suspect that it may even be the same world that you live in.

It was really a rhetorical exercise. Of course there's all kinds of constraints and whatnot, but, in my case, I started as an EE, working at a company that made fairly advanced microwave equipment, and, there, the laws of physics made it quite difficult to do what we wanted.

Software was like walking out of a tunnel, into a sunlit field. I could go in any direction I wanted.

It really is quite amusing how we like to sling aspersions at each other. I'm not here to compete with anyone. In fact, I find the fact that so many people are so much better at stuff than I am, to be quite comforting.

I love learning.

> but once you do something with impact that is used extensively to help people

You have no idea...


If you had led with your background I might have been more careful ;) The way you wrote it just rubbed me the wrong way, so I wrote my response in a similar manner.

Good software is as intricate and complex as many other fields that superficially seem more complex, like Rocket Surgery. Unfortunately it has changed a lot over the last 20 years. Just learn to use these libraries, go to a bootcamp, and call yourself a fullstack developer.

Few people even know what instructions per second means. Deliver 300MB chat applications or 40GB games to your customers and call it a day.

Good software development is more than an art or craft; it is seriously hard, takes a lot of experience, and is NOT relaxing at all. It is rare, unfortunately, and it's just frustrating to see the decline over the last 20 years.


Point taken. I am used to being around folks that know my deal, and am sometimes too casual in my interactions.

I suspect that we'd probably get along, IRL. I got a gander at your site, and it seems that we have a hardware interaction background.

I'm deliberately trying to reduce my scope. I worked for many years in a fairly major-league Japanese corporation, as part of a big, distributed team. The company was a hardware company, so a lot of our software was designed to either play with, or be embedded in, hardware.

Nowadays, I like to try keeping it to apps for Apple devices. My apps tend to be a fair bit more ambitious than you'd normally see from a one-man shop, but they are still fairly humble, compared to what our teams worked on.

However, I feel that the quality of what I do is better than before; mostly because of my craftsman approach. If you knew the company I worked for, you'd find that ironic.


> I'd love to know what world you live in. Software development is nothing but a huge heap of limiting trade-offs based on limitations of all kinds. Money, time, quality, reusability, maintainability, and many more. I'm ready to bet there is no other job out there that is as complex and bound by as many parameters as software development. Maybe hardware development but I put them into the same category personally.

Theoretical physics?

Car manufacturing?

Rocket building?

Some other forms of engineering?

Just brainstorming, I have no clue. What do you think?


> once you do something with impact that is used extensively to help people (something everyone should strive for!)

Why should everyone strive for this? It seems unrealistic at best.


Thinking about the state of the software world in the last several years always makes me think of Vernor Vinge's notion of a "Mature Programming Environment" from _A Deepness in the Sky_ (1999),

"The word for all this is ‘mature programming environment.’ Basically, when hardware performance has been pushed to its final limit, and programmers have had several centuries to code, you reach a point where there is far more significant code than can be rationalized. The best you can do is understand the overall layering, and know how to search for the oddball tool that may come in handy" (Longer excerpt with an earlier bit about rewriting things always eventually just moving around the set of bugs, inconsistencies, and limitations rather than improving them here: http://akkartik.name/post/deepness )

And and Danny Hillis' idea about the Entanglement ( excerpt from a 2012 interview with SciAm) -

"But what's happened though, and I don't think most people realize this has happened yet, is that our technology has actually now gotten so complicated that we actually no longer do understand it in that same way. So the way that we understand our technology, the most complicated pieces of technology like the Internet for example, are almost like we understand nature; which is that we understand pieces of them, we understand some basic principles according to which they operate, in which they operate. We don't really understand in detail their emerging behaviors. And so it's perfectly capable for the Internet to do something that nobody in the world can figure out why it did it and how it did it; it happens all the time actually. We don't bother to and but many things we might not be able to." (more: https://www.scientificamerican.com/podcast/episode/the-comin...)


You've written almost exactly what I was going to write, but there's a few things I wanted to add / would have said.

The author seems to be treating software almost as if it were some "thing" outside of human control and development. By this, I mean he waxes philosophical about software development as though it were some divine practice handed down by the angels themselves - not what it actually is, which is a crude implementation and abstraction of the Universe's actual programming language - subatomic particles.

We have the software we have because we as humans are so limited in our thinking and scope, and because every human has slight variations on their idea of the "ideal", whatever that ideal might be.

If that was not the case, how could we have 50+ programming languages, when the very purpose of a programming language is to express your ideas into a somewhat tangible (insofar as one can claim the electronic written word to be tangible) form that can then be communicated to others.

Maybe now, I'm the one waxing philosophical, but my background is that of an evolutionary biologist; I was not formally trained in software or computer engineering, but it seems to me that the "point" of every programming language is to express ideas and the point of every piece of software is to create a tool. Humanity and our ancestors have been doing this for millions of years, so why would we be expected to stop now??


> If that was not the case, how could we have 50+ programming languages, when the very purpose of a programming language is to express your ideas into a somewhat tangible (insofar as one can claim the electronic written word to be tangible) form that can then be communicated to others.

I think part of the issue is that we don't all agree on what the point of a computer program is or how best to get to the same results for the points we do agree on.

Similarly the reason we don't understand the Internet is because the Internet is a conceptual handwave used by humans. We use it to communicate which means it's a lot of different things, very messy and a lot of it is organic.


But the universe isn't 'composed' of subatomic particles in the way that a machine is 'composed' of parts. The laws governing subatomic particles, as far as speech is capable of representing them, are probabilistic. And, as we see in practice, we have surrendered to using aspects in the sciences. When useful, we speak of light as a wave; when useful, as a particle. Really it is neither, or both. The same gap appears in the work of the software engineer who tries to depict the world in language. Any mode of speech can only depict one aspect of the truth at a time; even if a single language is capable of more than one mode, it can only ever express one aspect at once.

Your being a Neo-Darwinian and your confusion at the possibility of regression or a halt in progress are the same. Once the mollusc has "conceived" of his shell as an "adaptation" to a change in condition, he has "responded" so harshly to environmental dangers that he has closed them off almost entirely, bringing the process of speciation and adaptation to a near halt. In fact, there are countless examples of "tools" "conceived" by organisms that have been so immaculate that development has ground to a halt. Not all is progress... the world is not a machine...


> Your being a Neo-Darwinian and your confusion at the possibility of regression or a halt in progress are the same. Once the mollusc has "conceived" of his shell as an "adaptation" to a change in condition, he has "responded" so harshly to environmental dangers that he has closed them off almost entirely, bringing the process of speciation and adaptation to a near halt. In fact, there are countless examples of "tools" "conceived" by organisms that have been so immaculate that development has ground to a halt.

To expand on your point a bit, this brings us around to hill-climbing optimization and being trapped in a local maximum.

Biological evolution is marked by episodes of relatively generalist organisms spreading to new niches, speciating, and sometimes re-invading the environment they came from by outcompeting the original specialized denizens (I'm not necessarily just talking about large scale punctuated equilibrium, but smaller scale species ebb and flow).

So too with software: the cycle of specialization, optimization, ossification, and displacement by a generalist competitor from elsewhere happens over and over ("worse is better" is probably the pithiest expression of this, but "premature optimization is the root of all evil" is pretty nice too).

Evolution itself has evolved to increase generativity, in order to not only speed adaptation to change in general, but to unprecedented change (especially when the changes are themselves driven or exemplified by other organisms).

So too with software, where the change that software must adapt to is often driven by other software.

So software keeps getting invented and changed to optimize for and colonize changing environments (social, economic, hardware, network, and software envs), and languages keep getting invented to improve the processes of optimization & adaptation to change, as well as generativity, for both new and old niches. And of course, the boundary between software and programming language is just as fuzzy as similar boundaries in biology, frameworks and DSLs are just two obvious examples that straddle the division.

Not often appreciated is that all of the above applies just as much to the human social/cultural practices of developing software as it does to the tools those practices co-evolve with (eg. writing/editing, testing, change control, distribution, building, ticketing/issues, deployment, etc.). And we can flip our view around and see how parallel mechanisms have always been operating on human culture from even before we were human.


> If that was not the case, how could we have 50+ programming languages, when the very purpose of a programming language is to express your ideas into a somewhat tangible (insofar as one can claim the electronic written word to be tangible) form that can then be communicated to others.

The same reason we have different specialties in the sciences/arts/etc. Even math itself has different languages to express the same ideas. Having different computer languages allows people to express ideas (solve problems) more efficiently given the domain of the problem. Very basic example: I wouldn't use zsh to write an xmpp server implementation (but it's possible) and I wouldn't use Java to call a handful of unix commands (also possible).


I agree with this but would take it even further. I think humans are varied enough in how they approach and solve problems that some programming languages just suit some people better. There is a great conference talk (I think about OCaml) from a few years ago that talks about this idea; I'll try and dig it up.


> "you reach a point where there is far more significant code than can be rationalized. The best you can do is understand the overall layering, and know how to search for the oddball tool that may come in handy"

Honestly, as someone trying to get into developing web apps, I believe we are already at this point. Because of writings like Paul Graham's (eg, http://paulgraham.com/icad.html), I figured I'd go with Common Lisp, and as an added bonus there'd be less of a paradox of choice for libraries. Not so. Already I'm looking at a half dozen different approaches for persistent storage. I used to love Perl's concept of TIMTOWTDI, but more and more I find myself drowning in a sea of options that all seem pointless.


The problem with software is that it's a pedantic genie: it gives us exactly what we ask for. That means that every little internal or external conflict or uncertainty gets reified and becomes extremely visible.

We see this in "enterprise" software where the warts are incoherencies of process or refusal of departments to communicate.

And we see this in the "gig economy"; Coase's Theory of the firm posited that companies form because the transaction and coordination costs of doing everything as atomised workers purchasing services would be too high, but what if there was an app for that? Boom! The firm disintegrates, leaving us with the taxi company with no taxis and the hotel company with no hotels.

Software asks us a terrifying, disconcerting question: what exactly do you want?


Even if you could tell exactly what you want, there are always hardware bugs that could kill your software. Software is like the psyche of the machine, but it always depends on its physis.


The gig economy doesn't disintegrate the firm. It just consolidates lots of smaller firms into a few gigantic ones. We just call them "platforms".


I will always upvote a Coase reference.


This poem is dismissive of the progress humanity has made in computing and information technology. Tools like Pandas and Arduino and computer graphics were not mature in 1970s, and in order for data science, hobbyist electronics, or modern filmmaking to get where it is today, it was built on a massive ecosystem of CRUD and poorly designed abstractions like OOP.

But it is still an enjoyable read, and contains kernels of truth - we have abandoned the unix philosophy and created bloated, poorly engineered software. I think the comparison to music and hair cutting is very apt - the mark of a skilled engineer is knowing how to limit scope and ambition so that value can be delivered quickly and elegantly.

We all roll our eyes at the next baby sitter app, but at the end of the day we probably use some app to find a baby sitter - whether it be Google Search or Yelp. In a biological ecosystem, all things that can be, will be.


I don't think it's dismissive to be honest. In many ways, software is not so different from the rest of the world.

Industry (in the Industrial Revolution sense), underwent similar transformations (and is still going through them), from dirty/dangerous/back-breaking factories of the 1800s, to slightly-less-dirty and slightly-safer and more automated factories of the 1900s, to... whatever the 2000s will bring. I'm honestly not sure where software stands in relation; maybe software is actually closer to the 1800s-equivalent of the Industrial Revolution, than to the gleaming 2020 gigafactories of today. Maybe we are still too early.

There is probably some golden age of industrial production that industrialists look back fondly on, in the same way the author and many others idealize Unix. Likewise, complaints about rampant/mindless over-consumption, are not new for society at large, especially as we as a civilization really finally start to wrestle seriously with the question of, just how much can this planet support, and what we can take away from it or pollute it with, before the ecosystem collapses? Do we really need another factory, another plastic widget that'll never decompose, another smartphone that will end up in a landfill in a two-year upgrade cycle?

But, this progress, messy as it is, is still progress. In the aggregate, people live longer, are healthier and safer, and there are just more people, than there have ever been in the history of this planet. In part, all of these industrial processes made it possible. That doesn't mean there hasn't been waste. Now is the time to reconcile, what our waste is actually costing us, and how much waste we can actually support, as a society and a civilization. "Too much software" has potential for real harm, whether in physical pollution or in mental/emotional harm (i.e. the attention economy, the assault on user privacy, our partisan echo chambers, etc.).


The UNIX philosophy has never been followed very assiduously. The 1970s bloated crap just didn't survive until today while ls did.


Yep, I'm not sure how small single purpose utilities passing text between them are supposed to implement a GUI framework, or Kerbal Space Program.

Many of the applications we're creating nowadays are solving gnarly, poorly understood problems.


It’s just longing and nostalgia for a bygone era. Oh, how perfect software was when the sages of yore imparted them to us. But we turned away from their wisdom and left the Garden of Eden.

I think we have it pretty good. We continue to build and the stuff that isn’t good, gets rebuilt. That’s fine. What’s important is that we have the means to build good, lasting software if we choose to prioritise it. We have languages and tools that the ancients did not have.


the successor to UNIX wouldn’t have to be text. like, imagine a VR world with a lot of different tools that can operate on VR objects. Then ask yourself why desktop and phone UIs are little walled-off boxes that can’t interact with anything shared except saved files.

but well, KSP is a game, and a GUI framework wouldn’t exist in that paradigm, so I’m not sure if I know what you’re getting at.


> desktop and phone UIs are little walled-off boxes that can’t interact with anything shared except saved files

First, this is not true, as they are also able to communicate through the clipboard.

And second, the point of the file is to have a durable form for the data. A VR world is still going to use files.

EDIT: In fact, these two modalities of working with data directly stem from the hardware: "live" memory of the clipboard in the always-powered Random Access Memory versus "dead" memory of the files on the 'hard' drive, which keeps storing information when unpowered.

Maybe there are other potential modalities than these two, but I personally am unable to imagine them.

(I guess that there's an 'in-between' with logs stored as files but which have their oldest information regularly erased... which ends up happening to any information, regardless of the storage format, given enough time!)

EDIT2: And we both forgot about the concept of databases, which tend to be more complex in abstraction than filesystems, can exist either in parallel with or on top of a filesystem, and which many apps definitely use. But they obviously have to compose with the underlying bit-juggling hardware too.


> and a GUI framework wouldn’t exist in that paradigm,

There are UI frameworks in node-based environments where you connect little boxes to each other to create buttons, text, clickable areas, etc. One of my jobs has been migrating off that onto more traditional OOP-based languages, because it sucks ten thousand times less when your UI starts to grow.


Yeah, the nest-of-connected-lines metaphor doesn't seem to work unless you have a very simple model. I keep thinking there must be some other spatial metaphor that would be as powerful as code but engage more parts of the brain, but I haven't found it yet.


Sounds like PowerShell to me, each part isn't text but its own object that every other script can manipulate.
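For what it's worth, the "objects in the pipe, not text" idea can be sketched in plain Java streams too. This is only an illustration of the concept, not PowerShell itself, and the LogLine record and sample data below are made up:

  // Java 16+. A structured pipeline: each stage sees typed fields, not raw text.
  import java.util.List;

  public class StructuredPipe {
      // Hypothetical record standing in for "the object flowing through the pipe".
      record LogLine(String level, String message) {}

      public static void main(String[] args) {
          List<LogLine> lines = List.of(
              new LogLine("INFO", "service started"),
              new LogLine("ERROR", "disk full"),
              new LogLine("ERROR", "retry failed"));

          // Filter and count on a typed field instead of re-parsing text at each stage.
          long errorCount = lines.stream()
              .filter(l -> "ERROR".equals(l.level()))
              .count();

          System.out.println(errorCount + " errors");
      }
  }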


Lately I've been finding an analogy with food production quite useful. It's possible we have enough restaurant-style software and we should stop (but we never will, because restaurant-style software production rots if it stands still). However, we have almost zero home-cooked software production. Perhaps we should do more of that.

It's challenging, because we got the restaurants first, and we didn't have millennia of home-cooked tradition built up before the restaurants arrived. And we've all only trained in restaurant software, and our instincts don't always carry over to home-cooked software. Creating a culture of home-cooked software must happen in the margins, without capital, because capital by nature goes to restaurants.

It's challenging, but it seems worth trying.

(This comment has been percolating over the course of many conversations in the Future of Coding forum: https://futureofcoding.org/community)


We started down a road with more home cooking through things like Visual Basic. I remember seeing lots of useful things built by non-professionals during the era of VB 6, but it seems like nothing in the web era has filled that void.


I really wish there wasn't this big distinction between using a computer and programming a computer.


It's all fun and games until you expose an SQL interface to your mainframe database in Excel so that the guys and gals in accounting, purchasing, and marketing finally stop pestering you with change requests to their reports, just to find out that some extra-curious and clever gal in purchasing managed to bring down the entire mainframe by constructing a SQL statement with a bunch of nested outer joins...

And while the blinkenlights in the server room start to morse at you through the window in the door in a menacing pattern, the guys and gals in server ops demand answers as they point at the avalanche of warning and error messages that started lighting up the terminals like a Jedi knight fighting a swarm of mosquitoes in the swamps of Dagobah at midnight with his light sabre.

At this moment you can only begin to fathom what is happening in operations, with orders being delayed, fulfilment rates dropping by the minute, and pickers and lorry drivers starting to get pissed off at the nerd in IT who must have screwed up again, and they know where your car is parked.

I guess what I really wanted to say is: be careful what you wish for, it might bite you in the end :)


Mainframe? Where we're going, we don't need mainframes.


I've seen the exact same scenario play out with an e-commerce site using MySQL.


Yes. <3

The old hobbyist computers would put you in front of a prompt when they booted up.

I'm working on something like this, actually. Here are some 2-minute videos:

https://archive.org/details/@kartik_agaram


I have to jump in and plug Anvil (https://anvil.works) here - it's a Python web app builder very explicitly modelled on VB6 and other RAD systems.

Drag'n'drop UI, Python in the browser and the server, one-click deployment - we're aiming for exactly that niche!


Python, though predominantly by technical non-programmer professionals. And it doesn't have quite the same RAD aspect with making GUIs that VB6 offered. I saw a lot of non-technical professionals using VB6 to make quick applications to solve their problem. The code would make you cry, but it worked.

And it wasn't web-based nonsense that's hosted on someone else's server for a monthly cost. They actually owned the results. That's a big difference versus what companies would offer today.

TCL/TK was great for making quick GUIs, but I never saw non-technical people use it for that purpose. Still seemed to be non-programmer, but still technical, professionals.


Many languages give the impression of being home-cooked early on; factories always start small. But it's inevitably a mirage. If we want something that won't grow into something "industrial strength" that needs pip and virtualenv and God knows what else, we have to consciously make some scale-hostile design choices.

I've been poking at this problem: https://github.com/akkartik/mu


> Creating a culture of home-cooked software must happen in the margins, without capital, because capital by nature goes to restaurants.

I honestly think this is the root of the problem, and that there is "too much software" is merely a symptom caused by this.

As someone who didn't eat out often before quarantines, and is only a fair-to-middling cook, I can say that restaurants have crippled us in many respects. The overall cost is more than cooking at home, it's less healthy for you to eat at restaurants, fast food in particular has a bigger environmental impact, and to top it all off, these corporations are warping the political landscape with lobbying, never mind what other concerns there might be (much like the oil lobbies).

I find it funny that you claim that "we didn't have millennia of home-cooked tradition built up before the restaurants arrived" as I'm pretty sure the exact opposite is true. So it would appear that as with all victors, restaurants are re-writing history, too.

Take all of these observations and apply them to software, and it fits, perfectly. What's the commonality? It has to do with certain unchecked forms of certain economic engines . . .


> we didn't have millennia of home-cooked tradition built up before the restaurants arrived

Just to be clear, I was alluding to software here, not real food and real restaurants.


> Just to be clear, I was alluding to software here, not real food and real restaurants.

Fair. I guess I'm being dense today. It's a very good point that despite the feeling of the software industry being old, it's still younger by far than any other human endeavor (that I'm aware of). It's a field that is chaotic and still hasn't had time to settle down, really.


The lower the price, the higher the real cost in the grand scheme of things. I'd like people to understand that food is expensive and treat it as such: reduce food waste and stop taking shortcuts via long routes, a.k.a. importing non-seasonal food from the other side of the planet. I support locally grown food and local farms as much as I can!


(I find it hella ironic that Future of Coding community uses Slack. Is there another way to participate? Or read transcripts?)


Boring, boring topic :) It's been discussed to death, but there's just not an option that the bulk of the community is willing to commit to.

There's a searchable archive at http://history.futureofcoding.org. It still has issues, but it's getting better. And it's also built with the present of coding :p


> Boring, boring topic :)

Oh absolutely, I'm bored of it too. I was just struck by the irony. :)

> There's a searchable archive at http://history.futureofcoding.org.

Brilliant!

> It still has issues, but it's getting better. And it's also built with the present of coding :p

>_< blank page with JS disabled... ;)

BTW, Mu is really cool!


I just remembered our conversation from just about a year ago!

https://news.ycombinator.com/item?id=21268252#21293237


> It's challenging, because we got the restaurants first, and we didn't have millennia of home-cooked tradition built up before the restaurants arrived.

That's a bit ahistorical, so your analogy is actually better than you think.

Computers started out essentially hand built one at a time and hand programmed by changing the wiring (this is analogous to building and cooking over an open fire). Gradually (but quickly) they progressed through craft and guild stages to the point that computers were being mass produced, at which point software was being physically traded on spools of magnetic tape (analogous to cooking for your family, band, clan, tribe & trading recipes with other cooks, marketplaces for staples and specialized implements), then software & patches being posted to Usenet and emailed around (cookbooks, large-scale food production, the local inn, home kitchens), and only then did software start to become a product that people mostly bought and used rather than making yourself unless you had to.

The closest food analogy to this transition is probably baking bread, which was quite specialized for a while (even if you had a kitchen with a stove and used it, you almost certainly bought your bread, and even if you baked your own bread, you definitely bought your flour) and technology had a strong role in decentralization by making home kitchen ovens reliable and common, although any baking you did was probably cakes rather than bread. This was equivalent to the personal computer and operating systems vs. applications.

You can extend the analogy further and say that modern smartphones and app stores are like an apartment with a kitchenette that only has a microwave oven, so all your food is bought pre-packaged at the supermarket. Sometimes you go out to eat, sometimes you order pizza delivered, sometimes you grab fast food.

In all this the restaurant (an evolution of the inn) hasn't gone away, and in fact some of those restaurants scaled up to the point that the pre-packaged food you buy has restaurant branding (e.g. Marie Callender's).

No-code solutions (starting with IFTTT) are a lot like making your own sandwiches from store-bought ingredients, including pre-sliced bread.

Now, you're pointing out that the equivalent transition in computer software, from cooking all your own food to centralized and industrialized food production and preparation, only took an eyeblink in comparison. But you are wrong in stating that software production didn't go through all the same stages of evolution and just sprang fully formed into the equivalent of restaurants and microwaved meals, or that the food production and preparation value chain wasn't subject to the same kind of technology-driven swings between centralization and decentralization as software; they just don't apply to all parts of the chain at the same rate, which can lead to reversals in other parts of the chain.

For example, industrialization of appliances wasn't the only development that pushed decentralization of food preparation; canned vegetables, refrigerated transportation, gas/electric/water infra, increasing sophistication of ready-to-use ingredients, and other similar developments were necessary and responsible too.

So, basically, this sort of irregular ratchet that always moves in the direction of standardization and industrialization sometimes pushes some particular layer toward centralization, and sometimes toward decentralization, sometimes adds more layers on top, and this has been just as true for food as it is for software.


That's really interesting, thank you. Comment favorited.

https://news.ycombinator.com/favorites?id=akkartik&comments=...


You're very welcome.


>The 2nd dream for software was object orientation. It’s hard to describe how strong and confident this idea was. by using classes, codebases become collaborative and useful to one another - The perverse, dark-magic idioms of unix disappear in OOP best-practice. The decline of java may be the saddest story in computer science. It was just before my time.

Java is far from the only object-oriented language. Does the author suggest that C# and Python are of only historical significance? It's unsubstantiated to associate the decline (?) of Java with the decline (?) of object-oriented programming, and even more so to associate either of them with the implied decline (?) of composability, which can be achieved in many ways other than object orientation. For example, Go and Rust do not implement OOP in the same way as Java, but many people think their modifications improve composability, far from contributing to its decline. What we have in the quote are three dubious facts (it's not clear how much of a decline any of those have undergone), associated through nonexistent connections (they are not related).


Considering mainstream languages, I find Java unique in forcing OOP into places it doesn't really belong. You can see this in your first "hello world" application, where the entry point ends up getting wrapped in a class definition: something that makes zero sense and does little but obfuscate. I tend to like a bit of OOP here and there, but in the little Java programming I've done, it always felt like I was pounding a square peg into a round hole.

https://www.programiz.com/java-programming/hello-world

Compare that with, say, C++, where you can write entire programs that never declare a single class. It's the same thing with purely functional languages: in any significantly large program you reach the point where you're solving problems using the wrong paradigm because the language is too dogmatic.
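For anyone who hasn't written Java, this is the entirety of the ceremony being described - just the textbook hello world, nothing beyond it:

  // Even a single print statement must be wrapped in a class with a static main method.
  public class Hello {
      public static void main(String[] args) {
          System.out.println("Hello, world!");
      }
  }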


Having an entry point in a class lets you have per-class entry points, so you can choose what to run, if that appeals to you.

Java likes its objects, but in modern Java there's enough in the way of lambdas etc. that most of the weird "this must be a class" stuff is removed, or at least hidden. And the Java community is far less dogmatic about it these days.
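A rough sketch of both points (the class names and jobs here are invented for illustration): each class can carry its own main, you pick the entry point at launch time (e.g. `java ImportUsers` vs `java NightlyReport` after compiling), and a lambda stands in for what used to be an anonymous class:

  // Two entry points living side by side in one codebase.
  class ImportUsers {
      public static void main(String[] args) {
          System.out.println("running the user import");
      }
  }

  class NightlyReport {
      public static void main(String[] args) {
          // Modern Java: a lambda replaces the old anonymous Runnable boilerplate.
          Runnable job = () -> System.out.println("running the nightly report");
          job.run();
      }
  }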


However, the author is specifically longing for the old Java, and the halcyon days of trying to object orient everything.

It's not a blog piece: it's a cry for help.


There are multi-paradigm languages like Scala which address that concern, at the cost of higher complexity.


I found this post to be incredibly confusing and thought provoking at the same time. I wanted to say it doesn't belong on the front page of HN (and that it kind of feels like it was written by a person losing their mind), but the more I read the more I thought. I don't have anything to add, I'm really just curious to see comments from others.


I liked it, I found it thought provoking as well and I enjoyed the format, the “collage-y” presentation worked well with the subject matter. To me it presents as disillusionment or polemic rather than somebody losing their mind. Glad I found it on here


>To me it presents as disillusionment or polemic rather than somebody losing their mind.

You're very right, that's a better way to phrase it.

>Glad I found it on here

I completely agree. I feel like it could be a great catalyst for a lot of good conversation, which is why I was ultimately glad it was on here and that I read it, even though at first I found it strange. It made me feel a way that I can't exactly explain or put into words; and I consider that to be a good thing.


I think about this often.

It’s exceedingly rare that I’ll come across some solution to a problem that is the best of all available options. That’s generally fine in everyday life. You don’t need to be the best carpenter in the world, just the one that offers the best service at a fair price in your area.

I don’t think we’ve really absorbed the fact that with software development, average solutions to problems have no justifiable reason to exist other than market inefficiency.

A lot of ad hoc software I see that has had hundreds of thousands of pounds invested in it is ultimately a buggy, feature-depleted version of SquareSpace. And yet massive sums of money continue to be spent on un-necessary greenfield projects to reinvent one wheel or another.

The UK government recently wasted unspeakable sums of money “developing” a track-and-trace app. The quotes are because it’s not clear what needed to be developed, when the track-and-trace APIs from Apple and Google essentially boil down to slapping your local health service’s splash screen over the template and not much else.

I do sort of suspect that over time, as search has become synonymous with Google and internet video has become synonymous with YouTube, that we’ll end up with a one-solution-to-supplant-them-all for almost every common use case. We’re not there yet, but we’re not far off either I’d wager.

There are pros-and-cons to this. Microsoft had cornered the market on what people at the time perceived to be the most essential software in the 80’s and 90’s and those products are significantly less consequential today. Maybe we’ll operate in cycles of centralisation and decentralisation like we have with information networks.

Eventually we’ll all just be replaced by AI anyway, so maybe it doesn’t even matter.


Does anyone else find it incredibly ironic that this website won't load with JavaScript disabled? Perhaps some people should stop writing software, on that I fully agree . . .

In seriousness, we don't have this kind of AI yet: https://youtu.be/7Pq-S557XQU

Therefore, IMHO, we should not stop writing software on that basis alone.

I agree with many points put forth, I really do, but I don't come to the exact same conclusion. I believe we as a society shouldn't stop making software. But I believe there needs to be a better filter. Far, far, far too much shitty, pointless software is out there. The author cites Java as a pinnacle; funny, some people have said similar things about Lisp[0].

Perhaps some people should stop writing software. Just determining who is allowed to write software or what software is allowed to be written is the hard part, and is ripe for abuse. Perhaps this discussion should be centered more on the why a particular piece of software is being written. But then we get into ideology, and even ideologies that are destroying the physical planet are staunchly defended when they are obviously having measurable harm on our information systems.

[0] - Greenspun's Tenth Rule of Programming: any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp. Also a quote from many moons back on slashdot: "Lisp was a simple, elegant language that demonstrated that almost any language written after 1961 was unnecessary, except for demonstrations of concepts like Object-Oriented programming that could then be re-implemented into Lisp, and that any code written in older languages could be replaced with something better."


> Does anyone else find it incredibly ironic that this website won't load with JavaScript disabled?

It also tries to load a live reload script from localhost.

> Perhaps some people should stop writing software, on that I fully agree . . .

Or rather stop writing blogs like it's software.


The horse analogy in the video you linked doesn't consider the fact that while the population of horses is much lower today, the ones that exist today don't all live as slaves. A portion of them live in the wild, some others are pets and some others actually are for "work".

He seems to suggest that having billions of wage slaves is somehow better than having fewer happier people.


This article reminds me of what I think about houses: modern houses are so cheap, ugly, no architectural sense, poor construction, etc; those old Victorian houses are so nice, robustly constructed, beautiful, attention to ornamentation, etc. But the reality is that there were a lot of lousy Victorian-era houses. And fifty years from now people will look back on the modern houses that survive (because they are good enough quality to last) and wax nostalgic.

Also, if he thinks 1970s Unix was the pinnacle of design, he should at least read about Plan 9 before deciding that we should stick with 1970s Unix. And maybe learn Go.


> The promise of a computer is to reveal truth, through a seamless flow of information.

Funny. I thought that the promise of a computer was to do a bunch of stuff that computers are good at but that humans are not. I thought that the promise of a computer was to help make certain tasks easier and/or faster.

No idea what any of that had to do with truth.


unclear to me whether we meet even your low standard


> ...we’ve built programs that just dont talk to each other at all.

This really struck me, the idea that perhaps software mirrors the human experience. In the US, we've traded tribalism for individualism, open discussion for echo chambers, and the exchange of beliefs for the walled garden of one's personal vision.


It struck me too... but as completely wrong. A large majority of my work is in APIs, both as a consumer and a provider; between local processes, local networks, and the global internet. Every program I build talks to other programs... a lot. If there is any problem at all, it's not that programs don't talk to each other; it's that they each talk one of 1,000 different standards.


And yet, we have companies building their value based on giant, secret hoards of information hidden away in their lairs, offering mere glimpses of treasures (or horrors) within through their APIs.

Just try to imagine what would happen if the entirety of Facebook's or Google's user data were dumped somewhere for everybody to rifle through... The only saving grace here seems to be that the amounts of data are so huge that copying it all takes an unreasonable amount of resources.


This doesn't really have anything to do with Google and Facebook. Did you read the article?


To quote from the article:

> Information has not become more seamless.

My comment is a reflection on this.


Do you know what "reflection" means? You're not making any sense.


The Oxford dictionary lists multiple definitions of reflection. One is "your written or spoken thoughts about a particular subject or topic".


Huh. I've had the opposite experience of this author. In my 10 years as a software developer we've made amazing strides towards making sense of the software itself and the data flowing through my systems.

Honestly I feel like we'll have a decent base to our craft in another decade or two.

The biggest problem I do see in software is that people with no software design experience often drive the most important software design decisions. Primarily, companies' marketing departments driving their development.

It's not necessarily wrong for those types of roles to drive product design but the people in those roles need to learn a lot more about what makes a good product.


Everything is supposed to decay. But software is effectively immortal, and that's the root of this problem. We need to introduce decay into software systems, or we end up with zombies that are too useful to let die.

We should allow software to die. It's okay. Whenever a generation feels like they can't lose the artifacts of the previous generation, civilization is in decline. Which is okay too, except when technology protects those artifacts too well, which I believe usually hastens the end.


> We need to introduce decay into software systems, or we end up with zombies that are too useful to let die.

We already have this. Just today I read on another post on here how someone had to stop using their 32-bit audio plugins because their OS stopped supporting 32-bit. That software wasn't "fit enough" to survive in the current environment, so it was selected against. It died. Its entire species (its copies) died out.

The really interesting discussion, I think, is "what is software"? Where does its boundary end? The 32-bit app died because the team that built it couldn't/didn't update it to 64-bit. It didn't die because it stopped working for the environment it was built for. It died because of money and people, not bits and transistors. Why was that? What happened to the team? What's the lesson? Should the plug-in have been made in the first place? Does it matter? If the "death" is due to people and money more than bits, where's the boundary that demarcates "software"? What should we let die (copies of bits, or teams?) and how does that look?

For me, all this leads into economics, wealth distribution, resource sharing, leadership. The "real" software?


I'm not sure I understand this comment. Could you give some examples of "zombies that are too useful to let die" right now?


There are tons of programs related to moving money from one place to another, all written in COBOL or Fortran at the time when men wore hats and smoked indoors.

First they were kept alive by cobbling together new hardware to keep the old monster running. Then they were moved to virtualized environments that emulated the original hardware with all its weirdnesses and glitches.

These are the zombies we're talking about. The useful living dead. We feed them their pound of flesh every now and again in the form of coders who can still speak the ancient incantations, but otherwise we keep them in the basement toiling away.

We don't touch them because we are too afraid. No one dares to take the responsibility of rewriting all that code that still works, no matter the cost.


And what's the alternative? We rewrite the whole stack in Java/C++/C# this year, and in 20 years we rewrite the whole thing again in Rust? That really doesn't seem like an improvement to me.


Same thing happens everywhere. Fire codes, regulations, and other best practices improve, and we don't use asbestos, leaded pipes, or ungrounded electricity to build houses anymore, despite how good and convenient they seemed 30 years ago.

Now you could still live in such an old house today and be happy, but the moment you remodel and tear down that asbestos wall you will need to sanitize the whole thing of dangerous particles. Same thing with old programming languages: they don't have the same guard rails as today's, so once you start stirring up old dust there could be invisible demons awaiting, especially if the work is done by people with limited experience in such environments.


The alternative would be to keep the code fresh. If you let it stagnate, the hidden knowledge on how and - more importantly - WHY it does things gets lost.

If you write it once and fire all the programmers, every update is a huge undertaking with immense risks. But if you keep reiterating and refactoring it slowly over time, the code stays fresh and the knowledge isn't lost.

You might change the language stack for the whole product during the iterations, maybe just use a fresh stack for a part of it. Maybe keep the old tech, but make it more readable and easier to integrate with other systems.


Anything travel related as well. Somebody in the 1980s thought that being able to read your order with just your surname as a password was a good idea. Or that passwords have to be 8 characters maximum, with no profanities or common words, and they can't have Y or some other characters since you could not have dialed them on a specific model of 1980s phone. I haven't touched a GDS for the past 10 years or so, but they are out there, probably still on mainframes as well.


*The exception, of course, is Google Reader.


Oh good, your comment just made me laugh milk out my nose. Bravo sir, and thank you for that excellent gem!


> Everything is supposed to decay. But software is effectively immortal, and that's the root of this problem. We need to introduce decay into software systems, or we end up with zombies that are too useful to let die.

> We should allow software to die. It's okay. Whenever a generation feels like they can't lose the artifacts of the previous generation, civilization is in decline. Which is okay too, except when technology protects those artifacts too well, which I believe usually hastens the end.

I don't think this will be a problem, over the long term.

The average life-span of companies listed in the S&P 500 was 61 years in 1958. The current average is under 18 years.

As long as that code is helping to keep its hosts alive, it will keep being run. When the remaining hosts die, the code will die too.

60 years from now, someone will be having this discussion about having to keep a virtualized instance of AWS running, complete with all the CPU, GPU, and networking bugs, just to keep some crucial bitcoin clearinghouse that no-one understands anymore running.

Heck, maybe it will be even weirder, like, what if GMail ends up being that sort of zombie infrastructure, and it stops working if there is no more spam? We'll have to set up a dedicated AI to create and send spam just to keep the last demi-sentient instance of Gmail running without having an existential meltdown.

I'm reasonably sure that by that point, COBOL itself won't be a problem anymore, there simply won't be any companies that still depend on it left.


I liked this. It can be read as a very timely articulation of the age-old human obsession with platonic ideals. The quest for 'the seamless flow of information' is so much like the quest for true democracy, spiritual enlightenment, immortality, a unified scientific theory, artificial consciousness, transcendence before it... In the end we always re-discover that our noble ambitions don't free us from our mortal and earth-bound bodies, and that makes us feel sad. We're not gods after all, we're monkeys who managed to self-evolve the hair off of our bodies and are, for the present moment, transfixed by programmable glowing rectangles ... Importantly, this failure to be anything more is what reconnects us to ourselves. Albert Camus comes to mind:

“I leave Sisyphus at the foot of the mountain. One always finds one's burden again. But Sisyphus teaches the higher fidelity that negates the gods and raises rocks. He too concludes that all is well. This universe henceforth without a master seems to him neither sterile nor futile. Each atom of that stone, each mineral flake of that night-filled mountain, in itself, forms a world. The struggle itself toward the heights is enough to fill a man's heart. One must imagine Sisyphus happy.”


I like the recommendation at the end for Life In Code by Ellen Ullman. Definitely read it; it's very good and says a lot of things that need to be heard.


I do often feel like we’ve hit diminishing returns for our current paradigms. We employ a stunning number of people to solve minor variations on the same problems, over and over. Our applications are still mostly silos, integrations are expensive, compatibility is always incomplete and idiosyncratic. The number of lines of code grows and grows and grows, but clarity never increases


I am a starting programmer (5 years now or so), but I don't understand how you can just talk about OOP like "It’s hard to describe how wrong and confident this idea was." without any examples. I am really starting to hate this. Is this some kind of religion with a holy book where it is just written "OOP is bad"?

I'm finding it difficult to take such a piece seriously, and over time it pisses me off more and more. I use OOP every day and sure, there is such a thing as "too much OOP", I have reached that point, but there is also sometimes too little and just a mess of linked functions. I only use Python btw.

Ugh, what a horrible negative piece to read for someone who really deeply loves computers.


I don't think it can stop. I feel this is more of an "is this the end of physics/mathematics?" kind of question. Our generation's scientists have to use software to discover stuff... I mean, why stop?


It's clearly a rhetorical question, meant to stimulate our thoughts about the state of modern software.

I've often thought software has become a fetish instead of a tool.

I worked at a place where I was scolded for things like indentation formatting issues and the product we were working on made 1M a year maybe.

Then I went on to work for a place that had a steaming pile of PHP code that probably made the company 20 million dollars or more.

It was an eye-opening, philosophy-changing learning experience for me. It wasn't the software that mattered... it was the solution to the problem. The software fetishists were hung up on the pedantry of the software, while these people out here were writing anything that works, solving problems, making money, and not giving a shit whether the software looks pretty or has the latest language or framework.

Also, being in devops helps, where you blend systems with development and offload as much as possible to the SYSTEM instead of trying to emulate the system in software.

I think one of the best trends in software is its evolution as glue between already-written systems, which we're seeing in the cloud.

The writing is really beautiful. If the author is on this thread very nice work.


This is a form of poetry that I would like to see more of.


I would like to stop. And by stop I mostly mean: settle on a handful of tools and stop mindlessly hopping from language to language, framework to framework, cloud provider to cloud provider, trying to find the _perfect_ one. I think it would be better to just pick something, anything really, and become very good at it.

In the past decade I've built production apps in C#, Java, Go, Ruby, Javascript, Python, using AWS, Azure, GCP, backed by DynamoDB, Postgres, Snowflake, MySQL. I've used dotnet, Unity, Ruby on Rails, Sinatra, Spring, DropWizard, Gin, Gorilla Mux, Node, React, Express, Flask, Django. Probably more. I didn't use any of them long enough to become what I would call an "expert", but I used each of them long enough to dislike various components in favor of other pieces from other languages/frameworks. I kinda just learned that nothing's ever going to be "My" language, like I see for other people all the time. They'll always just be "meh" at best, but I should probably just pick a devil to know, rather than hoping to find magic somewhere.


I think, the main takeaway is that computing should be (still) about information. Currently, it's all about media and media use, and the major vendors of operating systems have turned into media houses and/or data transit facilities (regarding their revenue model). And it shows. (Compare Big Sur – this is certainly not an interface that's about information processing.)


I completely agree with this. However, I also feel that there is something out there that I don't even know about, but somehow I need it.


Heh, I somewhat felt this way yesterday. Two co-workers and I (all programmers) spent half an hour trying to get a video chat to work. By the end, I think we all felt a little embarrassed by our profession. (To be fair, Zoom works pretty well most days. Yesterday just had gremlins I guess.)


I decided the other day [the abstract recent past other day, not yesterday like you] that what I really want is about ten years.

What I really want is for my computing devices not to change for about ten years at a time. I don't want to have to upgrade the hardware, I want it to be repairable with replacement parts available, and I don't want to have to upgrade the OS to a new major version that moves all of my widgets around for about ten years. I don't want to use beta software that slowly mutates until it mostly works or is abruptly discontinued; I want to install a suite of finished software that continues to work the same way with only imperceptible security fixes for ten years, so that I don't constantly have to adjust my workflow as individual portions are obliterated.

And then, when that ten years is up and I need to upgrade, I want the changes to be actual upgrades stemming from fixing design mistakes and removing pain points and doing things that not only weren't possible ten years ago, but which are also desirable and not merely novel.


"Ten years" reminds me of the Decenarian maths in "Anathem." There are other little concepts in that novel that point to the value in taking one's time, such as the "chalk, ink, stone" mediums, and which ideas are worth expressing in each. I'm also reminded of the slow food movement. Perhaps we need a "slow software" movement?


If we model it after cars, then there's a large population of people like you (us). But there's also a large group who trades their car in every year or two for the 'new model'. And they revel in the widgets moving around and the color scheme changing.

So maybe we need a Studebaker for computers.


The wish behind your wish is perfect engineers - the only way to achieve what you're describing is to have engineers and designers so good that they converge on a perfect, bug-free system on the first try. What we have in real life is developers of sometimes high but always finite intelligence and sometimes high but limited foresight.


The wish isn't for perfect engineers -- the wish is for the perfect not to be the enemy of the good. I just want to be able to use a "good enough" product with the knowledge that it will stick around rather than be dropped in favor of iterating the design, maybe forwards, maybe backwards, maybe sideways, but always moving, spinning, chasing the perfect version of itself.

I want stability, not perfection.


Well, he could get his wish by buying a few copies of his computer, for spare parts, installing what he wants and then unplugging from the internet. Interestingly, this is precisely what JPL does before launching their missions: they have original, exact hardware and software on hand to trouble-shoot if they need it.



my browser wasn't loading this web page and it says:

  <script
    id="reload"
    src="http://localhost:35729/livereload.js?snipver=1"
  ></script>


Our company has committed to using Google Cloud Platform. I do not see a lot of need for writing much new infrastructure. Someone needs to create and manage Kubernetes and its ilk, but the people doing this will make up a similar percentage of our society (and tech ecosystem) as people who generate electricity and grow food. Most of what we build amounts to modern shell scripts and crontab entries. The vast majority of what we create is mortar to cement together other people's bricks. This is a great victory. I love programming but I also love getting things done quickly.


Page tries to connect to localhost

  <script
    id="reload"
    src="http://localhost:35729/livereload.js?snipver=1"
  ></script>


> At Google in 2001, frustrated with a lack of progress, Larry Page fired all the managers in the entire company. on one day. with no warning.

Did this really happen? I hadn't heard of it before.


According to this article, all of the project managers were fired: http://www.slate.com/blogs/business_insider/2014/04/25/googl...


That article (and others) say it's because "Google hired only the most talented engineers, he thought that extra layer of supervision was not just unnecessary but also an impediment. He also suspected that Google’s project managers were steering engineers away from working on projects that were personally important to him. For example, Page had outlined a plan to scan all the world’s books and make them searchable online, but somehow no one was working on it. Page blamed the project managers."


Thanks, that is a good link. Reading on it seems they weren't fired after all.


Wow, what an awful manager. That's the business equivalent of flipping over the gameboard because one is losing.


Sadly, it sounds like the exact same thing needs to happen at Google again. History is repeating itself.


I don't see anyone mentioning it but the computer vision vacancy rate project https://www.movesmartly.com/articles/condo-units-sitting-emp... is something I wanted to do for a long time. I'm not sure if the author was criticizing it or not, but at least it's bringing attention to something through one avenue of survey.


>we really don't use, share, or make decisions using information today.

Don't lose sight of the fact that you're reading this on (and the author posted this through) a computer connected to a global network. That's the only reason it's possible to read this. Disseminating this so widely would have been incredibly difficult (and expensive) not that long ago.

Is it perfect? No. But I feel like it's hard to see how far things have come when we're living with it day-to-day.


I've felt this way for a while, that we're just spinning our wheels. UIs/lipstick get prettier, but there is little out there we couldn't have had 10-20 years ago. Ergonomics of software development have made great strides in some ways, terrible regressions in others. There isn't much of substance to discuss. We probably don't enter the "new world" until quantum becomes ubiquitous or some other massive breakthrough arrives.


To me this is just utterly ridiculous.

In the field that I work in (audio software) there are dozens of things that we can do in software now that were unimagined even in 1990. Polyphonic note editing? Utterly transparent time stretching? Even just the ability to do huge amounts of serious DSP on generic CPUs has completely altered the entire landscape of music production (and the software that is used to do it).

The same is true of so many other fields.

What hasn't changed much is the sort of software that is concerned with entering, editing and generating reports on data. The business/consumer focused versions of this stuff have changed from being built with tools provided by Oracle in the 1980s to using various web stacks, but the basic model - data input, fetch some data and present in a GUI - remains unchanged. And perhaps that's because the task hasn't really changed much, and what we have is actually fairly good at doing it.

But switch over to other areas where data-driven applications are important - many scientific disciplines for example - and even there you will find huge expansions in what is possible, particularly in terms of visualization (or auralization) as a way to help humans explore data.

And FFS, Google freakin' maps! Yes, something like it existed 10 years ago, but have you actually used it while driving recently! It's not bringing about world peace or solving hunger, but good grief, that is an absolute marvel of the composition of so many different areas of CS and software engineering into one incredibly user-friendly tool that I don't even know what to say other than "use the 2nd lane from the right and then turn left at the light".


I don't know why, but your extreme excitement for Google Maps really made me smile.

It's important not to take really good pieces of software and services for granted, yet it's something we all do every day.


> It's important not to take really good pieces of software and services for granted, yet it's something we all do every day.

This is so true. In fact, the user to whom you are replying is the creator of one of those really good pieces of software: Ardour [0].

If others who read this comment are users of Ardour, please consider doing a $5 monthly donation to Paul on PayPal.

[0] https://ardour.org/


Google Maps is so much better and so much more impressive and so much more useful than Ardour that it's not even funny!

Thanks for the plug, even if it feels a bit out of place here (and most of our supporters only pay $1/month, which is fine too).


> In the field that I work in (audio software) there are dozens of things that we can do in software now that were unimagined even in 1990.

Consider that these things aren't really due to the practice of making software becoming better, but rather simply that hardware has become ludicrously powerful so as to enable this at all. It's the hardware portion of IT that deserves the gold medal here.


That's not really true of the first two things I mentioned. These required significant evolution in the DSP/math involved. In 1990, timestretching existed, but generally created artifacts. Polyphonic note editing (as implemented first by Melodyne) didn't exist and wasn't really even imagined.

It is true that they both benefit from more powerful hardware, but these examples required significant advances in software too.


And what novel possibilities have Polyphonic Note Editing and Utterly Transparent Time Stretching brought to music? All that has been accomplished is a 4% YoY reduction in the cost and time of producing Mass Media, which requires a massive volume of production work. These are conveniences which are, and always have been, utterly conceivable consequences of sufficient engineering hours, not revolutions or fundamental changes in our relationship to the world. Even if some of them are only now coming into people's minds, all those changes were already conceived of and implemented 20+ years ago.


Where did it say in the "let's create software" contract that the only acceptable goal was "revolutions or fundamental changes in our relationship to the world" ? As I said in another comment in this thread, I understood the role of computers to be helping humans with tasks they wanted to do by doing things that computers were good at. If that happens to include revolutions or fundamental changes, fine, but it definitely includes a lot of other things too.

You are right that the particular examples of audio software capabilities do not in and of themselves bring anything in particular to music.

But the timestretching stuff has totally changed how huge numbers of people now make music, because they can work with audio that is in the wrong key and/or at the wrong tempo, without really thinking about it.

Do I think that this results in an aesthetic leap forward for music itself? I do not. In fact, probably the opposite in many senses. But that is true of so many human technologies, not just software. Some people would even argue that the advent/invention of well-tempered tuning (and the concomitant move away from just intonation) hurt music in the West, and that was just as much the result of "sufficient engineering hours" as anything in the software world.

Also, just to correct you, 20 years ago, I guarantee you that nobody, absolutely nobody, believed that you could ever do polyphonic note editing. When Melodyne first demonstrated it, most people who knew anything about this just had their jaws hit the floor. It was absolutely not an "utterly conceivable consequence", even though in retrospect of course it now all seems quite obvious.


The whole premise of the comment you were replying to was that we are spinning our wheels, which is not an utter denial of progress but a characterization of progress. You responded by marveling at the things which have been achieved, which you find remarkable and unappreciated. I responded by characterizing those as having some qualities of remarkability and novelty which ultimately fail to exceed wheel spinning. You are now accusing me of having set up an unreasonable standard of progress.

There are two characterizations of the progress in this domain which I believe are likely but not certain accounts; maybe both are partially true, maybe just one, but both are contained by the wheel-spinning metaphor. One is that progress is being made, but the progress is not forward; in fact it is the digging of a deeper and deeper hole that makes actual progress more difficult. The other is that progress is being made, but that progress is like an infinite series which logarithmically ascends from 1.0 to 1.1.

There were genuine discoveries, inspiration, and novelty required on the road from note identification to chord identification and deconstruction that could be reversed and reconstructed with reasonable accuracy. That does not mean we are doing anything more than refining and improving the accuracy of processes we were already performing in rudimentary form 20+ years ago. I'm not saying there isn't progress, only that the progress is limited, and that we are now reaching towards the same limits we had begun approaching at the inception of the programmable machine, with no escape in sight.

https://www.jstor.org/stable/3679550?seq=1

"The real power of a neural net is its ability to compute solutions for distributed representations. In most cases, the solutions for these complex cases are not obvious. The pitch class representation of pitch is a local rather than a distributed one. In this case a possible solution for the chord classification problem is apparent without the use of a learning algorithm. A net containing 36 hidden units, one representing each of the possible major, minor, and diminished triads, could be constructed so as to map chords to chord types. Thus our interest in using a pitch class representation was not to find this obvious solution, but to find a solution which used a minimum number of hidden units. We hypothesized that three hidden units would be adequate and that the hidden units would form concepts of the intervals found in triads: i.e., major third, minor third, perfect fifth, and diminished fifth.

Each pitch-class net used 12 input units to represent the 12 pitches of the chromatic scale and 3 Output units to represent chord type. The number of hidden units and the values of the learning parameters are summarized in Table 1 for each of the eight pitch class nets discussed. Net 1 had an adjacent layer architecture as shown in Fig. 2 and three hidden units. It identified 25 percent of the chords after more than 11,000 learning epochs. When a fully connected architecture was used in conjunction with three hidden units in Net2, 72 percent of the chords were identified after 2,800 learning epochs. "

https://secure.aes.org/forum/pubs/conventions/?elib=11400


Almost all progress is incremental. Look at it from one angle, and it looks like "spinning our wheels". Look at it from another angle and it looks like almost all progress in almost all fields of human endeavor.

Neither of those papers cover any of the technology or ideas behind what Melodyne introduced with polyphonic note editing, which allowed the editing (in time and/or pitch space) of a single note within the audio of a polyphonic performance.

I'm entirely fine with saying "getting computers to do things humans have done for a long time isn't really progress". I'm not sure it's true.


You raise an outstanding point here, in that DAWs do tend to simply automate away the pain-points of making music as conceived of by people before the advent of personal computers. However I would caution against applying this mentality to Paul's project, which if anything is doing the most of any DAW out there to fight against those very conditions.

One of the primary problems with DAWs as conceived initially was that they were closed, proprietary systems comprised of stupid-expensive hardware to even just open the dang application. This helped to facilitate the inescapable bubble in which the Mass Media finds itself today, playing right into their competition-killing hands. So of course, the world was stagnant for 20+ years, since the only people that had access to this software were "audio professionals," who had the creativity, ingenuity, and passion of a wet noodle. And they did predictably lame things with it all.

When only Kanye West and T-Pain had access to polyphonic note editing, it was pretty lame indeed. But access is, in itself, novel. The world has since changed considerably, and we have projects like Ardour in part to thank for this.


C'mon, we live in an era where people get Grammys for albums that they could record entirely in their bedroom. Just 15 years ago you had to book professional studio time with an engineer at ~1000€/day for the good ones around where I lived.


up'ed for the implicit Jacob Collier reference, whether you knew it or not :)


Eh, had Billie Eilish in mind but that works too haha


> The same is true of so many other fields.

The Dunning-Kruger effect can be fiendishly subtle. So many people are limited by their experience that they don't stop to think (as you have) and imagine what fields outside theirs have taken advantage of in information systems.

I agree with the article that in the mainstream web/app ecosystem, there is a lot of unnecessary trash. Through my own experience, I've seen duplication of libraries and APIs that just frustrates. On those fronts, yes, it would be nice to have a little less paradox of choice.

But as you have graciously pointed out, there are domains undreamt of by the author that wouldn't give up the progress of the last 70 years for any amount of gold, and have much need of software still. Thank you for your examples.


> Polyphonic note editing? Utterly transparent time stretching? Even just the ability to do huge amounts of serious DSP on generic CPUs has completely altered the entire landscape of music production

Right, so now what is popular music today? Vocalists who can't sing in tune without help from technology, and musicians who never play more than a few bars at a time in studio because it is all stitched together digitally, and live performances that are just lip-synching to a playback. It's artificial from end to end.


why care about "popular" music at all?

Yesterday on YouTube I watched two hour+ live concerts, both made available by the lead artist (Dhafer Youssef). The first was a more jazz-inflected performance (not surprising given Mark Guiliana and Chris Jennings as the rhythm section), the second was more "world music" (also not surprising with Zakir Hussain on tabla and Husnu Selendecir on clarinet). Both featured Youssef playing oud and singing in his incredible melismatic style. One was recorded in 2014, one in 2016.

The music was utterly incredible. Virtuosic performances of the highest levels, astonishing compositional and improvisational music structures. And amazing sound quality (though sadly one was marred a little by clipping).

You don't have to like this particular music. Just stop focusing on "popular music", which has ALWAYS been dominated by dreck of one sort or another. Remember that there is (and often always has been) a huge universe of incredible music out there from cultures around the world, reflecting both tradition and experimentation, skill and inspiration, solo brilliance and musical collaboration, virtuosity and taste.

Lamenting what the kids are doing with Ableton Live or FL Studio when you can watch Dhafer Youssef play live for 2+ hours with Tigran Hamasayan, Mark Guiliana, Zakir Hussain (or whatever floats your boat) is just wasting your time and your life!


I know this is a common opinion held by many on HN, but I couldn't disagree more.

UIs have not only gotten prettier, but in general, UX has improved significantly in the past 20 years.

Take something like Google/Apple Maps and compare it to the tools available back in 2000. Sure, you could Mapquest some directions and print them out. But nowadays, you don't even have to think about planning your drive out. You hop in the car and navigate to your location. Need to stop somewhere for gas? Search along the route and find the cheapest gas with the shortest detour (not based on distance, but an accurate time estimation based on traffic/lights/etc) with your favorite restaurant nearby. Traffic up ahead? Get a suggested reroute in real-time. Need to take an exit? The app will tell you exactly which lane to be in to anticipate the next turn. Road debris/collision ahead? The app will let you know. Tioga pass closed? You don't even have to think about it since the app will route you via the correct pass from the start. Traveling via public transit in a new city, walking, biking? The app will give you directions tailored for your specific mode of travel.

All of this, happening in real-time, on a device that fits in your pocket, while driving in your car.

I could bring up the same examples for any other type of service. The fact that so many dismiss these legit, quality of life, improvements just shows how jaded many of you are. Sure, we aren't making breakthroughs at the fundamental levels of software, but that's only because we haven't even gotten close to reaching the limits of our current capabilities.


> Traveling via public transit in a new city, walking, biking? The app will give you directions tailored for your specific mode of travel.

FWIW, there are at least two layers of map that map apps have no clue about that I find essential as a cyclist: how much shade am I going to get while going down a road in a city that routinely has temperatures that are near the top end of what the human body can deal with? And how bad is the road surface in a constantly-sinking swamp city full of potholes and half-assed repairs?

There are other layers - I don’t need to worry about whether or not I want to trade a longer route for clawing my way up a steep grade now that I don’t live in Seattle, for instance.


Assuming you're talking about NOLA, you can just have the app monitor which bus routes have been diverted due to poor road quality (I know of at least 2-3 at present).


Yes. I’ve never thought of that! Feels like it might be a lot of work compared to just “letting my brain keep a map”, though.


And yet ... do you not believe that if OpenStreetMap (or others) could collect this sort of data, current map/routing software could just use it?
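To make that concrete: the missing piece is mostly data, not routing machinery. Once those layers exist as per-edge attributes, a router just folds them into its cost function. Here's a toy sketch in Python, with made-up attribute names and penalty weights (not real OSM tags or any existing router's API):

```python
# Toy sketch: fold hypothetical "shade" and "surface quality" layers into a
# cycling router's edge cost. Attribute names and weights are illustrative only.

def edge_cost(length_m, shade_fraction, surface_quality,
              heat_penalty=2.0, roughness_penalty=3.0):
    """Lower is better. shade_fraction and surface_quality are in [0, 1]."""
    heat_factor = 1.0 + heat_penalty * (1.0 - shade_fraction)           # punish sun-baked streets
    surface_factor = 1.0 + roughness_penalty * (1.0 - surface_quality)  # punish potholed pavement
    return length_m * heat_factor * surface_factor

# Any shortest-path routine over the street graph (Dijkstra, A*) can consume
# these costs; the "shady, smooth route" then falls out of the same machinery
# that already handles distance and traffic.
print(edge_cost(500, shade_fraction=0.8, surface_quality=0.9))  # pleasant block
print(edge_cost(500, shade_fraction=0.1, surface_quality=0.3))  # avoid if possible
```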


Probably, yeah! I'm doubtful anyone would make it a high priority unless they were active cyclists themselves with some time outside of the endless cycle of feature requests from management. :)


I don't know if I agree.

I think in a lot of ways, UI ebbs and flows; a lot of modern UI feels pretty terrible and non-intuitive, and re-designed for the sake of being re-designed.

And at the same time, writing software is tremendously easier than 20, or even 10 years ago. But maybe that's an ebb-and-flow issue as well. I tried adding React to an existing project last week, and was completely unable to. Everything broke. I'm debating which is the less-bad option - create a new repo, starting with React and gluing everything else around it, or moving just-the-React parts into a repo of their own. I'm tired of trying to figure out how to glue disparate, opinionated things together. I'm missing the Unix philosophy the author is talking about.


> I think in a lot of ways, UI ebbs and flows; a lot of modern UI feels pretty terrible and non-intuitive, and re-designed for the sake of being re-designed.

I think we're still figuring out UIs. As someone who sucks at them myself, I often wonder where the modern Bruce Tognazzinis are. I'm still reminded of a point on UI design he made: the corners of the desktop in a WIMP interface are special because they are infinitely large targets - you just fling the mouse pointer there and hit them (a consequence of Fitts's law; see the sketch below). This point was of particular interest to me as an early Linux desktop (FVWM) user with virtual desktops (3x3) that you could configure to flip between when the mouse got near the edge, which broke the infinite-corners convention.

I'd like to find people who are thinking about UIs as deeply as the AskTog articles used to.
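For anyone who hasn't seen the math behind the "infinitely large target" point, here's a quick numerical sketch of Fitts's law. The a and b constants are made-up illustrative values, not measured ones:

```python
import math

def fitts_time(distance_px, effective_width_px, a=0.1, b=0.15):
    """Shannon formulation of Fitts's law: MT = a + b * log2(D/W + 1).
    a and b are illustrative device/user-dependent constants (seconds)."""
    return a + b * math.log2(distance_px / effective_width_px + 1)

# A 20px-wide menu item 800px away vs. a screen corner: the corner's effective
# width along the motion axis is effectively unbounded, because the pointer is
# stopped by the screen edge, so you can fling at it without slowing down.
print(fitts_time(800, 20))       # ~0.9s: small, ordinary target
print(fitts_time(800, 100_000))  # ~0.1s: corner behaves like an infinitely large target
```

Edge-flipping virtual desktops trade exactly that advantage away: the screen edge stops acting as a backstop, so corners become ordinary small targets again.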


Currently, it's a terrible mixed bag of broken platform communities, and it somewhat feels like the Java era again, where huge ad money drives the technology and tools. Same vibe of cults: people are pushed to learn inferior frameworks and reinvent things that don't solve problems. The web industry in particular looks bad right now. Google and Facebook seem to be joining forces on _that_ JavaScript framework, and... ugghhh, I should stop here.


> UIs/lipstick gets prettier

I wouldn't even say this is true. I hate most UIs out there. There seems to be little thought given to how they appear and are interacted with on a wide variety of devices.

In general, I just don't like a lot of the æsthetics out there either.


The level of rose-tinted spectacles on this is nausea-inducing.


We don't need Facebook, Instagram, or stupid apps that give you bunny ears, but we need a hell of a lot more software that solves real problems for real people.


I often laugh at the enterprisey world and how complicated it has become.

The problem is that I only need one table to really deliver REAL value to my customer, instead of a giant monolith with complicated business logic!

Why?

I don't know. You might not need something "general" or a "framework" to deliver real value to your customer.

What I (and many others) need is a "framework of design", not code, for how to deliver value to the customer.


Beautiful, and funny, and beautiful. I wish to subscribe to your newsletter, but there is no RSS feed. Can you write some more software for that?


You can use this feed [0], which is automatically generated from the markup using rss-proxy [1]. Disclaimer: I am the developer of rss-proxy.

[0] https://rssproxy.migor.org/api/feed?url=http%3A%2F%2Fblog.sp...

[1] https://github.com/damoeb/rss-proxy/


> music was around forever but in the 16th century, if you made music, you made it for god, or the king

What about minstrel ballads or festival music?


I believe that computers today can do way more than they ever could back in the 1970s.

But yes, things still do not speak well to each other.

I think the single most important distinction between back then and right now is that back then, programs were written to serve the user.

These days people no longer use programs, they use online services, and they are designed to make money.

That’s a small detail which makes a world of difference.


https://en.wikipedia.org/wiki/Conway%27s_law

> Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure.

— Melvin E. Conway

I think what we're seeing is a kind of global instance of Conway's Law.


We should reinvent the wheel, but not reimplement the wheel, which is what most people mean when they say "reinvent the wheel".


I love the strange design of the blog. The design, with the crazy spacing and bullet points, makes it kind of fun to read.


In 1995, Niklaus Wirth came to the conclusion that software engineering was moving towards exhaustion: https://people.inf.ethz.ch/wirth/Articles/LeanSoftware.pdf

So did Dijkstra in 2000, see EWD1304.

What has changed for the better since?


Complete interoperability didn't happen, so we should stop writing software? What?

> Our computer programs are as adhoc and inscrutible as they were in the 70s, and we’re all struggling under them.

Objectively untrue.

I really don't understand the point of this page/post/thing. It's some serious r/im14andthisisdeep content.


The author uses all the latest technologies to spread the message that "software is not working". I'm reading this in my mobile browser, which can also run my game development tool inside it.

So yes, software is working, and yes, cool new software is released all the time (e.g. deep learning).


> The promise of a computer is to reveal truth, through a seamless flow of information. [...] Information has not become more seamless.

I disagree pretty strongly here. When I got my first computer as a kid, OSes outside of POSIX were silos: you literally couldn't read a Mac disk on DOS or vice versa. We've made huge strides in terms of a heterogeneous internet that supports all manner of data exchange unimaginable in decades past.

Now, is it seamless? No, but with orders of magnitude more programmers and more code, the edges of the interaction graph also have exploded; of course there will be more seams. The lesson isn't that we screwed up and somehow need to wrangle the complexity. To the contrary, successful ecosystems require diversity. We do not need an omniscient benevolent dictator to show us the way, rather we need to embrace the chaos and carve out smaller spaces for elegant systems within the jungle. It's a tricky balance because economies of scale are valuable, and capitalist incentives push businesses towards consolidation, but a single hulking Tower of Babel, no matter how high, is still a local maximum and structurally vulnerable.


Information processing and communication is evolving in exciting ways and producing incredible results. Techniques for accomplishing this rise and fall and are ultimately unimportant. Seriously, I think the author needs help for depression.


My takeaway, and the real lesson here, is: don't get too caught up in idealism.


This was marvelous. I really wish I could wield words like this person.


Poetry is great; it vividly expresses how one feels, often about nature.

This is ideology - it has a distinct stink to it, an agenda, a point of view that wants other people to be other than they are.

Most people think this way - they just happen to have a happier disposition. Most people 'feel' a certain way and get taught a certain ideology (in school, media, friends, parents), which then shapes their worldview. Instead of questioning the worldview they already have, they simply live in it and project wrongdoing and rightdoing (morality) onto others and the world.

It's like being taught to walk barefoot, pricking your foot and deciding small prickly objects need to be gotten rid of. It never occurs to them to wear shoes.


enter ... the human condition


Happened to read this while listening to "Information Travels Faster" by Death Cab for Cutie; would highly recommend. It adds another layer to this surprisingly cathartic post.


I thought, these are some interesting thoughts, maybe I will subscribe to their RSS feed.

But there doesn't seem to be one.

I looked at the source of that web page and apparently all the HTML was generated by JavaScript...

Sigh.


Maybe this dynamic feed [0] is good enough.

[0] https://rssproxy.migor.org/api/feed?url=http%3A%2F%2Fblog.sp...


That's helpful, thanks.



You're going to need software for devices and gadgets that you can't even imagine yet; it's not going anywhere anytime soon.


Can you imagine a world made better with more software? I can. We should continue.


No matter what, we're not going to stop. It's way too late for that.


We should have stopped making To-Do list apps after Wunderlist.


Nothing is ever enough. The article is bullshit.


I agree; this article puts it well.


goth music isn't dead


Great - absolutely great.


> Information has not become more seamless.
> There has never even been a reduction in the number of seams.
> It just has not ever happened.

This is ridiculous. The world of information today with the Internet is better than it was in the 1970s. It just is, that is an indisputable fact. We have more information at our fingertips than we have ever had in human history, and it's good that we have that information. Finding information has become more seamless.

I can go on Reddit right now and get a dozen people to give me detailed breakdowns of what fountain pen to purchase. Last week I got a professional chalk artist to give me recommendations on what chalk brands I should be looking at, within 30 minutes of posting the question. When my parents' computer broke a couple months ago, my Dad went on YouTube and got detailed instructions on how to take it apart and fix it from someone who didn't even speak English. It's not just that people exist who can help you out; there are enough of them online, even if they're nontechnical, that even the most niche topics can often support a community of educators and experts.

And that's not even talking about the social effects of being a marginalized individual and having access to other people who are like you, even if you live in a rural repressed area. Growing up alone without the Internet if you're in a hostile environment is awful. Nobody should wish that on anyone.

I get that it's trendy right now to hate on the Internet, but the author really needs to get some perspective on this.

----

And on the subject of software being 'done': pick any creative field, anywhere, and ask people working in it if there are any new things they want their computer to do. I can think of 5-10 features, purely off the top of my head, that I want added to Krita and that would make me a much more productive artist. Does anyone really believe that if you went to a 3D artist and had them look at the state of Blender today, they would say, "all of that is just cruft, it isn't making my life easier"?

It's absurd. People romanticize the past, but even just going back 15 years, Linux wasn't an ideal paradise. Everything was hard to do. Nobody who wasn't a computer engineer could reliably use the OS. Today, I have children who haven't even gotten out of elementary school using Linux as a daily driver, because the OS got better and because software today, for all of its real flaws, is still way better than it used to be.

If people want to point at flaws in modern software, I'm down for it; there are plenty. If people want to say that we've regressed in some areas, like the Unix philosophy, I'm down for that too. I think we have regressed in some ways. But it's just not a coherent argument to say that software has gotten less accessible or that it hasn't made any gains in the past 20 years. Try to retouch a photo in a 20-year-old version of GIMP, and then try to tell me how amazing everything was back then.


The overall point is that utopian ideals in computer science simply haven't worked out. Spoiler: all utopias fail.

Underlying that is something more interesting. Technologists keep trying to create a cooperative environment where data is freely shared—Unix, OOP, Semantic Web. However, it never sticks.

The reason is that the economic imperative drives competition, not cooperation. Ultimately this is a byproduct of capitalism, not computer science.

Software works as well as our society does, which, you know... it's complicated.


When I was a child, I really believed that there was a technological solution for everything.

I don't agree with that anymore.

At least now there's a book that describes the state of the world as I see it now:

https://www.penguinrandomhouse.com/books/575671/the-story-of...



