The Failed Commodification of Technical Work (mataroa.blog)
324 points by M2Ys4U on Nov 24, 2023 | 200 comments



I think - and it's only a think - that the author has ignored that a large part of what used to be called technical work is now commodified.

I remember when a mail merge literally meant printing out lots of address labels and then manually sticking them onto a letter & envelope. Word 2.0 (?) solved that problem for the 1990s and MailChimp has commodified it for the 21st century.

Double-entry book-keeping was technical work and was usually run by highly trained individuals. Nowadays every shop keeper just scans a barcode and has the customer tap-to-pay.

There's not yet a drag-and-drop-like interface for anything more complex than Scratch (whither Visual Basic!), but the hard part isn't the technical work of stringing together libraries; it's requirements gathering.

Speaking of which, it has never been easier to drop in a high-quality cryptographic library, or import an interactive map on a website, or WYSIWYG edit a website.

So, the author is right that you can't stick a bored 18 year old in front of an IDE and have them create you an ERP. But a lot of the "grunt work" of IT is now firmly a commodity.


What you say is true, but the amount of "grunt work" is not constant over the years. In fact, I think the amount of "grunt work" in the tech industry is just growing, not shrinking; I think the following loop is quite obvious:

- amount of current grunt work: X

- new tech Z appears that makes X be reduced to 0.1X

- at the same time Z enables new ways of doing things. Some things become grunt work because they are a byproduct of Z

- amount of current grunt work: Y (where Y ~= X)

- ...

If the technological progress had stopped in the 2000s, then all the grunt work (originated in the 90s) would be essentially zero today. New tech just brings automation and grunt work. I don't think we will live in a society where there's practically no grunt work.

The most recent example is AI: there are AI tools that generate sound, images, video and text... but if you want to create a differentiating product/experience, you need to do the grunt work of combining all the available tools (ChatGPT, Stable Diffusion, etc.)


>If the technological progress had stopped in the 2000s, then all the grunt work (originated in the 90s) would be essentially zero today.

If you wanted to have a simple database application in the 1990s, Delphi, VB6 or MS-Access were most of what you needed to get it done. The UI was drag and drop, the database was SQL, but you almost never touched it, mostly it was wiring up events with a few lines of code.

The work was commodified out of the way! Domain experts routinely built crude looking but functional programs that got the job done. It was an awesome time to be a programmer, you just had to refactor an already working system, fix a few glitches, and document everything properly, and everyone was happy.

Then everyone decided that all programs had to work on Steve Jobs' magic slab of glass in a web browser connected through janky Internet, and all that progress was lost. 8(


Are all of those proprietary products? I can't speak to your experience, but given that Linux was created in 1991, it seems like from another angle you're bemoaning the rise of OSS and the web.

I'm just a web developer that learned everything from online resources. So I think we are both biased on different ends of the spectrum.


Open source is great, Lazarus does a pretty good job of replacing Delphi.

Microsoft went insane with .NET so VB6 was killed in the process.

Access automatically handled table relationships, building queries and seeing them as SQL, and the report engine was pretty good. Thanks to ODBC, you could use the same database across all of them, or hook up to a real SQL server when it came time to scale up.

What's missing is the desktop and a stable GUI API these days. Windows apps from the 1990s still work, because they are distributed as binaries. Most source code from back then will not compile now because too many things have changed.

I love Open Source, but it doesn't solve everything.


> Microsoft went insane with .NET so VB6 was killed in the process.

I'd love to hear more about this perspective or any links to get more of it.

I did a (very) little hobby VB6 and loved it. Never made switch to .NET at that time (I was young, it was a hobby).

Having recently worked through part of a .NET book, I was pretty impressed by how far MS took it (although it seems extremely mind-numbing). Obviously it took a long time and had false starts, but MS stuck with it. On a personal level, I am very opposed to the entire model in an ideological sense, but it does seem to make a lot of business sense for MS, and it seems to cover a lot of cases for a lot of businesses.

So, was Microsoft's insanity with .NET just the obsession part, or doing things poorly for a while, until eventually getting it "righter", or is the insanity still pretty apparent?

I really would love to learn more about the historical-technical aspects of this specific comment quote, from VB6 to modern day, because it fits my experience perfectly, but I've had second thoughts about the position more recently. The more the specifics the better.


The insanity was to abandon the advantage they had with VB/COM, in order to challenge Java on its own ground. They threw away the baby with the bathwater. The C# pivot also slowed down their desktop efforts pretty dramatically, doubling the blow.

They were lucky Sun squandered the opportunity they had engineered with Java, focusing on the hardware side and missing the boat on browser, virtualization and services. If Sun had bought Netscape and then focused on building something like Azure, instead of fighting the inevitable commoditization of server hardware, they would have eaten Ballmer's lunch.


Disclaimer: I am not a .Net programmer, so these are just my thoughts and impressions as someone on the outside who followed the development from a distance.

I think a lot of the focus on .Net was driven by MS and Ballmer's fear of Java. At the time, almost all desktop computers were running Windows 9x/2k. If 3rd party applications were developed with cross-platform Java, the customers would no longer be locked in to Windows.

First they tried the famous embrace/extend/extinguish approach by creating a Windows-specific version of Java. Sun fought back, and MS decided to push .Net instead.

It seemed to me that the initial strategy was to claim .Net was cross platform, but focus more on Windows and let open source projects like Mono be their cross platform "alibi". They changed strategies after a while, and now I guess the cross platform is more real.


> Windows apps from the 1990s still work, because they are distributed as binaries.

Only if you have the right libraries, and runtimes, and OS interfaces, and even if you have all that, oh no, it's a MIPS binary and you don't live in 1996!

Any proprietary API exists precisely as long as the owner says it does. Open standards don't suffer from that malady.


>Only if you have the right libraries, and runtimes

That generally only happens with .NET based programs in Windows systems. You always need some .NET v2,3,3.5,4,4.5, etc., runtime.


Totally agree. There is no backward compatibility with .NET runtime - if your application is built/linked to a given version, it won't work with any other version of .NET


That's simply not true. Newest .NET 8 does not need the assemblies you reference to target .NET 8 - as long as the TFM is any version of 'netstandardx.x', 'netcoreappx.x' or 'net5'+ it will work.

You can even make proxy-projects that target netstandard2.0 but reference .NET Framework, and with certain compat shims the code will just run on .NET 8 unless it relies on some breaking changes (which have mostly to do with platform-specific behavior; there have been no breaking changes for the language itself since I think C# 1 or 2 - some 20-odd years ago).

As for the runtime itself - an application can restrict itself from being run by a newer version of the runtime, but otherwise it absolutely can be. The lightweight executable that just loads the runtime and executes the startup assembly may complain, but just try it - build a console app with a 'net5.0' target and then run it with the latest SDK via 'dotnet mynet5app.dll' - it will work.


I think the point is that the Access and Lotus Notes tooling was somewhat ubiquitous in largish corporations.

The experience of this tooling was: make a change and it was in production. It was incredibly simple and productive to work with, given the needs of the time.

There was also plenty of opportunities to make a mess, but I don't think that has really changed.

Learning was not difficult, you just had to be prepared to spend time and some money on books and courses.

It is not a tooling set you would want to go back to for a bunch of different reasons but it worked well for the time.


> It is not a tooling set you would want to go back to for a bunch of different reasons but it worked well for the time.

I remember using lotus domino at one of my first jobs. There were all sorts of things I hated about it. But you could have a database - like the company’s mail database. And define views on that database (eg looking at your inbox, or a single email). And the views would replicate to a copy of that database living on all of your users’ computers. And so would the data they needed access to. It was so great - like, instead of making a website, you just defined the view based on the data itself and the data replicated behind the scenes without you needing to write any code to make that happen. (At least that’s how I understood it. I was pretty junior at the time.)

Programming for the web feels terrible in comparison. Every feature needs manual changes to the database. And the backend APIs. And the browser code. And and and. It’s a bad joke.

Commodification has a problem: for awkward teenagers to make the same fries every day, we have to ossify the process of making fries. But making good software needs us to work at both the level of this specific feature and the level of wanting more velocity for the 10 other similar features we're implementing. Balancing those needs is hard! And most people seem content to give up on making the tooling better, and end up using whatever libraries to build web apps. And the tools we have are worse in oh so many ways compared to Lotus Domino decades ago.

I wonder what the original lotus notes designers think of web development. I think they’d hold it in very low regard.


Right!!

10/20/x years ago we didn't have DevOps, CloudOps, CloudFinOps, CloudSecOps, IaC experts, Cloud Architects, Cloud transformation experts, Observability architects, SREs, plus all the permutations of roles around "data" that didn't exist discretely, etc etc etc.


We did not have web scale products, which enabled new possibilities. E-mailing documents and collaborating offline sucked.


> I think the amount of "grunt work" in the tech industry is just growing and not shrinking...

Not sure, but isn't this just another way of saying that the tech industry keeps growing?


I'm not sure what the parent post meant exactly, but I do agree there is tons of grunt work -- I've seen big-name SV companies where large parts of their workflow include steps like "and then every hour you need to do something in a slow UI that can't be automated" to keep vital systems working. I would call that real grunt work, and there are even people in such companies whose only task is doing it. I've genuinely been told by clients I work with that they have entire double-digit-sized teams whose members' only responsibility is to reboot VMs that breach specific resource thresholds -- easily automated and even built into most hypervisors, but for whatever reason these tech giants opted for a human to do it. The only semi-reasonable explanation I got from one client was that their infrastructure team got outsourced and they laid off the only people who knew how to use the automation tooling. It's a dumb reason for sure, but at least I can understand why they opted for the manual grunt work.

Similarly, keep in mind a lot of this grunt work is just to satisfy some reporting requirement from somewhere -- some person(s) in the company want to see at least X% of uptime or Y LOC every day, so you get people writing a lot of yak-shaving code that basically does nothing except satisfy the metrics or ensure that the uptime % always looks good (i.e., they don't fix the cause of the downtime entirely, they just get the endpoint that is checked to determine uptime working well enough that it reports to the monitoring system, and they leave it at that).


If it's the amount of grunt work to solve the same problem, it just means the ecosystem keeps getting worse.

Which, IMO, is quite obvious.


We are inventing the problems of tomorrow by solving the problems of today, and people tend to be the constraint.

Managing complexity to where a fixed team can operate the software.


I guess the point may be that after 30-40 years of this, the low hanging fruit of commodification may be gone. Further, the more we commodify, the higher order our problems become and the specialist engineers you hire move further and further up the stack.

Also, not sure if it's always been the case.. but the latest vintage of SaaSified startups have a high % of products that don't actually do any of the things you want them to do yet. They want you to pay them for their service, so they can capture your use cases for implementation and then commodify them for other customers. So you end up with long lead times and IP leakage. Neat!

I think the example of templating SQL is always this misunderstood target for management. I dunno, the language in particular has survived an incredible length of time in our industry... it actually does a pretty good job. 99% of wrappers/DSLs/etc. put on top of it make it far worse and still require you to dip into SQL for anything remotely non-vanilla. Further, instead of hiring SQL experts (there are many) you need to train up SaaSified DSL SQL wrapper X experts (none exist).
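To make that concrete, here's a hedged sketch (table and column names are entirely made up): a per-customer running total is one short, standard SQL query, yet it's exactly the kind of "remotely non-vanilla" thing that pushes most DSL wrappers back into a raw-SQL escape hatch.

    -- Hypothetical "orders" table with customer_id, order_date, amount.
    -- A running total per customer via a standard window function.
    SELECT customer_id,
           order_date,
           SUM(amount) OVER (PARTITION BY customer_id
                             ORDER BY order_date) AS running_total
    FROM orders;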


This is a fair critique, and I'm giving it some thought now. I'll need to stew on it a little bit. Maybe the fundamental issue is that many of these products are designed to, as one of the other commenters noted, appear to purchasers that don't work in the field as simply appliances one purchases and then the problem is solved.

The issue is that a lot of the stuff out there doesn't actually solve the problem - it just appears to because other people buy it, and then lie about the implementation being successful to get promoted. Things like mail merge -are- like kettles, they're solved problems, and the only way to solve them is to try things.

The broader issue is that my employer purchased Workday because they believed it's like a kettle, but it can't actually fix the fact that our org structure is so horrendous that it can't be modeled.

(Incidentally, this year is the first year that I've realized that a sufficiently bad org structure, in a large company, is tech debt of a sort. You end up doing all sorts of crazy things just to work out who works for whom, what this user can see in this database, etc.)


Some aspects have indeed been commodified. But what about the bigger picture?

How simple is it to run a business or a website, organise a trip, or pay a bill nowadays, compared to 1994 or 2004?

At times, I can't help but feel that the previous generation had a more leisurely pace of life, which led to a more fulfilling lifestyle. Nowadays, time seems to pass at a rapid pace, with high levels of stress.

Allow me to share two experiences:

a) The other day, while at the bank, I witnessed at least three individuals over the age of 60 struggling to complete simple tasks, aimlessly wandering around and pleading with the staff for assistance. These tasks are supposed to be easily accessible through online banking, but due to certain exceptions, the system did not support their specific needs. As a result, they were forced to make appointments, with the earliest available slot being three to four months away. One of them needed to withdraw money from a blocked account to purchase wood and heat her home, but the bank's staff refused to budge, insisting that she wait three months to solve the problem.

b) Just two years ago, my father was in Sicily and could not find a way to make a simple phone call back home. Yet, in the 1970s, all he had to do was walk into the bar in the area with a few coins.

Not to mention that, while once upon a time your average person could fix the lights, the car, the heating, the non-automatic door, etc. by themselves, now they need to call the professionals.


I wonder if there's a term for Amdahl's law but applied to human processes. Like the other side of "law of diminishing returns".


Amdahl's Law applies cleanly to human processes. Perhaps the most revealing example is the origin of "computer" as a human occupation and how scaling the compute process happened at Los Alamos https://ahf.nuclearmuseum.org/ahf/history/human-computers-lo...

The more general aspect of Amdahl's law is captured by certain scaling laws and limits generally related to communication (see full bisection bandwidth) and certain architectures (e.g. Cray) meant to optimize for this.
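For reference, the formula itself, where p is the fraction of the process that can be parallelized (or handed to more workers) and s is the speedup of that fraction:

    speedup = 1 / ((1 - p) + p / s)

Even if 95% of the work parallelizes perfectly (p = 0.95), the overall speedup can never exceed 1 / 0.05 = 20x, no matter how many people or machines you add - which is the human-process version of the point above.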


It semantically works but it has not been adopted by people outside computing. I'd guess the definition isn't relatable or understandable enough for people coming from humanities/biz backgrounds, so it might well be possible that there's a parallel concept there.


"Adding manpower to a late software project makes it later." - Fred Brooks, 1975


To bounce off your point, there's the distinction between creative & non-creative work. Sure, tools might help the creativity, but they can't replace it, and the article discusses how they can make it worse. Accurate requirements gathering requires a spark of creativity...


Nah, WordPerfect and wordstar had mail merge way before word did.


I miss WordPerfect. Maybe it was the novelty but it had such enjoyable fonts.


Well I remember when it had no fonts :)


It's still there if you want to buy it. At least in name.


the people you're supposed to be eliciting requirements from are just regurgitating what ChatGPT told them are the requirements hehehe


This is becoming so true. I have read so many documents in the last year that are obviously from a GPT, especially when it’s about something new to a group.

But in the end, I would rather get a half-baked GPT doc than a quarter-baked junior analyst doc. I just worry that GPTs are going to kick the rungs out of the bottom of the knowledge-work ladder. Being bad but junior used to be a learning environment without too many repercussions.

But how do you compete with peers using AI? You use it also. But now you have robbed yourself of a learning opportunity. Yeah, you can learn something by doing it that way, but it's like doing homework by looking at the answers. Sure, it can help you double-check, but if you don't put the effort into constructing your own answer, then you have only cheated yourself.

I think the AI alignment issues are probably overblown in the short term, but what about the long term, when the average person has regressed so far as to be unable to live without AI? They will just do whatever they are told.


I agree that the full commodification of technical work is a bad idea and will, hopefully, continue to fail.

However, having read the Phoenix Project twice and hating most of Scrum, I disagree that’s what the Phoenix Project is advocating for.

My main takeaways from the PP are:

1. Have clear systems in place to carry out and manage your repeatable work, automate where possible

2. Minimise the time work is in progress for so people aren’t bogged down with a million tasks

3. Share information widely and have multiple members of the team able to carry out the same task

4. Make sure the work being done is what the business actually needs doing

5. Reduce noise and unplanned work so staff can get on with the higher value work they actually enjoy rather than wading through a quagmire of disorganised chaos

The point of PP isn’t to turn people into interchangeable automatons - it is to put a system in place to allow people the headspace and time to do the really valuable work that can’t be automated or systematised.

I’ve run a factory and been a dev so I see it from both sides and making devs production factory workers isn’t sensible but likewise where work looks like factory work (known work, repeatable steps etc) it should be treated in a similar way.


The entirety of the Phoenix Project is literally just copying Goldratt's The Goal, doing a s/manufacturing/IT/g, and updating the references to the modern day.

I'm not saying I don't like it - I've read the book half a dozen times and try to get every team I'm on to read it to help shift their thinking to be more systems-focused - but I'm not going to pretend that it's much deeper or more insightful than The Goal.


I feel I have to respond to this, as the Phoenix Project is probably the cringiest book I've read in my life, and I've read The Effective Executive...

I just don't understand why we have to veil common-sense practices (like continuous improvement, good communication, shared goals, etc.) in this vaguely culty, vaguely Japanese kind of DevOps propaganda.

My biggest problem with the book is the same problem I have with scrum and all its hellspawn variations: it preaches how a method is special and if you only follow this method, everything will be okay. Well, guess what, if your team is full of people who don't communicate well, no management method can bring them up to be geniuses or to be suddenly a star team. On the other hand, if you have a team/teams of good devs, then you don't have the problem that The Phoenix Project/DevOps/Scrum are pretending to solve.

If anything, what you get out of blindly following the scrum recipes and people who fetishize The Phoenix Project, is mediocrity. We need to have value delivered on 2 week intervals, we need to always pester clients for their opinion, we need Friday demos each week to show how much we centered this div, and how much value this new button gives...

If you think you can chop value into small little chunks week by week, blindly following the first thing that gives value, because long-term planning is waterfall, and waterfall is bad, then you are a dummy and deserve your scrum and card estimations and cringe standups. And you deserve it because you gobble up the bullcrap that those books and methodologies preach.

Card estimations with Fibonacci numbers?

Scrum masters?

Product owners?

Product managers? (that is somehow different from Product owners)

Sprints?

Standups?

Just take the retrospective, add some standups, kick all non-technical people out of the tech meetings, add a sync or two with other teams, and you are done. But please don't write a book about it, because I will absolutely hate on it.


> I just don't understand why we have to veil common sense practices (like continuous improvement, good communication, shared goals, etc) in this vaguely culty, vague Japanese kind of dev ops propaganda.

I enjoyed the book personally. I think the key point it was trying to get across is that, of all the things that look like "common sense" to people, the combination of these particular things is what is actually effective. It was never about blindly following some magic recipe, simply "here is a way of thinking about the overall task of project management that may be helpful, and here are some specific techniques that support it".

Note that "if the project is behind we should make the engineers work 12 hours a day instead of 8" is common sense to a very large percentage of managers.


I don't know, to me, that was not literature, but a guidebook with examples.

Here is Bill, he's tired and overworked. If only he can focus on the important tasks and clear the clutter...

Here is Security guy. He is grumpy and is in a war with the developers because they don't follow his ancient and unworkable security practices. If only he could update his security practices to something more modern and cool...

Here is Maxine. She is a PO. Her team is given task after task and not allowed to focus. If only Maxine could protect her team from outside influence...

Here is CEO guy. His company is failing and he is trigger-happy on ever-changing initiatives and transformations, and nothing comes to fruition. If only he could chart the course for his team, set performance metrics, and not change direction every 15 seconds...

Here is operations linux admin guy. He has a bunch of scripts that make the deploys when devs throw some new garbage over the fence to him. He is mad because the devs wrote yet another service in yet another language, making the ratio of devs to languages used 10 to 17. If only he and the devs could agree on a deployment standard or read about the wonders of k8s...

If this kind of preachy obvious rhetoric inspires somebody to take a deep hard look at themselves, recognize their flaws, and change, more power to them. However, I am simply allergic to patronizing narratives like this.

> note that "if the project is behind we should make the engineers work 12 hours a day instead of 8" is common sense to a very large percentage of managers.

Then out with managers like that. Most engineers can do their job without a pencil pusher standing over their shoulder trying to "manage" them. However, there are only a few managers who can actually do anything useful without underlings...


It was literally meant to be a guidebook with examples. I found that an entertaining way to present the material - if you were looking for literature or subtlety, I can see why you would have found it patronising, but personally I didn't feel talked down to when I read it.


If it was meant to be a guidebook with examples, why all the fuss about it?

You don't see people worshipping Cooking for Dummies, so why are we so cult-y about Scrum or The Phoenix Project? What's with the weird zen/kung fu kind of vibe of it, as if they have just discovered sliced fknin bread?

Sadly, I genuinely think that for some people the Phoenix Project is an eye opener. Their enthusiasm on just discovering how to be a professional in the role they have been half assing for decades bugs the living crap out of me.

To me, reading TPP felt like reading a patronizing self help book. I found it nauseating, shallow, bland, and anyone expressing even a tinge of enthusiasm about it feels like an affront to my sensibilities.


> Their enthusiasm on just discovering how to be a professional in the role they have been half assing for decades

That is literally an entire genre of fiction - amateurs who have no idea what they are doing get a wise old teacher and shape up into a killer team. I suspect a lot of the enthusiasm for the book comes from the popularity of "people level up and the magic happens" stories.


I actually totally agree with all of this, and these are the positive takeaways from the Phoenix Project. It was actually a valuable read despite my ribbing re: prose. There are many things people do wrong with known work/repeatable steps that can't be rightfully laid at the feet of Phoenix Project-type thinking.

But the one that can is this: I think it ignores that I rarely do known work with repeatable steps, because I'm programming - whenever such work does appear, it's because we've made tactical errors in stakeholder management and now I don't have time to automate it. But it has been a long time since my last reading, so it is possible that I've forgotten some sections that make substantial concessions in this area.

A thoughtful, sensible reading definitely leads to your last paragraph even if the writers don't explicitly call it out, but I simply know for a fact that most of the managers I meet don't understand the difference between producing widgets and designing systems.


Yeah, I learned of the Phoenix Project as the 'why' behind doing the orange DevOps Handbook 'how' when our company was hit with the DevOps wave. Continuous learning + automation + instrumentation are many of the tools that let you do 1-5 in your post.

Saying it is only to "work harder to get more work done faster" is not what I took away from TPP.


> hating most of Scrum

I'll just say that if you look at Scrum itself there is really nothing objectionable to it.

https://scrumguides.org/scrum-guide.html

It's the other shit people pack on top and call Scrum that usually sucks ass. I've found the best way to fight back against shitty Scrum is not to fight it, but actually feign puritanical allegiance to the actual doctrine; it's much less repulsive. It makes you look like less of a contrarian and it's easier to make an impact that way.


I don't want to make an impact in any organization where experienced engineers are treated like children.

So Timmy, what did you do today? Ah, cool, make sure to raise your hand if you make a boo boo, and involve your little buddy Mike, okay? Mhm, thanks.

Hey guys, let's theorize how difficult it would be to make a tree house? Would it be 1 candy? 2 candy? 11 candies? No, that's too much, let's agree on 5, parents are eagerly waiting for the tree house to be built. Okay? Thanks. Well, 2 hours passed, time to tuck you in bed.

Hey children, this is Jake, he is a bit slow in the head. He can't read yet but I have decided to make him the one deciding what is most important in your reading curriculum. I also have decided to talk only with Jake and check only with him what is your reading progress. If you haven't read all the books in the curriculum by (deadline made up by the first number which comes to Jake, and he can count max to 3), it's your fault, not Jake's cause let's be honest, he is a tool and he can't read. But if you all read your books, Jake gets cake.

I believe all those methodologies were invented because manager types are terrified of depending on people who are different from them and whom they don't understand. So they decided to embed one of their own business types (who also has 0 qualifications to judge whether the engineers are doing good work) to make sure the engineers are not playing ping pong all day.


Yep, stand-ups and retros literally feel like presenting my homework and saying what I've "learned" today most of the time. I've found myself as the IC that must speak and present on behalf of the group, and this is all too accurate.


Very true, after being told many times “that’s not what it says in the Scrum guide, we need to do it like …” by a Scrum Master and a PO I decided to read it. I was astounded to find that the guide says very little and they were just using it as a weapon to push their own controlling desires on to the team. All the devs read it and the next time they used that line we asked them what it actually does say, they just made stuff up, clearly they had never read it either.


Yep. In pretty much every team I was in, devs would always push for "their own version of Scrum" that differed from the PO, which always ended up being much closer to vanilla Scrum than whatever some crazy PO or Scrum Master wanted.

Also funny: when the PM is actually good, you barely have to discuss "the process". Almost any shit just fucking works. Who knew.


The naming of the time blocks as "Sprints" is objectionable. Let me just _sprint_ 20 times back to back in a year, year on year for my whole career!


Hard agree, the naming of much of Agile/Scrum is terrible from the dev perspective.


The Scrum guide does not capture the culture and ecosystem of scrum that has developed around it. It’s like Node without NPM.


That’s my point.

Keep the guide, ditch the ecosystem.

You don’t have to use Rails, Boost, Spring, or Scrum-XP-Rational-Waterfall. Sometimes back to the basics of the tool is needed, but don’t throw out the baby with the bath water.


Let's throw the guide as well, cause it's the people reading the guide who made the ecosystem.

If one is faced with a guide that essentially tells them: communicate well, don't be a douche, improve constantly, and do hard and smart work, and they are shocked by the guide's revelations, maybe we went wrong somewhere along the way.


I strongly doubt it’s the people who read the guide. It’s the people who learn about it from word of mouth that produce Jira-driven behemoths. Really, try the guide. It’s minimal.


You mean dividing features into fixed-size timeboxes without even bothering to fully define them isn't objectionable?


I was really confused by the title, because doesn't "commodify" mean "to make saleable", like commercialize? Hasn't tech made billions?

I think the author is talking about "commoditization", eg genericizing tech work so that any replaceable employee can do it.

From https://en.wikipedia.org/wiki/Commoditization:

> This is not to be confused with commodification, which is the concept of objects or services being assigned an exchange value which they did not previously possess by their being produced and presented for sale, as opposed to personal use. One way to summarize the difference is that commoditization is about proprietary things becoming generic, whereas commodification is about nonsaleable things becoming saleable. In social sciences, particularly anthropology, the term is used interchangeably with commodification to describe the process of making commodities out of anything that was not available for trade previously.

Am I being pedantic? I thought the two had different meanings?

Edit: Ah, but wait... from Wiktionary instead: https://en.wiktionary.org/wiki/commodification

> Sometimes used interchangeably with commodification

Guess it's just a common mixup.


> doesn't "commodify" mean "to make saleable", like commercialize?

As far as I understand, no, it doesn't.

https://www.merriam-webster.com/dictionary/commodify

> to turn (something, such as an intrinsic value or a work of art) into a commodity

Thus, "commodify" is about "commoditization".



I've always heard the term used in the sense that the author is using it. Oil is a commodity because there are many producers of it and it doesn't matter which one you get it from since they're all making the same thing.


Which term? Commoditize?

Aren't we in agreement with the author then, that tech work isn't quite the same as oil or burgers (i.e. it hasn't been commoditized, even though it's been commodified)?

Or are you saying the two words mean the same thing?


The work many were doing 30 years ago - making shitty PHP websites - has been completely commoditized to the point where, as devs, many of us don't touch it. Why would we, now that business people have enough literacy to click something which looks pretty together with Wix or that other website-as-a-service which funds 95% of podcasts :-)

Or hire a designer / commodity PHP-shop to make them a Wordpress.


So you're saying commodified creativity commoditized art, leading to commodified podcasts commoditizing web design? Got it.


Perspective of someone who isn't really in this space: I've always seen them as the same thing, except commodification is talking about the idea in general while commoditization is talking about a specific product.


In the McDonalds analogy, developers are not the teenagers working at the machines, we're the engineers that designed the machines. In the McDonalds analogy, the computer is the teenager.

Programming isn't work, it's meta work, you come up with a list of instructions once, and then the computer does the work 24/7 indefinitely. Meanwhile you go on to write another set of instructions for something else. If you ever write the same set of instructions more than once, you're basically doing it wrong. So it's hard to know how long things will take, because you're always doing something new. You never do something more than once.


You hit on the fundamental assumption of calling IT a factory, and why that’s invalid.

IT is not a factory, it builds factories.

I believe a lot of developer/management conflict stems from a lack of terms for what we call manager. A McDonald’s manager supervises employees to ensure they’re following a process that produces a product. Failure to adhere to the process is obvious, the process is assumed to be valid as a given, and failure to produce the outcome is obvious.

Whereas a programmer is employed to develop processes that a machine will follow. A manager over this employee may have a process for the programmer to follow, it may be obvious whether this programmer is following the process, but it is uncertain if the process is valid and will connect specific actions to prescribed outcomes and the outcomes themselves may be non-obvious.

But we call both people in these roles “managers”, despite them being very different.

Programmer, in this machines analogy, is a different type of manager. One that presides over machines, and not people. But these machines are turned loose on the world, left to their own devices and not watched over by the programmer.


I think this is great insight. It reflects why I often feel like I have to be the product and program manager as well. Even though I'm not, so often the ideas about what could be improved and how to do things better can only come from the programmers, because the people whose title is manager are too removed from the actual management of the machines - the ones executing the process that yields the actual outcomes.


> Programming isn't work, it's meta work

This is either profoundly insightful or absolute bollocks, but I don't think I'll be able to tell which for the next 5 years.


Programming is 5% painstaking work to break down a problem into a hyper-detailed description that leaves no space for ambiguity (that's the "meta work"), and 95% dealing with self-inflicted bullshit like build systems, package managers, platforms, service architectures, devops, devsecops, secdevops, and all the procedural nonsense necessary to give the people paying us a modicum of control over delivery.


Wow. This may explain why I don’t really like programming anymore. After doing it for years the solutions often seem immediately obvious. It’s the modern programming process (the “self-inflicted bullshit”) that sucks.


It's a series of simple tasks interrupted by two week diversions.


Author here - nothing really to say other than that I think this is bang on the money, and I've used the word meta-work to describe it too (when done correctly).

However, this doesn't mean this notion isn't possibly absolute bollocks, as another commenter suggested - just that someone else will join me in embarrassment if I realize this is silly in a few years... which is better than being embarrassed alone!


About 10 years ago a few friends of mine (mechanical engineers) were surprised that I was studying software development. They said something along the lines of “is there much left to do? we can just use existing systems to do everything we need right?”

The misconception is that building systems to tackle new problems is easy and thus has been "commodified", meaning nobody needs to write code anymore.

The reality is that building software is rarely as easy as configuring a UI. You end up needing text which represents logical rules and flows, you need version control to see how the system changes, rollbacks... which means you need programmers.

Coding doesn’t disappear, it just moves up the levels of abstractions


I think he’s right. The tech industry has been trying to commodify devs for a long time (COBOL, Java).

But there’s a sort of essential quality that reasserts itself no matter what you abstract. Despite the seemingly simple requirements paired with high level frameworks, a lot of our software still doesn’t even work well.

As the author notes. The only real fix is talented devs that care.

You can make a career out of that.


The problem is that beyond the boilerplate we aren't solving commodity problems most of the time.

If you've worked in the same field long enough, you'll see that things certainly rhyme, but everyone has slightly different business requirements. At each level of the stack this grows, and so in total there's a ton of non-commodity, bespoke work to do.

That's why no two products are exactly alike and we don't just have 1 giant all world megacorp producing everything.


> The problem is that beyond the boilerplate we aren't solving commodity problems most of the time.

The only people who see it as a problem are the ones who wish programmers were commoditized. Your complexity is my salary.


That complexity is how companies typically differentiate their product, generally purposely to keep it from interoperability with other software so their entire business is not commodified.

Just look at the IoT space to see this in play.


> Your complexity is my salary.

Then your salary is paid out of the broken-window fallacy.

But that's fine. All the nice, humane things happen where the market optimized things a little bit, but not too much.


Unless you're SAP, you always need to customise whatever you build for every customer.

You can have reusable components, but a full one size fits all solution is really hard.


Is there a business that has implemented SAP without customization? My experience has been the opposite and that vanilla SAP is usually stupid and wrong.


The #1 rule of taking SAP into use in your organisation (according to multiple actual SAP Consultants I've chatted with) is: your organisation must switch its processes to fit the SAP model, not the other way around.

You _can_ customise SAP to fit your way, but it'll be an uphill battle all the way. Every update will also need to be customised, every new feature has to be modified and eventually it'll fail and you'll be out tens or hundreds of millions.

You can look up any of the big SAP failures and that's the reason for every one of them.


And that software purchasers demand changes. I would imagine we are not far from an AI being able to take over pure Waterfall development methodology.

But successful businesses aren't static, nor are their software needs. Once the AI decides on a data layer and you ask it to iterate on the code that interacts with it to meet new use cases and add features, god help you.


The hardest part is not implementing the specifications as described (maybe AI will do this soon, doubt).

The hardest part is taking various competing requests in plain English (often in their second language) from non-technical people, having an interactive conversation to tease out actual needs, and converting that into a specification (if even only in your head) to then implement.

When we have an AI that can understand French Quants, let me know.


Yep! What the business really needs is something that will do whatever necessary to ensure stuff actually lands. The skills are important for sure, but the responsibility aspect can’t be offloaded to machines well.

And even a lot of our boilerplate removal mechanisms (eg frameworks) are not very good either. They’re just slightly less worse than before.


Yes, or better to see these products as time to market reducers.

There's plenty of times I have the option of writing my own code or using some good-enough framework that gets me started, and then add my own customizations over time.


Devs even try to commodify devs!

I cannot count the number of times that developers have gotten outright giddy when an opportunity for self-commodification comes up, usually under the guise of self-Taylorization.

> I would adore it if the doctors and nurses in my life didn't constantly lament the stream of indignities that their single-neuron administrators heap upon them with each new proprietary system.

It's funny - the medical industry is precisely the counter-example I use when the auto-commodification discussion arises. There are two emotional appeals it makes. Patients hate it when they have to bounce from doctor to doctor. They understand that commodification comes with some degree of increased specialization[1] and that each hand-off represents an additional failure point[2].

The second appeal is simply one of status and being able to fashion one's work style in a more high-status form. I lead with the first, and let the second read as subtext and it usually works. Workers tend to maintain more generalization and rely on each other for consults rather than handoffs.

1. assembly lines are sequenced specializations

2. See handoff errors w.r.t. medical resident work hours


Author here. I have some famous-ish (within medical circles) family members, and it's interesting. On one hand, they have a lot of leeway and status, and at least the hotshots in my family are clearly treated as the medical equivalent of professional athletes, with a real craftsperson's mindset.

On the other hand, I was actually talking about the EMR systems, etc., which are repeatedly purchased at exorbitant prices from vendors that churn out nearly useless products and insist the staff spend ages doing data entry. I have some pretty horrendous stories in this space, which is why it's my go-to example in most of my posts.


> I cannot count the number of times that developers have gotten outright giddy when an opportunity for self-commodification comes up usually under the guise of self-taylorization.

I dunno. I'm a sysadmin, and my job is arguably to replace myself with a small shell script. Somehow I've been at this twenty-odd years and there's still more work than people.


There's a lot of ways to take this, but just to clarify, you mean automating your work, not your self, right?


> You can pay people to churn out bad self-help all day, but none of those are going to be worth a damn without allowing the human element to flourish. But you still need a factory which can print the things, and as clinical as a mass bookbinding operations sounds, I really believe that you're only going to get a beautiful binding when that factory is run by people who have the connection to the work necessary to exercise taste.

Operation, innovation and maintenance. Pick any 3.

Better if done by the same team or teams that are very close to each other.

Ideally, all done to varying degrees by each person on the team because that’s where inspiration for improvement comes from.

We can split the 3 functions up into 3 groups, with 3 sets of management hierarchy. Often the result is mediocre and the people are unhappy (especially those stuck on the maintenance team - thankless but crucial work, that).

We can try to make teams responsible for all 3 functions, but often hire managers who overfit on one of them (coincidentally the one that leads to better rewards next quarter). It’s hard to champion operational excellence, diligent maintenance and hammock time in a single culture.

No wonder it’s hard.


Consider the Business Intelligence/Analyst roles. The industry is trying to replace people who write SQL with people who can use Tableau or similar. "Just connect to whatever datastore you have and non technical people can drag and drop."

It's got some problems:

1. They forget you need to hire many more (lower paid) people, because your output now scales linearly. Human hands have to turn the crank because it's all UI-based work.

2. You still end up with very complex and disorganized business-logic transform code, and now it's buried in the UI. The PMs or business teams are the only ones who know what that logic is. The engineering org delivers high-quality, tested datasets that are pretty raw for the purpose of answering business-team questions.

The hard part was always solving the weird way business outputs are obtained from raw upstream data. Now that solution gets stored in a Tableau workbook and can't be used as input for something else. It has to be copy-pasted from the UI into a new Tableau workbook. Well, now we bought the Tableau cloud service and our BI team can build and maintain SQL extracts in a more rigorous way. Tableau now looks like it's trying to take a chunk of Databricks' business here, but now it's a non-engineering team doing it.

Not sure it's going to work out.


Respectfully disagree. Getting meaningful answers from raw data is hard, but the hardest part was always getting business people to _ask the right question_.

Take as an example a typical business question: "which countries are our users from?"

But do they mean the country the user declared in the registration form? Or the country they're currently accessing your service from? Or the one from their payment method? Or the one where they were born? Or the one they have citizenship from? Or the one they're currently residing in? Or their shipping address? Or their billing address? Or..?

If your dataset is sufficiently big, every one of those countries will output a different answer. I'm sure that in the "global fintech" space those weird cases become the norm.

Your typical lower-wage Tableau user will just look for whatever country codes show up and run a count, then declare it the truth.

A slightly smarter Tableau user will bribe an engineer to write SQL for them.
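To illustrate with a hedged sketch (schema and column names are hypothetical), the "same" question yields several different counts depending on which country field you pick:

    -- Hypothetical tables: users(id, signup_country),
    -- sessions(user_id, geoip_country), payment_methods(user_id, billing_country).
    SELECT 'signup' AS source, signup_country AS country, COUNT(*) AS user_count
    FROM users
    GROUP BY signup_country
    UNION ALL
    SELECT 'geoip', geoip_country, COUNT(DISTINCT user_id)
    FROM sessions
    GROUP BY geoip_country
    UNION ALL
    SELECT 'billing', billing_country, COUNT(DISTINCT user_id)
    FROM payment_methods
    GROUP BY billing_country;

Same business question, three different answers - and none of them is wrong until someone forces the question to be asked properly.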

It'll take someone with knowledge of the dataset and probably the systems where the data is sourced from to push back and force the business to ask the proper question, and provide proper context.

Tableau and the like are good at replacing "technical work" which is a guy copy/pasting the same SQL query into pgAdmin and emailing the resulting CSV on a daily basis, and then some, but they're not making "less skilled" UI-oriented workers think better.


> ask the right question

Exactly right.

As a UI developer, I guess I had always assumed having close contact with end users and domain experts. Nothing I could articulate. It's just how things were done.

Then I served in a QA Manager role for a while. Naive me started out focusing on the QC & Test parts.

Eventually I figured out most of the value add comes from the Quality Assurance parts. Formalizing some of the stuff I used to do intuitively, like requirements gathering, sure.

But I'd say (without proof) most value (impact) came from a) asking the right questions and b) verifying the team solved the problem they had set out to solve. In other words, formalizing the team's internal feedback loops.

Alas. That was late 1990s. The Agile Manifesto cult swept aside all that silly formalism. "Too heavy!"

We now have "business analysts" backfilling QC/Test, without any training or guidance. And I haven't seen any QA-style requirements gathering, analysis, and verification in probably 20 years.

As if we can use A/B tests to achieve quality.


> The Agile Manifesto cult swept aside all that silly formalism. "Too heavy!"

The Agile Manifesto puts close contact with the end users and domain experts as a fundamental principle (actually two principles, out of four). I do think you have the wrong culprit on your mind.


Agreed. When devs, QA, and other doers have a direct line to customers, there's no problem. In my personal experience.

That arrangement has been rare. More common is gatekeeping and incompetence. (Which may be the same thing.)

--

I've never figured out how to do "agile" QA/QC/Test. I don't even know what it'd look like. And, yes, my prior experiences and expectations may be keeping me from seeing the new paradigm. Which is why I keep asking.

The best candidate I've read about is "Test Into Prod". But I have not yet done that strategy in real life. Soon (fingers crossed).

Oh, and "bug bashes", are pretty great. Where everyone examines logs together and either explains or eliminates exceptions. That needs to be the norm.


Well, with Scrum or whatever usually passes as agile, I have no idea either. And I imagine people can't really answer your question, because almost nobody practices the stuff in the manifesto. The motto would certainly be to bring the customer around to specify your tests, but the details of the actual procedure are a bit hard to imagine.

Anyway, my comment was just to point out that it's not exactly the manifesto stopping you.


For business "leaders", #1 is a feature, not a bug. Linear scaling of output to costs is something that can be modeled in spreadsheets and passed on and billed to a customer. Everyone in a business hierarchy feels comfortable with those types of dynamics. Whereas having a dedicated specialist who can produce 1000x output with little effort, but is not entirely sure if that will take a day or a week, makes everyone uncertain and puts leaders on edge. This is a hard sell if everyone wants comfortable mediocrity.

#2 ironically means more specialists, different specialists though, but more of them to maintain all the logic.

At the end of the day it will work out as being a jobs program and will end up being more expensive, but also distribute proportionate value to the politically influential fiefdoms, and that will make it successful.


Sounds like a good business killer. First you fire all the back end folks cause they don't know nothing about the business and tableau is the new backend (nevermind that the backend folks have been absorbing and helping refine business requirements). Then when the business folks get tangled up in the web of shit they call in the tableau consultants who suck em dry and produce nothing of value.


I am coming from structural engineering and see software more on par with city planning rather than other engineering disciplines. It is just too vast. There is no surprise that after years we start to get many more diverse positions in the field that describe specific types of work, e.g. backend/frontend/firmware/ml engineers, data scientists, security analysts, kernel devs etc. I think it will get to a point where you will need very specific certification to be allowed to work in one of these spots as standards move forward. Again, it might change to something else but the crystallisation effects are visible nonetheless.

Thinking about software as a factory is also possible and might be useful. But not everything fits into that analogy, for instance, when you think about integrations in the product or when it is a service rather than an app. Factory implies something is being made, but in many cases software is not the end result; it is an enabler in itself.


In software, the vast majority of work has asymptotically been automated. So we forget that it was ever work.

Consider the humble file copy. Or “automated scribe” if you will.

Copying is automated to the point where the enormous amount of copying our systems do has become invisible to us - and also lost as a point of economic differentiation.

And it’s meta useful! All our copying programs are themselves easily copied - with the same algorithms!

The point is that software writing, like mathematics, will always spend its time futzing around the edge of the known and unknown, because every area that gets fully known/characterized will get automated in a way that both solves that problem for everyone and scorches the economic earth of that activity.

Software work will always include an element of quest (research) into the unknown, in search of riches (economic value), in some aspect, whether that new area is glorious or tragically mundane.

Whatever area of software work a manager without software expertise can do themselves automagically, is like a fruit tree tamed so that it is easy to pluck fruit from, because it no longer has fruit.

The manager would no longer be doing anything differentiated or valuable if their problem statement wasn’t upgraded back into difficult to automate territory requiring pesky creativity and expertise.


I once worked for a company that ran on the Phoenix Project ideas, treating devs as mechanical cogs in a factory; the last time I heard about them, they were shutting down their main subsidiary and laying off most of the staff (already cherry-picked from all around the world to minimize costs, via automated tests more difficult than interviews at Google), while the leadershi* that led to that kept their positions intact.


I think every engineering manager has either worked for or interviewed with a company that believes this stuff.

Software dev is still at the craftsman[0] level. It might move out of that, eventually. But not yet, and probably not in the next 20 years or so. We haven't solved some intrinsic problems around defining a problem completely, precisely and succinctly without having to write code[1]. And getting five engineers to write a single piece of software is exactly as complex as it was when Fred Brooks wrote about it; I think the only improvement we've had since then is Git.

[0] craftsperson? that doesn't feel like the right gender-neutral expression. I guess "artisanal" but that looks rude. Suggestions?

[1] The "I got ChatGPT to write this application without writing a single line of code" phenomenon is interesting, but it seems like an alternate skill path - you can write code, or you can write prompts. The complexity is the same, and the amount of effort is within an order of magnitude. I'm not sure, though - I haven't managed to get ChatGPT to solve a single technical problem successfully yet.


Chat always writes me something that includes

           perfectly_suited_package_that_doesnt_exist(my_inputs)


Ironically, that's one of the common practices in programming - writing calls to nonexistent functions that you implement later.
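
For instance, a minimal sketch of that style (sometimes called programming by wishful thinking) - the function and field names below are invented for illustration:

    # Call the helper you wish existed, then implement it once the
    # surrounding code has settled and the requirements are obvious.
    def monthly_report(orders):
        grouped = group_orders_by_month(orders)   # doesn't exist yet when first written
        return {month: sum(o["total"] for o in batch) for month, batch in grouped.items()}

    # ...filled in later:
    def group_orders_by_month(orders):
        grouped = {}
        for order in orders:
            grouped.setdefault(order["date"][:7], []).append(order)  # key on "YYYY-MM"
        return grouped

    print(monthly_report([{"date": "2023-11-24", "total": 10.0}]))  # {'2023-11': 10.0}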


It's great isn't it.

Npm packages that don't exist and yet always have the same parameters as your data.


Eventually it will create and publish it for you, they say... not only the package definition, but the implementation too.

I don't know exactly why, but I doubt it will.


Getting a bit philosophical, I believe that the problem of 'defining a problem completely' is where the practical application of 'how then should we live?' gets self-referential. Startups spend inordinate amounts of time and money trying to figure out their customers' problems, drilling down, pivoting, etc. It's the challenge of Silicon Valley.

And the more one increases the search space the more primary that question of defining the problem becomes. Whether the execution is handled by chatGPT or a human coder, it looks to me like defining the problem we want to solve is the majority of our current frontier.


I think that's two different, but related problems.

"what should we build?" is a different problem from "what do you think we should be building?" - the first is a customer discovery problem (what do our customers think they need?), the second is a communication problem (what does our manager want us to build?).

The related bit is that humans are bad at communicating and we need to spend a lot of effort deciphering ambiguous human waffle into precise program code.


craftsperson or artisan both work, but no one cares if you use craftsman


“craftsman” is fine, don’t worry about it


Doing some project lit. review a few years back, I came upon the area of component-based software engineering (CBSE), where the idea was to mirror the same kind of manufacturing approach as in electronics. You'd write a software component to do one thing, and have well defined inputs and outputs, and you could then "simply" compose complex systems by chaining lots of these software components together.

Nice idea, but maybe software engineering wasn't/isn't as formulaic as the more well-established engineering disciplines.
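
To make the CBSE idea concrete, here's a toy sketch (all names invented for illustration): each component does one thing with a declared input/output type, and a "system" is just a chain of them.

    from typing import Callable, List

    # A component: well-defined input and output, one responsibility.
    Component = Callable[[List[float]], List[float]]

    def drop_outliers(xs: List[float]) -> List[float]:
        return [x for x in xs if abs(x) < 1000]

    def normalize(xs: List[float]) -> List[float]:
        peak = max(xs) or 1.0          # assumes non-empty input
        return [x / peak for x in xs]

    # "Manufacturing" a system = wiring components together.
    def compose(*components: Component) -> Component:
        def pipeline(xs: List[float]) -> List[float]:
            for component in components:
                xs = component(xs)
            return xs
        return pipeline

    clean = compose(drop_outliers, normalize)
    print(clean([3.0, 4.0, 5000.0]))   # [0.75, 1.0]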


Congratulations, you've reinvented the concept of a library.


Funnily enough, that's where that particular write-up ended up.


1) Developers are not fungible. 2) The things being created are not the same, otherwise they would just be reused. 3) To understand how to build on the system you must learn the domain it is built on, which is also not fungible knowledge.

In software you are fundamentally building items, not reproducing them. A factory reproduces the same items.

There is a crisis in industry where legal contracts and business forecasts do not align with what is reasonable and predictable for software development.


This post resonates with me as a radiologist with 40 years experience, and a son who founded and runs his own company centered around machine learning, and now, LLMs. I frequently hear about how "AI" is going to replace radiologists any day now, but I do not believe it, for some of the same reasons described by the author, though in a different context.

I recently wrote a post "Will Artificial Intelligence Replace Radiologists": https://anordinarydoctor.substack.com/p/will-artificial-inte... that explains why I think the answer is "Yes, when it replaces engineers and everybody else".


Reading the comments I haven't found any evidence for anything.

And that is the trouble with software in a nutshell.

The assumptions are too many and the subjective opinions too deep.

I rarely submit to assumptions any longer without the other person submitting at least some evidence for their claims.

Experience is important but gathering information on everyone's experience is more important to form some sort of experience based evidence.

I know everything everyone writes is well meant, but damn, I wish someone would write a book and make a course on how professionalism should be based on evidence gathered from our field's experience.

Managers capable of committing to serving that goal are just what Fred Brooks said they were - they rarely exist.


I find that some evidence is worse than useless. There’s always some evidence for anything and now you’ve got people with their deep subjective opinions held unshakably because they are SCIENCE.


What the author failed to mention is that the software industry itself is also responsible for this state of affairs. I am talking about the consultants and the salesmen of enterprise software. In many cases, they convince CEOs that they simply need to sign a contract and all their problems will be solved. They don't care to explain the high rate of failure of software implementations or how much manpower is needed on an ongoing basis to maintain the shitty software.

CEOs think they are buying an appliance. e.g., I buy a coffee maker, I pour water into it, I put in the coffee pod, I press the button, and coffee comes out. In the best scenario (from a CEO perspective), they are getting an assembly line someone else put together for them. e.g., the CEO hires a general contractor who lines up all the subcontractors, installs the different pieces of equipment including the "glue" that connects them, and will maintain the equipment going forward. The CEO simply needs to pay for the capital expense and provide the operators for the assembly line. In the typical scenario (bad from a CEO perspective and bad for the software industry), what the CEO gets is a stack of re-purposed software modules that have been pressed together in a slipshod fashion that works when it wants to and fails without a trace as no logging exists. The CEO has to hire IT developers just to get the thing to run. And he will now look for his next job, because the board of directors told him: you failed.


Thank you for taking the time to write this. "CEOs think they are buying an appliance" sums it all up perfectly. I hold them accountable because they're paid so much that they should know better, but you're right that a lot of people try very hard to mislead them, and sometimes they're even incentivized to be misled. If you build in-house to suit your data models and workflow, there is a chance you won't ever get the thing deployed, then you look bad.

You can always sign up for Salesforce, and it might be bad, but it's running so you can hide the details of the badness from the board. This is why when I wrote another post on saving 500K for my company, the two-slide presentation I was asked to write went through a lead, a director, and a lower-level C-suite member before being allowed to reach the top of the pyramid.


I think the problem is that the outcomes of technical work are not equivalent to a product with a more static value. Simply put, software is not the same as a burger.

Software development to me is a financial investment strategy: some risky investments, some safe bonds, some good debt taken on to invest, some bad credit-card debt that will be cleared later.

To me, these are the same as a risky overhaul of a critical system, adding some automated tests, skipping some type annotations, or launching what was supposed to be a demo/MVP in order to be first to market.

A piece of software that took months to build could be worthless tomorrow if a competitor comes out with a better solution. There's a time value and a strategy to it.

Speeding up development with CD lowers the investment cost & time to market of an individual strategy. Automated tests lower risk. Feature flags & A/B testing diversify the portfolio.

One line of code can be the most valuable piece of a product, like Quake III's fast inverse square root, and millions of lines of code could be worthless.
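
(For the curious, that famous line is the bit trick from the released Quake III source; below is a rough Python transliteration of the idea, not the original C code.)

    import struct

    def fast_inv_sqrt(x: float) -> float:
        """Rough Python transliteration of the Quake III bit trick for 1/sqrt(x)."""
        i = struct.unpack('<I', struct.pack('<f', x))[0]   # reinterpret float bits as int
        i = 0x5F3759DF - (i >> 1)                          # the famous magic constant
        y = struct.unpack('<f', struct.pack('<I', i))[0]   # back to float
        return y * (1.5 - 0.5 * x * y * y)                 # one Newton-Raphson step

    print(fast_inv_sqrt(4.0))   # ~0.499, exact answer is 0.5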

The value of a software engineer's work is extremely contextual. Two automations can be structurally the same, developed by the same person, written in the same language, take the same amount of time and resources, and still be COMPLETELY different in value.

Until IT/software strategies are treated like an investment strategy, companies & leadership will continue to flounder when managing technical teams.


I agree whole-heartedly!

I am sometimes called upon to audit codebases for acquisitions. I've spent enough time in the strategic accounting world to realize that we have almost no concrete units of measurement to talk about software. Lines of code is as meaningful as the count of boards in a house. In a house, we can at least get total square feet, number of legally defined bedrooms and bathrooms, lot size, etc. In software, we've got almost nothing.

I've seen tiny codebases that can CRUD hundreds of different records, and I've seen multi-million line codebases that can barely handle saving a dozen different records. Consider all the ways data flows in and out of a system: asynchronous processes, batch processing, auditing, third party integrations, permissioning, reporting, and legacy system integrations. Each of these change the value and technical debt of a system.

I suspect we will have an increasing need to perform an accurate audit of a codebase. And provide the value, depreciation, and debt of a system in dollars. I would love to see a standardized metric for accounting for assets and liabilities.


A recent post was talking about software as an expression of an idea. If you take something simple like a text editor, whose purpose is to edit text, you can go and express this simple idea however you like. The final result will be different according to the person/people who shaped it. And that is why you have things ranging from nano, vim, ... to Sublime and VS Code. To go from an idea to a product, there are just too many variables in play, and most of them depend on people.

> I would love to see a standardized metric for accounting for assets and liabilities.

I'd say that is easy. Just compare what the business wants/needs to what is inside the codebase. The diff percentage is a nice basis for the above. But that would require a very detailed study of both the company processes and the codebase itself.

I think that's why most of us feel the need to build our own tools. Because the ideas represented by the software we are using are not what we would have come up with. And thus, we're building extensions and plugins.


The most successful technical organizations (US Navy, HP, Xerox, Bell Labs) let people own their own solutions and work. You give someone a problem with context on why that problem needs to be solved, and then you let them own it with near-complete autonomy. Not everyone can work in this environment, which is why technologists will not be commodified in the near future. Until I can tell an AI that it needs to reduce my AWS spend by 20% or add an extra 9 to my service's reliability, there are going to be technologists involved.

https://govleaders.org/rickover.htm


I don't find this argument very compelling. Technical work has been commodified to a great extent, even in bleeding edge applications. Does anybody have any doubt that OpenAI has timetables with clear action items that need to happen so they can release their next model?

The counter-example of software to abstract SQL queries is weird. This is exactly what we have been doing with other levels of abstraction, happily so. Why write Python instead of just using a compiled language? Because it's easier, and it allows you to hire different types of people and focus on higher-order problems. Maybe that's offensive if you are a world-class programmer, like the author seems to think he is.


Yes, yes. In the case of OpenAI: not only clear action items, but all the research and the discoveries needed are planned at least a few years ahead. They know exactly when AGI is going to be here!!!


> My man is out here advocating of a workflow that consists of feeding your subconscious mind research for four hours, then meditating on it for another two, then sleeping and praying that the Gods of Design simply bless you with an answer in the morning

Delightful.


Judging both by the office->cubicle->open plan progression, and by https://www.workatastartup.com/jobs/62929 there's been some degree of commodification. The low end of that salary range is less (inflation-adjusted) than offers (that had options on top) which I was getting last century just out of school, and the high end is less than I was making as a "software engineer" with two years' experience.


I can't remember if I saw this in Slack or Peopleware (either way, very evidence-light books), but they made the great point that there's basically no good data on open plans being better than offices. It's just that you can measure the cost of floor space, and open plans make managers' lives easier. Programmers benefit from thinking, managers benefit from being able to ask people for updates on a whim and from lowering costs in the short term, and guess who decides on the seating arrangements?

I have some other hot takes on how, despite Peopleware being (I believe) a Bill Gates favourite and my executives drooling at the thought of being as wealthy as him, they don't actually read it or do any of the things it recommends when doing so doesn't result in an immediate superficial victory.


> and that giving someone a salary is enough reason for them to subjugate the entirety of themselves as they turn up to work every day...

> Well, you're wrong, and you can fucking bite me.

Hear, hear!


There's an entire spectrum of options here from complete outsourcing (banks and gov) all the way to writing everything yourself (Google).

I've lately found a middle ground that seems to work well - licensing source code or libraries instead of entire closed web solutions. It enables a small team of good engineers to be really productive and ship something to production rather quickly.

You pay for maintenance and bugfixes for what you license and can still retain control of your data and interfaces.


I read something great to this effect recently, but I can't remember where. The gist of it is that adding people increases overhead, so the absolute smartest play is to keep the tiniest team physically possible and give them an absurd amount of leverage - which frequently takes the form of licensing the things that you really -can- commodify well.

Fundamentals of Data Engineering (https://www.oreilly.com/library/view/fundamentals-of-data/97...) is adamant about this. If you're a new data engineering team, just buy things that download the data you need, because fetching data from APIs is (usually) pretty simple to do with automated tooling. Then your team can focus on the part that we haven't nailed, like making sane models.


The key decisions of a manager entail when and where to collapse complexity into simplicity (and inversely expand a simple system into something more complex). So you can pick out a foolish manager by their unwillingness to work with complexity in areas core to their business/craft/discipline, and you can find bad ass collaborators by examining what sorts of complexity they're enthusiastic about.

(Note: I'm using the term "manager" broadly. It could be an executive, a team lead, an entrepreneur... basically any contributor making choices.)

I personally love being around people who are willing to get dirt under their finger nails, smell the soil, ask questions, and read the documentation (ie, willing to step into complex systems). It's inspiring. It also is a good indicator that the environment I'm in is being well-cared for.

Think of a human caretaker – those with empathy (a beautiful example of willingness to engage complexity) far surpass others in both ability and impact. Think of a barista observing details like humidity while adjusting their process. And now keep this analysis while moving to higher levels of abstraction.

(BTW, this is a phenomenal essay. I love how the author left it to the reader to realize all the lessons contained in the McDonald's anecdote.)


I was already going to write that this is a beautifully written comment, and then the high praise at the end sealed the deal! Thank you! I must confess that I am still in the reactionary phase of having watched Simple Made Easy two years ago, and am relentlessly trying to remove complexity, but your comment made me think about whether I've gone too far the other way. Complexity is frustrating because it is frequently the result of bad choices (at least where I work), but the ability to handle it probably is a fair marker for excellence.

"Make it simpler" isn't an option sometimes, and I still instinctively flinch away from those instances.

Oh, and the description of "empathy" as really being the willingness to engage with a kind of complexity is a very beautiful parallel between the worlds of software engineering and being a decent human being. I really love that.


Well written and enjoyable read.

The image of a Burgermaster 5000 operated by a high-school kid producing burgers is one that really gets stuck in your head.

I am wondering to what degree something like this might or might not come true with the advent of AI-driven coding.

I have built a full Swift app using AI (not writing a single line of code), but it took a lot of coding knowledge to pull it off. I wonder if this will change.


What was the full app?


Calorie counting using ChatGPT API. Sadly, burns too much money to publish it but I have been using it consistently for a few months.


Oh so you say “I had a burger” and it estimates what kind of calories that would entail?


One key problem is that nobody - none of the suits, anyway - wants to believe that there are essential, hard problems that can't be outsourced, can't be commodified, can't be shortcut in any way.

It's the business version of the get-rich-quick scam-course hucksters. The truth that there are no silver bullets can't compete.


What a fun read; thank you!

I’m still very early on in my career in software dev (~3 years in) and you articulated the idea I’ve been grappling with extremely well in this piece. I could never find the word for it but “commodification of technical work” is the perfect way describe it.


He doesn't explicitly cover the whole AI angle to this. In the spring I wrote something about the obvious parallel between AI tools and low-code programming which I think is relevant - basically, both make something easier but make anything outside of that something harder, so they don't really add efficiency: https://gist.github.com/rbitr/3294819148316df3ed90a2a1ce8a91...


I get what this post is saying, and I feel it too, deep in my bones. But technical work, especially in software, is being commoditized. If you view commoditization as a process, the end point of it is "free" (as in beer).

I've been in tech long enough to remember when you had to carve literally every line of code out of the firmament of the heavens to get anything done. Today the bulk of technical work is plumbing together literally millions of lines of "free" (as in freedom and beer) code to get some behavior that checks all the requirements boxes.

Going from the post some more, modern developers are more akin to line cooks who put ingredients together. There's value to that skill, but it's a reduced value, as it's easier to learn to plug together APIs than to craft a cache-aware data structure and reduce the big-O of some algorithm while fitting it into limited RAM.

In most restaurants line cooks are almost a commodity (even the sous and executive chefs are to a point), customers don't know if the cook from yesterday quit in a rage and a new guy got picked up this morning who had also quit in a rage from the place next door. Follow the recipe and you get the same product to the customer.

News for the tech folks here: line cooks (even experienced ones) make shockingly less than the average fell-off-a-bus tech bro with a couple of bootcamps and a GitHub repo.

The average developer is today thousands of times more productive than the developers from 30 years ago because they don't have to build the ingredients anymore -- the industry is well bootstrapped by now. But if some future Co-Pilot LLM can turn one developer into 4, and effectively "auto-plumb" with minimal human direction then...


I had a great chat with a reader yesterday, where I became very concerned that we aren't incentivizing young people (like myself!) to pick up the "carve code out of firmament" skills. We just rely on certain personality types getting into it, whereas we fling prestige and money at surgeons. Meanwhile, where I am, they'll only fling money at me if I memorize details about the Snowflake billing model and API - not very useful to society if we need to innovate!

You're right though. As someone near the top pointed out - some things actually do get commoditized, like mail merges. That actually works and is probably never going away. Maybe the real issue is that a lot of these products aren't innovative at all, have no shot at being the next mail merge, and our leaders largely don't understand how to tell the difference.


The way I think about it is that as programmers we build the process for the kitchen. The software is designed around commodifying the repetitive work for non-programmer domain experts.

In my experience this was building a process for data scientists (math PhDs) to train and deploy ML models, and currently chemical engineers to build and deploy process simulations.

Data scientists and chemical engineers will have to excuse my comparing their work to flipping burgers :)


(Enterprise) Java grew a lot because the non-technical message managers got was something more like: An engineer can produce objects, and we can swap out objects and their factories. That way we might one day in the future decide we want a faster object or choose a cheaper object.

In the decades since, the 'progress' we've made can be summed up as dropping the J in JVM.


I'm building a software business.

My bet agrees with the author: I am betting that I can spend time and brainpower designing and implementing a good solution to a problem, and people will pay enough to support me and my wife.

You can't commoditize design, and that's where my work actually is. This is also why I'm not scared of AI taking my job.


Reminds me of some of my startup skeleton stories....

A founder so dumb he could not understand the basics of one app equals one domain... He wanted to fill each of the domains he owned with a part of a major app and then sell each domain, not understanding that this would destroy the full app - if we could ever break the one-app-one-domain equation in the first place!


You can't abstract away risk, you can't abstract away complexity. You can either reduce, increase or shuffle them. Shuffling with layers of abstraction is the manager class preferred approach. Which leads to tech stacks of the type - "only me and god knew in the beginning what was going on, now only god knows"


Having read The Phoenix Project while working at Red Hat, I can see some value in the process for large organisations with a lot of people and teams that need to be aligned, but for smaller projects with few people, just steer clear of all this software-factory stuff.


It requires an extreme level of selective amnesia to make McDonald's the operations standard for anyone. Try "the ice cream machine is broken today" on your next I.T. contract and see how far that gets you.

But to the author's point, I remember checking out a book from 1980 on software engineering by a very well-known author (I forget who). I was shocked to read the author state unsarcastically that software development labor would be fully itemized in a standard catalogue like auto mechanic by the year 2000.


> that software development labor would be fully itemized in a standard catalogue like auto mechanic by the year 2000.

That's pretty dumb... Software doesn't "break" and if it could be itemized to such a degree it could be made self-healing.


> Software doesn't "break"

It most certainly does because most software is written atop dynamic systems and every possible change cannot be pre-empted.

Extreme example: RAM writes accidentally flipping bits a la RowHammer. Broken, yes, but is it a bug?


I'd still describe the first one as "non-line-item-able": how can you set a line-item time for "re-jigger the dependencies"? That takes open-ended investigation that no experienced "dev" would commit a small, fixed amount of time to, the way they would for replacing the alternator in a 2003 Ford Ranger.

The second one is clearly an automation issue; are you going to assign a line item to "fix the latest RowHammer attack"?


This is a real issue I've seen at my past jobs in Support - in theory, the better the Support team got at closing issues without needing R&D, the better the case load should have gotten, and the better the overall experience for everyone involved (Support, R&D, C-levels, customers). We had many huge enterprise customers using the product (the usual big names across the globe), and everything was working pretty damn well and people were happy.

Then the acquisition came, hordes of new C-levels and directors were added, and suddenly the system everyone wanted was no longer enough; we needed more out of the Support/R&D team for some reason - more sales, more renewals, more special contracts with ENT clients with exhausting demands on the company. And the expectation was just "be more efficient". All of this came slowly, through small changes (backporting features for ENT clients who refused to upgrade, forbidden solutions for the Support team because "it upset the major clients", allowing sales/renewals to force R&D engagement if they demanded it, a new CRM because it was too hard to do marketing campaigns in the old one - even though people were calling _us_ and asking us to just send them a quote, they liked the product so much).

I honestly think trying to commodify and extract even more from what was a very successful system, financially and overall, ended up ruining pretty much everything. Sales are slumping, as are renewals, we're churning R&D and Support folk, and because of the slump in sales, there's belt-tightening everywhere.

We've needlessly implemented so many new systems, with huge implementation/consultant fees, that absolutely no one knows how to use - Workday is a prime example: out of the box it doesn't do _anything_ we needed, but we had to use it anyway, with absolutely no way to pay for the features we really needed. Same with the ServiceNow implementation: we were sold it as a near zero-code experience, but naturally that was not the case. Why did we get it? No idea, except that it came down from on high from people who don't use the CRM, and now we're stuck with it.

For me, what it comes down to is too many people having to be like the CEO and Bill from the article - for some reason such C-levels need to show they're doing "something", but I don't think that something should include implementing wild changes to workflows in parts of the company the C's know nothing about, or even worse, responding to customer complaints and demanding we "fix the issue" without knowing what the issue is.

The conference experience from the article resonates with me heavily, because we have all these systems that do "everything but nothing", all these workflow changes made without listening to the people who have to do the work. It's really awful.


The author does make some good points, but I was a bit taken aback by the whole "who could possibly get excited about logistics, of all things?!" bit. It makes me wonder what else he simply doesn't "get" because he has written entire areas off as intrinsically boring and lacking in value.


Not at all - I probably just wrote badly and gave off the wrong tone there. I love that people can find excitement in almost anything. But uh, yeah, the book is very strange. There are a lot of characters that just take all sorts of insane abuse at work, and then with a grim face think: "This company is my family. I will not fail. Parts Unlimited will conquer the opposition."

Like, dude, this is pathological. Yes, you can work hard and try to save people's jobs, but where's the part where your friends strongly advise you to simultaneously look for work elsewhere because this place is toxic?

The "excited for logistics" thing was just uh... well, the first quote about the "excited" person just reads horribly in the context above, and The Goal wasn't even a fast-paced thriller. I believe the energy is best described as "Hail Corporate!"


I’m sure this fellow is a pleasure to work with.

In reality most of the world operates somewhere between “we’re a factory” and “we are unmeasurable Picassos.”

It’s not a huge stretch to combine thinking that “software is unpredictable” and “people pay up for predictability” so we should still try to tilt that way.


You know it’s going to be a good read when they post a new one


If you look at the development of any technology, there is a consistent evolution from high-skilled use of simple tools to low-skilled use of complex tools.

Consider the evolution of human weapons. It was very easy to make a crude throwing spear (break off a branch and sharpen the wooden tip against a rock) but a ton of strength and precision was necessary to take down a target with it.

It took a while to figure out how to make a decent bow, but once invented it spread all over the world because momentum could be imparted to the arrow instantaneously when releasing the string. Strength was still required to hold the bow at full tension; this requirement was eliminated with the crossbow. As a tradeoff, the crossbow required more significant manufacturing expertise and economies of scale, particularly with regard to precise metal locking mechanisms.

The evolution continues with guns replacing crossbows, tanks replacing horses, and ending with nuclear weapons. Each technological development requires a much greater level of manufacturing organization, and in turn bestows more power on an unskilled user. The resulting economies of scale are why only a small set of huge states are militarily relevant today, and why the idea of nomad bands posing a threat suddenly seems laughable (whereas it was a dominant worry for civilized society up until a few hundred years ago).

It is natural to glorify the work of the artisan whose disorganized but brilliant insight has not yet been commoditized. But for better or worse, progress usually takes the form of painstakingly systematizing those insights, detail by detail, until they are reproducible and robust, so that a broader range of people can consistently churn out similarly good work. It is this systematization and rationalization of the manufacturing process that really brings quality-of-life improvements to large numbers of people. It is also what can seem to take the magic out of the original insight, e.g. as Edison's electric light becomes as mundane as flipping on a light switch. Progress happens in the transition from magic to technology.


OP is correct when a business decides against standardization.

> For example, one pitch in particular was for a product which promised to remove the need for me to write SQL in exchange for being able to set up all my dependencies from a drag-and-drop editor, with the sales pitch consisting of "You can get rid of thousands of lines of all that SQL you hate!" - no I can't, fucko, because your application is still connecting to Postgres so it's just writing the SQL for me with another layer of licensed abstraction on top of it. Why would I pay to have more abstractions designed for you to sell software to multiple clients, you blue-suited dementor? Eight times out of ten, I want to pay you to remove them from my codebase.

See, the problem is that you can't fully remove the original SQL. It's still there, in the code. The database is still there. So instead of having one interface to access the database - SQL - now you have two, the low-level SQL and this bastardized abstraction. Some of your employees will write SQL, and some other of your employees will use the abstraction. They use different tools, so the tools will fall out of sync, and the employees will fall out of sync, and you'll get discord.
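
To illustrate (purely hypothetical code, not any vendor's actual product): whatever the drag-and-drop editor looks like, underneath it is a string builder for the same Postgres queries, so the team now owns the SQL *and* the abstraction that generates it.

    # What the "no SQL required" layer boils down to: it still emits SQL.
    class QueryBuilder:
        def __init__(self, table: str):
            self.table = table
            self.filters = []

        def where(self, column: str, value):
            self.filters.append((column, value))
            return self

        def to_sql(self) -> str:
            clauses = " AND ".join(f"{col} = %s" for col, _ in self.filters)
            return f"SELECT * FROM {self.table}" + (f" WHERE {clauses}" if clauses else "")

    # ...which is exactly the SQL you would have written yourself:
    print(QueryBuilder("orders").where("status", "shipped").to_sql())
    # SELECT * FROM orders WHERE status = %s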

Pick one tool. As far as it's technically feasible, pick one database, one programming language, one UI framework, one wiki vendor, one CRM, one ops visualization framework that is right for the business. Don't pick up some fad-of-the-day, and don't let your "artisan" engineers do "research" projects to see whether some other framework might suit their needs "better". Tell your engineers, here's the business problem, here's the pre-existing stack, solve the problem with the already-existing and already-supported tools. If the engineers are truly artisans - guess what, an artisan doesn't blame his tools. Pick new tools only slowly, deliberately, when you have no other choice, and only with a solid plan for standardizing the tool for full adoption across the enterprise.

Why? Because high-performing organizations ensure that they are always building upon prior work. Five engineers with five years' experience with React will be a stronger team - more productive, faster, more accurate at forecasting, have an easier time reviewing and supporting each others' code - than will John with 5 years React experience, Sally with 5 years Vue experience, Wang with 5 years Svelte experience, Cynthia with 5 years Angular experience, and Ilya with 5 years jQuery experience. If Ilya gets hit by a bus? The other engineers will have difficulty picking up the load. Not much of that React, Vue, Svelte, Angular experience will help with supporting the jQuery parts of the codebase.

Commodification requires a standard. We appreciate the value of standards when we succeed at establishing them - tabs vs. spaces, Docker containers, infrastructure providers, project management tooling - because they eliminate discussions that do not revolve around the more fundamental question of how to deliver value to users. Shouldn't you ask yourself, if you're evaluating a new language, tool, or framework, whether you really benefit from breaking the company standard?


Programming is still a craft, not engineering, or manufacturing. A software house should work like bespoke tailoring, or fine cabinetry, or glass blowing.

There's still no better training for programming than the equivalent of master/journeyman/apprentice. Apologies for the gender specific terms, but they are specific to how tradespeople operated from medieval times.

The worst thing to ever happen to the practice of business is the invention of the MBA. MBAs are imbued with the misleading axiom that management is a craft and science of its own, independent of the type of process or practice that is being managed.

Combined with the endless selling of the latest buzzword theories by consultants, this is why we end up with JIRA-driven development, nonsense like t-shirt sizes, 2-hour wankfests called "Sprint Reviews", let alone all the scrumming and standing-up and backlog-refining and endless make-work.


I approach the work like this and appreciate others who do, but it doesn't scale. It's engaging and fun and can produce more elegant, attentive, humanized work. But larger technology organizations (or organizations pursuing larger projects) need to draw on engineering and factory models to meet their software development needs the same way that IKEA and GM draw on them. Just like in software, there are still countless master woodworkers and master mechanics who bring beautiful attention to their craft, but there is also a scale where that approach is simply not suitable. The different scales coexist and even comingle, and they're all valid.


All of those things have value if they are allowed to provide value.

None provide value if management do not allow it to have value.

> t-shirt sizes

The whole point of planning poker is for people to guesstimate how much they think a task will take, and the point of doing it in a group is to allow the team to discuss the things they may not have thought about.

> 2 hour sprint reviews

I would agree that 2 hours is vastly excessive, but a sprint review is just a time for people to regularly meet and say what things are pissing them off, and what can be done to fix them.

If management know best and you have to use a shitty Jenkins pipeline with someone else's scripts, then it is pointless. But that's not the fault of the sprint review.


Planning poker is the one ceremony that I find consistently helpful. Listening to different devs discuss why they think something is 13 points or 3 points is helpful to the whole team. You could change the names of all the parts of the meeting, but having the discussion about differing opinions on complexity is so helpful.


In theory, yes. Back on earth: I can argue that each task is 13 points. Usually this turns into a pissing contest. The person who is deeply familiar with what needs to happen and has written it down (assuming it's written down and not a one-liner fix) to be "estimated" is probably in the best position to actually do it/fix it. Also, it's an insidious way to get the drones to fight each other, trying to one-up and show how smart they are. The person assigned to fixing it should estimate it (i.e. I have completely silenced "10x" developers by just telling them to pick it up and do it if it's so easy).


I do like planning poker for the same reason. I just find that people use story points in an overly granular way. E.g. when using the Fibonacci series, what is the point of spending time discussing the minute difference between 1, 2 and 3 points?

What I have liked in the past is just using small, medium, large or unactionable.

Or, with the right team/management, not using estimates at all - just have people pick up the tasks they feel confident they can get done within a given time frame, after having the group discussion about differing opinions on complexity.


There's more difference between 2 points and 3 points than between 5 points and 8.

If you're looking to fill an arbitrary bucket, relative size matters most.


Our team had so many planning poker sessions where we spent 12 people * 2 dev-hours trying to figure out whether stories were a 2 or a 3 that management finally said it is always a 3. We were literally spending more aggregate time deciding effort than the actual effort it took to complete these tasks.

2 and 3 are equal. 5 and 8 are equal. The question is simply "Is this a couple days, the whole week, or the whole sprint?".


The only reasonable estimates, ever:

+ That's trivial, it will be ready for testing before lunch.

+ I know how to do that, should be ready tomorrow/next day.

+ I can see how to do that but there are lots of other constraints - at least a week, might be more.

+ It's a big project, needs more planning and specification before it becomes a series of estimable tasks. Let's do that.

+ That breaks other things we care about. We need to prioritize and be ready for rework.

If you want, you can call these small, medium and large.


I completely agree with these levels, but disagree with mapping them to “small, medium, and large.” That’s a lossy compression if you will, and the heart of the problem. A shorthand might start with good intentions, but rapidly is muddied by conflicting interests.


5-8 just means that you think the task is getting a bit big.

Have a think about whether you can split it into two smaller tasks for the juniors to pick up.


Planning poker is only useful in the context of who may be doing the work, and even then I'm not sure how useful it really is. What is an 8 to one dev may be a 5 to another, as one may have a better grasp of the problem space than the other. In short, one shouldn't assume equal ability across the team to do a given task.


How is it helpful to the entire team for everyone to pay attention to whether adding a feature is 3 or 12 points? It's the one thing that wastes so much money and time. In the end it's 3 points. Thank you, everyone, for wasting $5,000.


I think you might be confusing sprint review with sprint retrospective? What you describe is the sprint retrospective, and when teams are autonomous enough to actually act on their pain points, it's easily the most useful scrum ceremony, IMHO.

Sprint reviews are really kinda just a wankfest where people show nice graphs and demos for an audience that either already knows or doesn't care.


> it's easily the most useful scrum ceremony, IMHO.

Agreed. I would kill almost everything else from scrum (maybe weeklies can stay too - sometimes they are useful), but sprint retros stay. And also agreed about the uselessness of sprint reviews. I've never come out of one thinking that I learned anything or was useful to anyone in there. It's like going to church in a village without being religious: "The community would look at us with disgust if we weren't there on Sunday. So, we are there."


>The whole point of...

There are many things that in theory have a particular point but that, when put in practice, most often have no point at all. In such cases, opinion seems to divide between those who complain that the people putting things into practice do so incorrectly, and those who feel that if the practice, when implemented, tends to go wrong, then it is the fault of the theory itself.

I personally hold with the latter view.


About sprint reviews in particular (not retrospectives): do they have a point even in theory? I have never found anybody who can point to one.

AFAIK, everybody who does them is cargo-culting.


The theory of identifying something as being shit and then removing it is a bad thing?


That is not the theory, that is the goal. Sprint reviews have a goal of identifying something as being shit and then removing it, they have a theory as to how this will be achieved which is the sprint review process.


None of what you're saying is an absolute truth. It applies in many environments, and in many others there is definitely space for commodification and MBA-style management gimmicks. It all depends on scale, technical complexity & maturity of the product, centrality of the software in question to the core product, etc.


Exactly

Like, there are mass-produced furniture shops (say, building chatbot clones or Flappy Bird clones). The basic layout/shape/function is known and being tweaked.

Then there is the custom-built furniture - I need an 18th-century armoire. The 'bespoke': a custom application for some odd business need, or solving a new problem.

We often discuss these by squishing 'Software Engineering' into a single bucket, and it leads to many arguments around really different things.


Hardly - you're confusing risk with commodification.

It's possible for an incompetent plumber to fix your plumbing problems by accident; expertise is about consistency. The dirty little secret is that MBA-led projects constantly fail in a truly spectacular fashion.


> Apologies for the gender specific terms,

It doesn't make any sense to apologize for using a language in its current form.


All living languages are in some state of flux. That's how you can tell they are still alive.


I've not stated the opposite.


It comes out of an abundance of caution, necessary in these uncertain times.


> master/journeyman/apprentice

Even if it's the way I learnt, it's just one way to learn.

I'm in software but my background is electronics engineering. But I can say electronics can be a craft (just build some stuff), or engineering (all the way to knowledge of modern physics, telecommunications, semiconductors design), or manufacturing (build some stuff at scale).

Same applies to programming. I've known quite a few people with computer science backgrounds. Some have a level of comprehension and ability that's just unachievable by practicing programming purely as a craft, and some others can determine any algorithm's big-O at first glance but can't write 5 lines of code if their life depended on it.


> Programming is still a craft, not engineering, or manufacturing.

Programming can be engineering if you know how to do it. Engineering is, in a nutshell, a set of principles you learn and apply, and never deviate from, in work and life. Engineers always deliver the highest quality.

Craftspersons do not know these engineering principles, or they disregard them. They invariably deliver faster than engineers, and their output is always simpler and of lower quality.

In "The Rise of Worse is Better" [0], an engineer (from MIT) debates with a craftsman (from Berkeley). The Berkeley guy cheerfully admits he is disregarding an engineering principle (correctness), and that's why "the MIT guy did not like this solution because it was not the right thing".

[0] The Rise of Worse is Better https://dreamsongs.com/RiseOfWorseIsBetter.html


> Engineers always deliver the highest quality.

Engineers deliver a cheaper product that still meets some known set of requirements with whatever safety margin.

The quality is as good as the requirements dictate.

A big reason why software is largely not engineering is that nobody knows wtf the requirements actually are, and the dependency stack is unknowably large.


It's true that engineering principles are not usually applied in the software field, but if quality is the most important requirement, they are indispensable.

Example: https://galois.com/

Example: https://inst.eecs.berkeley.edu/~cs162/sp13/hand-outs/They-Wr...


What are the engineering principles of software? Does team A agree with team B on what they are? Is there alignment between academia and industry? Electrical engineering has Maxwell's equations and math. What is the math of software? Do teams A and B agree?


The best book about the application of engineering principles to software is SICP [0].

Abelson and Sussman wrote SICP to illustrate the principles. Everything else in SICP is some means to that end.

They reveal that SICP is about engineering in the Preface to the First Edition:

  > The techniques we teach and draw upon are common to all of engineering design. We control complexity by building abstractions that hide details when appropriate. We control complexity by establishing conventional interfaces that enable us to construct systems by combining standard, well-understood pieces in a "mix and match" way. We control complexity by establishing new languages for describing a design, each of which emphasizes particular aspects of the design and deemphasizes others.

[0] Structure and Interpretation of Computer Programs, second edition. Harold Abelson and Gerald Jay Sussman. https://mitp-content-server.mit.edu/books/content/sectbyfn/b...


Engineering is about applying repeatable processes to achieve a specified result. Suitable analysis is performed before starting on the final work product to ensure the results are achievable. No process, no specification, and no analysis means no engineering.


Engineering is also a craft, but we have figured out how to manage it. The same will happen to programming (if it hasn't already). This happens to all crafts, from weaving to nuclear engineering.


>There's still no better training for programming than the equivalent of master/journeyman/apprentice.

This is inseparable from the fact that programming is still a domain dominated by white males from upper middle-class families (and immigrants from countries with functional, non-boondoggle professional training and/or demoscene). DEI as a rainbow-wash goes away when organic mentorship of black kids and young women happens en masse.


I'm absolutely certain there are many more non-white (Indian, Chinese, Vietnamese...) programmers than white ones.

And if we stop with the racism: are you sure it's not the programming that made these guys upper middle class?


> I'm absolutely certain there are much more non-white (Indian, Chinese, Vietnamese...) programmers than white.

Worldwide, sure. I'm speaking about the US.

>And if we stop with the racism

That would be up to you. I haven't said anything that's racist.

>are you sure it's not the programming that made these guys upper middle class?

That would be even worse. It would mean that the white men who lived childhoods of lesser means and found success in the tech industry, presumably under the guidance of seniors who themselves (under your hypothetical) were more likely to be first-gen professionals, looked at black and brown kids in similar situations and thought, "Fuck you, got mine."


I'm not following what your point is. Would there be a way to make everyone expendable faster if it wasn't just white entitled dudes?


>I'm not following what your point is.

I'm not sure how I could be clearer.

>Would there be a way to make everyone expendable faster is it wasn't just white entitled dudes?

You might want to check this for errors.


I am not following your point either. Surely there must be a way to reword it?


It would help if someone would articulate what is unclear.

Forgive me, but on this subject, I've encountered many who are fully capable of understanding, but would rather not. So I'd ask that you try harder. Perhaps restate what you think I mean, and I can correct you if it's mistaken?



