
I think - and it's only a think - that the author has ignored that a large part of what used to be called technical work is now commodified.

I remember when a mail merge literally meant printing out lots of address labels and then manually sticking them onto a letter & envelope. Word 2.0 (?) solved that problem for the 1990s and MailChimp has commodified it for the 21st century.

Double-entry book-keeping was technical work and was usually run by highly trained individuals. Nowadays every shop keeper just scans a barcode and has the customer tap-to-pay.

There's not yet a drag-and-drop-like interface for anything more complex than Scratch (whither Visual Basic!), but the hard part isn't the technical work of stringing together libraries; it's requirements gathering.

Speaking of which, it has never been easier to drop in a high-quality cryptographic library, or import an interactive map on a website, or WYSIWYG edit a website.

So, the author is right that you can't stick a bored 18 year old in front of an IDE and have them create you an ERP. But a lot of the "grunt work" of IT is now firmly a commodity.




What you say is true, but the amount of "grunt work" is not constant over the years. In fact, I think the amount of "grunt work" in the tech industry is growing, not shrinking; I think the following loop is quite obvious:

- amount of current grunt work: X

- new tech Z appears that makes X be reduced to 0.1X

- at the same time Z enables new ways of doing things. Some things become grunt work because they are a byproduct of Z

- amount of current grunt work: Y (where Y ~= X)

- ...

If technological progress had stopped in the 2000s, then all the grunt work (originating in the 90s) would be essentially zero today. New tech brings both automation and new grunt work. I don't think we will live in a society where there's practically no grunt work.

The most recent example is AI: there are AI tools that generate sound, images, video and text... but if you want to create a differentiating product/experience, you need to do the grunt work of combining all the available tools (ChatGPT, Stable Diffusion, etc.).


>If technological progress had stopped in the 2000s, then all the grunt work (originating in the 90s) would be essentially zero today.

If you wanted to have a simple database application in the 1990s, Delphi, VB6 or MS-Access were most of what you needed to get it done. The UI was drag and drop, the database was SQL, but you almost never touched it, mostly it was wiring up events with a few lines of code.

The work was commodified out of the way! Domain experts routinely built crude looking but functional programs that got the job done. It was an awesome time to be a programmer, you just had to refactor an already working system, fix a few glitches, and document everything properly, and everyone was happy.

Then everyone decided that all programs had to work on Steve Jobs' magic slab of glass in a web browser connected through janky Internet, and all that progress was lost. 8(


Are all of those proprietary products? I can't speak to your experience, but Linux was created in 1991, so from another angle it seems like you're bemoaning the rise of OSS and the web.

I'm just a web developer who learned everything from online resources. So I think we are both biased, on different ends of the spectrum.


Open source is great, Lazarus does a pretty good job of replacing Delphi.

Microsoft went insane with .NET so VB6 was killed in the process.

Access automatically handled table relationships, building queries and seeing them as SQL, and the report engine was pretty good. Thanks to ODBC, you could use the same database across all of them, or hook up to a real SQL server when it came time to scale up.

What's missing these days is the desktop and a stable GUI API. Windows apps from the 1990s still work, because they are distributed as binaries. Most source code from back then will not compile now because too many things have changed.

I love Open Source, but it doesn't solve everything.


> Microsoft went insane with .NET so VB6 was killed in the process.

I'd love to hear more about this perspective or any links to get more of it.

I did a (very) little hobby VB6 and loved it. Never made the switch to .NET at that time (I was young, it was a hobby).

Having recently worked through part of a .NET book, I was pretty impressed by how far MS took it (although it seems extremely mind-numbing). Obviously it took a long time and had false starts, but MS stuck with it. On a personal level, I am very opposed to the entire model in an ideological sense, but it does seem to make a lot of business sense for MS, and it seems to cover a lot of cases for a lot of businesses.

So, was Microsoft's insanity with .NET just the obsession part, or doing things poorly for a while, until eventually getting it "righter", or is the insanity still pretty apparent?

I really would love to learn more about the historical-technical aspects of this specific comment quote, from VB6 to the modern day, because it fits my experience perfectly, but I've had second thoughts about the position more recently. The more specifics, the better.


The insanity was to abandon the advantage they had with VB/COM in order to challenge Java on its own ground. They threw the baby out with the bathwater. The C# pivot also slowed down their desktop efforts pretty dramatically, doubling the blow.

They were lucky Sun squandered the opportunity they had engineered with Java, focusing on the hardware side and missing the boat on browser, virtualization and services. If Sun had bought Netscape and then focused on building something like Azure, instead of fighting the inevitable commoditization of server hardware, they would have eaten Ballmer's lunch.


Disclaimer: I am not a .Net programmer, so these are just my thoughts and impressions as someone on the outside who followed the development from a distance.

I think a lot of the focus on .Net was driven by MS and Ballmer's fear of Java. At the time, almost all desktop computers were running Windows 9x/2k. If 3rd-party applications were developed with cross-platform Java, the customers would no longer be locked into Windows.

First they tried the famous embrace/extend/extinguish approach by creating a Windows-specific version of Java. Sun fought back, and MS decided to push .Net instead.

It seemed to me that the initial strategy was to claim .Net was cross-platform, but focus more on Windows and let open source projects like Mono be their cross-platform "alibi". They changed strategies after a while, and now I guess the cross-platform support is more real.


> Windows apps from the 1990s still work, because they are distributed as binaries.

Only if you have the right libraries, and runtimes, and OS interfaces, and even if you have all that, oh no, it's a MIPS binary and you don't live in 1996!

Any proprietary API exists precisely as long as the owner says it does. Open standards don't suffer from that malady.


>Only if you have the right libraries, and runtimes

That generally only happens with .NET-based programs on Windows systems. You always need some .NET v2, 3, 3.5, 4, 4.5, etc. runtime.


Totally agree. There is no backward compatibility with the .NET runtime - if your application is built/linked against a given version, it won't work with any other version of .NET.


That's simply not true. The newest .NET 8 does not need the assemblies you reference to target .NET 8 - as long as their TFM is any version of 'netstandardx.x', 'netcoreappx.x' or 'net5'+, it will work.

You can even make proxy projects that target netstandard2.0 but reference .NET Framework, and with certain compat shims the code will just run on .NET 8 unless it relies on some breaking changes (which mostly have to do with platform-specific behavior; there have been no breaking changes in the language itself since, I think, C# 1 or 2, some 20-odd years ago).

As for the runtime itself - an application can restrict itself from being run by a newer runtime version, but by default you absolutely can do so. The lightweight host executable that just loads the runtime and executes the startup assembly may complain, but just try it - build a console app with a 'net5.0' target and then run it on the latest runtime with 'dotnet mynet5app.dll' - it will work.
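A minimal sketch of what that looks like (project name and path are made up; it assumes the newer SDK will still restore the net5.0 reference pack, and the roll-forward flag is passed explicitly because the host's default policy stays within the same major version):

    # a console app whose csproj still says <TargetFramework>net5.0</TargetFramework>
    dotnet build
    # run the net5.0 build on a newer installed runtime (e.g. .NET 8),
    # explicitly allowing roll-forward across major versions:
    dotnet --roll-forward LatestMajor bin/Debug/net5.0/MyNet5App.dll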


I think the point is that the Access and Lotus Notes tooling was somewhat ubiquitous in largish corporations.

The experience of this tooling was: make a change, and it was in production. It was incredibly simple and productive to work with, given the needs of the time.

There was also plenty of opportunities to make a mess, but I don't think that has really changed.

Learning was not difficult, you just had to be prepared to spend time and some money on books and courses.

It is not a tooling set you would want to go back to for a bunch of different reasons but it worked well for the time.


> It is not a tooling set you would want to go back to for a bunch of different reasons but it worked well for the time.

I remember using Lotus Domino at one of my first jobs. There were all sorts of things I hated about it. But you could have a database - like the company’s mail database. And define views on that database (e.g. looking at your inbox, or a single email). And the views would replicate to a copy of that database living on all of your users’ computers. And so would the data they needed access to. It was so great - like, instead of making a website, you just defined the view based on the data itself and the data replicated behind the scenes without you needing to write any code to make that happen. (At least that’s how I understood it. I was pretty junior at the time.)

Programming for the web feels terrible in comparison. Every feature needs manual changes to the database. And the backend APIs. And the browser code. And and and. It’s a bad joke.

Commodification has a problem: for awkward teenagers to make the same fries every day, we have to ossify the process of making fries. But making good software needs us to work at both the level of this specific feature and the level of wanting more velocity for the 10 other similar features we’re implementing. Balancing those needs is hard! And most people seem content to give up on making the tooling better, and end up using whatever libraries to build web apps. And the tools we have are worse in oh so many ways compared to Lotus Domino decades ago.

I wonder what the original Lotus Notes designers think of web development. I think they’d hold it in very low regard.


Right!!

10/20/x years ago we didn't have DevOps, CloudOps, CloudFinOps, CloudSecOps, IaC experts, Cloud Architects, Cloud transformation experts, Observability architects, SREs, plus all the permutations of roles around "data" that didn't exist discretely, etc etc etc.


We did not have web scale products, which enabled new possibilities. E-mailing documents and collaborating offline sucked.


> I think the amount of "grunt work" in the tech industry is just growing and not shrinking...

Not sure, but isn't this just another way of saying that the tech industry keeps growing?


I'm not sure what the parent post meant exactly, but I do agree there is tons of grunt work -- I've seen big-name SV companies where large parts of their workflow include steps like "and then every hour you need to do something in a slow UI that can't be automated" to keep vital systems working. I would say that's really grunt work, and there are even people in such companies whose only task is doing such grunt work.

Truly, I've been told by clients I work with that they have entire double-digit-sized teams whose members' only responsibility is to reboot VMs that breach specific resource thresholds -- easily automated and even built into most hypervisors, but for whatever reason these tech giants opted for a human to do it. The only semi-reasonable explanation I got from one client was that their infrastructure team got outsourced and they laid off the only people who knew how to use the automation tooling. It's a dumb reason for sure, but at least I can understand why they opted for the manual grunt work.
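(For what it's worth, if those hosts run something like libvirt/KVM, a first pass at that particular automation is only a few lines. A rough sketch -- the 90% threshold and the blunt "just reboot it" policy are made up to mirror what those teams do by hand, and you'd schedule it from cron:)

    for dom in $(virsh list --name); do
      actual=$(virsh dommemstat "$dom" | awk '/^actual/ {print $2}')  # allocated memory, KiB
      rss=$(virsh dommemstat "$dom" | awk '/^rss/ {print $2}')        # resident memory, KiB
      # reboot any guest using more than ~90% of its allocation
      if [ -n "$actual" ] && [ -n "$rss" ] && [ "$rss" -gt $((actual * 90 / 100)) ]; then
        virsh reboot "$dom"
      fi
    done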

Similarly, keep in mind a lot of this grunt work is just to satisfy some reporting requirement from somewhere -- some person(s) in the company want to see at least X% of uptime or Y LOC every day, so you get people writing a lot of yak-shaving code that basically does nothing except satisfy the metrics or ensure that uptime % always looks good (i.e., they don't fix the cause of the downtime entirely, they just get the endpoint that is checked to determine uptime working well enough that it reports to the monitoring system, and they leave it at that).


If it's the amount of grunt work to solve the same problem, it just means the ecosystem keeps getting worse.

Which, IMO, is quite obvious.


We are inventing the problems of tomorrow by solving the problems of today, and people tend to be the constraint.

Managing complexity to where a fixed team can operate the software.


I guess the point may be that after 30-40 years of this, the low hanging fruit of commodification may be gone. Further, the more we commodify, the higher order our problems become and the specialist engineers you hire move further and further up the stack.

Also, not sure if it's always been the case.. but the latest vintage of SaaSified startups have a high % of products that don't actually do any of the things you want them to do yet. They want you to pay them for their service, so they can capture your use cases for implementation and then commodify them for other customers. So you end up with long lead times and IP leakage. Neat!

I think the example of templating SQL is always this misunderstood target for management. I dunno, the language in particular has survived an incredible length of time in our industry... it actually does a pretty good job. 99% of wrappers/DSLs/etc. put on top of it make it far worse and still require you to dip into SQL for anything remotely non-vanilla. Further, instead of hiring SQL experts (there are many) you need to train up SaaSified DSL SQL wrapper X experts (none exist).


This is a fair critique, and I'm giving it some thought now. I'll need to stew on it a little bit. Maybe the fundamental issue is that many of these products are designed to appear, to purchasers who don't work in the field, as simple appliances: you buy one and the problem is solved.

The issue is that a lot of the stuff out there doesn't actually solve the problem - it just appears to because other people buy it, and then lie about the implementation being successful to get promoted. Things like mail merge -are- like kettles, they're solved problems, and the only way to solve them is to try things.

The broader issue is that my employer purchased Workday because they believed it's like a kettle, but it can't actually fix the fact that our org structure is so horrendous that it can't be modeled.

(Incidentally, this year is the first year that I've realized that a sufficiently bad org structure, in a large company, is tech debt of a sort. You end up doing all sorts of crazy things just to work out who works for who, and what can this user see in this database, etc.)


Some aspects have indeed been commodified. But what about the bigger picture?

How simple is it to run a business, a website, organise a trip or pay a bill nowadays, compared to 1994 or 2004?

At times, I can't help but feel that the previous generation had a more leisurely pace of life, which led to a more fulfilling lifestyle. Nowadays, time seems to pass at a rapid pace, with high levels of stress.

Allow me to share two experiences:

a) The other day, while at the bank, I witnessed at least three individuals over the age of 60 struggling to complete simple tasks, aimlessly wandering around and pleading with the staff for assistance. These tasks are supposed to be easily accessible through online banking, but due to certain exceptions, the system did not support their specific needs. As a result, they were forced to make appointments, with the earliest available slot being three to four months away. One of them needed to withdraw money from a blocked account to purchase wood and heat her home, but the bank's staff refused to budge, insisting that she wait three months to solve the problem.

b) Just two years ago, my father was in Sicily and could not find a way to make a simple phone call back home. Yet, in the 1970s, all he had to do was walk into the bar in the area with a few coins.

Not to mention that while once upon a time your average person could fix the lights, the car, the heating, the non-automatic door, etc. by themselves, now they need to call the professionals.


I wonder if there's a term for Amdahl's law but applied to human processes. Like the other side of "law of diminishing returns".


Amdahl's Law applies cleanly to human processes. Perhaps the most revealing example is the origin of "computer" as a human occupation and how scaling the compute process happened at Los Alamos https://ahf.nuclearmuseum.org/ahf/history/human-computers-lo...

The more general aspect of Amdahl's law is captured by certain scaling laws and limits generally related to communication (see full bisection bandwidth) and certain architectures (e.g. Cray) meant to optimize for this.
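For reference, the usual statement: if a fraction p of the work can be spread across n workers and the rest is serial, the best possible speedup is

    speedup(n) = 1 / ((1 - p) + p/n)

so the serial fraction caps the gain no matter how large n gets -- and nothing in that statement requires the workers to be CPUs rather than people.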


It semantically works, but it has not been adopted by people outside computing; I'd guess the definition isn't relatable or understandable enough for people coming from humanities/biz backgrounds, so it might well be possible that there's a parallel concept there.


"Adding manpower to a late software project makes it later." - Fred Brooks, 1975


To bounce off your point, there's the distinction between creative & non-creative work. Sure, tools might help the creativity, but they can't replace it, and the article discusses how they can make it worse. Accurate requirements gathering requires a spark of creativity...


Nah, WordPerfect and WordStar had mail merge way before Word did.


I miss WordPerfect. Maybe it was the novelty but it had such enjoyable fonts.


Well I remember when it had no fonts :)


It's still there if you want to buy it. At least in name.


the people you're supposed to be eliciting requirements from are just regurgitating what ChatGPT told them are the requirements hehehe


This is becoming so true. I have read so many documents in the last year that are obviously from a GPT, especially when it’s about something new to a group.

But in the end, I would rather get a half-baked GPT doc than a quarter-baked junior analyst doc. I just worry that GPTs are going to kick the rungs out of the bottom of any knowledge-work ladder. Being bad but junior used to be a learning environment without too many repercussions.

But how do you compete with peers using AI? You use it also. But now you have robbed yourself of a learning opportunity. Yeah, you can still learn something by doing it that way, but it's like doing homework by looking at the answers. Sure, it can help you double-check, but if you don't put the effort into constructing your own answer, then you have only cheated yourself.

I think the AI alignment issues are probably overblown in the short term, but what about the long term, when the average person has regressed so far as to be unable to live without AI? They will just do whatever they are told.



