Times are great for programmers now. How does it end? (vaghetti.dev)
314 points by vaghetti on Feb 21, 2022 | 602 comments



I feel like we have so much leverage and don't use it at all.

We're still attending stand-ups every day with non-programmers telling us when we can and cannot refactor. It's nuts to me that a skilled profession - one that not many can do - lets itself get micro-managed like this.

If anyone has read Developer Hegemony, I'm fully on board with its general premise - we start operating like lawyers with partnerships, and turn bosses into customers. Though that does require us to think of ourselves as professionals, not nerds who are too smart for business.


> We're still attending stand-ups every day with non-programmers telling us when we can and cannot refactor. It's nuts to me that a skilled profession - one that not many can do - lets itself get micro-managed like this.

This is actually quite interesting as I was just talking to a few other professionals about this in a social setting. I was the only one in software development.

I mentioned how stand-ups work, briefly, and that it might not be a bad idea for, say, an accounting department to adopt them to keep things on task.

The response was both extreme and universal: How the hell do you all accept being micromanaged to such a degree? Don't any of you have any dignity?

Every single one of them (all professionals, like I said) was adamant that they would leave any position that managed them no better than a burger-flipper.

No lawyer, accountant, doctor, engineer, scientist or other professional stands up each day to report *to their peers* on their progress.


The DSU shouldn't be for non-technical people. They literally should not speak at all during this daily meeting if they are even there.

It's for Devs only to make sure they are not blocked and they are communicating what they are working on. In my last company the non-tech people weren't allowed and the DSU was only for engineering.

It is not a daily status meeting!

If it has turned into this then it should be scrapped, because, as you say, it'll lead to people leaving as they feel they are being treated as low-skill staff.

Btw I wrote this in a DSU that had non-tech people yabbering and which was basically a daily status meeting.


> It's for Devs only to make sure they are not blocked[..]

If in the course of one's work one becomes blocked, would one really wait until the following day's stand-up to tell anyone about this?

If so then I think both the worker and the company have bigger issues than whether the stand-up itself is a good use of time.

The mind boggles.


> If in the course of one's work one becomes blocked, would one really wait until the following day's stand-up to tell anyone about this?

There is hard block and soft block. Hard block means you have no clue how to continue, and so of course you should get help right away. Soft block means you have ideas and are working on it, but - unknown to you - someone else on the team knows exactly how to solve that problem and could solve it for you in a few minutes, or you can spend all week working out the answer. The soft block is a lot more common - engineers are smart people who can solve complex problems, but we often fail to use the help of someone else who has already solved it. Thus the standup should be about airing problems, even ones where you think you just need to finish typing in the solution. Sometimes someone else will say "I can help you do this faster", other times it is just: keep working until you get it.


Right. Catching this sort of blockage, and spotting unnecessary/undesired rabbit holes, is the value of the DSU. It should be a pure dev experience.

However, 99% of the time, what I see in real life is that even when it's just devs talking, they aren't offering up enough information for either of those to happen. Either imposter syndrome kicks in and they don't want to look stupid in front of their peers, or they're just annoyed at having a meeting. So it just becomes a status update meeting anyway.


> Either imposter syndrome kicks in and they don't want to look stupid in front of their peers

This to a T. An awful lot of stuff coulda gotten crushed, quickly, if I'd just asked for help ASAP instead of sitting on it and feeling dumb.


Never heard those defined before. Is there a source, or is this just a pearl of wisdom?

edit: to be clear, I think they're good definitions, I'd just never heard them defined before


I don't know if I've heard it before, or made it up on the spot. Probably a combination: the ideas have been said before, but that exact wording is mine - maybe.


I hate DSUs, to be honest, for this exact reason: there's no way I'd wait a whole day. I feel compelled to explain, though, that at the very least there is a definition of what you should be using it for, and few companies I've worked with have ever used it for that. It's mostly used as a status meeting, and becomes a micromanager's daily meeting to beat people up about poor progress.


On this same site you can read all day about developers complaining about interruptions. Now a company has big issues if someone puts a task aside and does something else until the following day, so as to avoid interrupting someone?


I have come to realize that as a senior engineer my job is to be interrupted. I'm equal to or better than any non-entry-level engineer (and entry-level will advance fast) at typing in implementations, but because I take interruptions from people I can tell them the part they are missing to solve their problem - 5 minutes of conversation with me can turn several weeks of trying to find a solution into 1 day of implementing it. It only takes a few of the above to make me a 10x engineer, because I'm helping others avoid false starts in fixing their problems. (And of course when I interrupt someone else they in turn do the same for me - this is a two-way street.)


Somebody communicating about a blocking situation that depends on your input (or output) is not an interruption.

BS "can I pick your mind", chit/chat, questions that could be Googled, and of course, micromanagement BS are interruptions.

DSUs themselves are also interruptions -- it's just that they're scheduled and not event-based.

In other words, necessary communication is not interruption. Accidental communication is interruption (to be read as the same concept as necessary and accidental complexity).


Stand ups are the bane of my life, interruption wise. Not everyone wants to work the same hours but we all have to set aside a mutually inconvenient time to distract from what we're supposed to be doing. Before the stand up, your mind's not properly on your work 'cause you know you're about to be interrupted. After the stand up, you have to get back into your work but now with less time to do it.


No, of course not. In my experience standups are best for encouraging accidental knowledge sharing - "oh I know what the issue with that is", "oh I think Dave in the other team has done that before", "I was planning on changing that" etc.


Agreed, and maybe this is more common than my impression & experience suggest, but if that's the goal, wouldn't you prefer to do it in the afternoon? After you've spent some time on something, but perhaps still have some time left to shift approach or have a 1:1 call with someone who says they can help?

Sometimes stand-up for me means 'remind myself what I was doing'; if not that then I've certainly not got far past it, haven't had long to get much done or be stuck. I'm unlikely to say in a stand-up that I'm stuck on something, I'll have a(nother) crack at it and perhaps ask after lunch if necessary.


I wish I could get this across to my PMs, bosses and other non-technical stakeholders. My current PM is a nice guy, but first of all, he absolutely loves to talk, and second, he believes the "daily status meeting" format is what Scrum is all about.


Send him this link:

https://www.scrum.org/resources/what-is-a-daily-scrum

From the horse's mouth:

"The Daily Scrum is Not a Status Meeting"

Or maybe see about getting him certified.


But it always is.

They go around the room and everyone has to say something and it ends up being a status update.


To be honest, I would just throw Scrum into the trash if I could - I think it's garbage - but following some not-quite-Scrum method, which is what most companies do, is actually insane.

The whole process becomes a micromanager's wet dream.


I have come to really like kanban methods. No two-week sprints, no committing to work you will get done (and thus needing to sandbag to ensure you meet the commitments - management will measure this to ensure you hit them 100%; to be fair, Scrum itself says don't do this, but it happens anyway). Just take the most important story off the stack, work it, and repeat. If management changes priorities, then reorder the stack and we will get to it.
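
Purely as an illustration, that pull loop can be sketched as a priority queue (a toy example; the stories and priorities are invented, lower number = more important):

    import heapq

    # Backlog as a priority-ordered stack of (priority, story) pairs.
    backlog = [(2, "add CSV export"), (1, "fix login bug"), (3, "update docs")]
    heapq.heapify(backlog)

    # Management changes priorities? Just push/reorder - no sprint to renegotiate.
    heapq.heappush(backlog, (0, "prod outage: payments failing"))

    while backlog:
        priority, story = heapq.heappop(backlog)  # always pull the most important item
        print(f"working on (p{priority}): {story}")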


It's really weird. I have resorted to reducing my workload from sprint to sprint to avoid the drama when estimates aren't met. Management is much happier now, they see "work done" and equate it with better performance. And Scrum has always been like that for me whereas Kanban types of work create a much saner work environment.


This is also my main method of improving work life. Inflating the cost of sprints lets me spend more time "working" - on chores around the house, side projects, childcare, etc.

As long as you complete what's "agreed" to (which is transparently a conflict of interest: the engineers stating what can be accomplished are the ones who have to do the actual work), everyone seems bizarrely happy.

The reduced working hours I count as increased salary, which prevents me from jumping ship for a better salary. I also count the reduced work as mental health recompense for perpetually dealing with scrum narcissists.

It's Putt's Law https://en.m.wikipedia.org/wiki/Putt%27s_Law_and_the_Success... - but somehow management's impotence is becoming increasingly transparent.


I used to work at a company where the standup meeting was once per week, and we would pre write everything in a shared doc. The meeting was then just silently reading, only saying something if necessary. A great place of sanity, that was.


For office work that sounds great. For remote work, I don't mind burning 20 min to see my coworkers' faces and chat 2-3 times a week.


If it looks like a duck, walks like a duck, and quacks like a duck, then it's a duck.

A lot of `Agile` material is just doublespeak meant to sell consulting hours and training.


We even have a Scrum master & certified Scrum trainer, and she does not intervene, which is kind of surprising considering the DSU devolved into a status meeting. You can't make it up; old industry really doesn't change.


It is always a status meeting, even if it is advocated as not being so.


it's only a status meeting for management if management insists on being there.

kick them out.


I've been in teams of just developers, siloed from non-programmers, with a technical manager. It still becomes a status meeting. We can tell them DSUs aren't supposed to be status meetings, but they still devolve into one.


Part of the process is making sure the process works, i.e. achieves objectives. This sounds like such reviews were never done in those teams.


This attitude is something I associate with very junior developers.

"Hey you know your morning check in, this is the one true holy way that every company in the world should do it"

it's not even that it isn't a good way to run stand-up - it probably is. But the hubris is pretty extreme.


> It's for Devs only to make sure they are not blocked and they are communicating what they are working on.

If this works for you then this is good.

But I think these meetings are not for devs, or something is very wrong with the setup. If the devs are in the same team and working on the same project, then I assume they have access to version control => no need to update on what everyone is working on. Instead teach them to do small commits and to write better commit messages.

If the meeting's purpose is to communicate what is blocking someone => this again fails IMO, because then someone might sit blocked for a day? And if they don't wait that long and instead act on it, what is there to communicate about in a daily meeting?


>It's for Devs only to make sure they are not blocked and they are communicating what they are working on

Devs on their own could do that, and do it better, before BS like the DSU was invented...


Totally agree. I don't even join the daily myself, as I don't want to mess with the scrum master's work or invite the team to ask questions, which would create longer meetings. At our startup, the devs meet with UI/UX, QA and the scrum master, briefly discuss what's happening and move on. Max time budget of 15 minutes. The rest really is micro-management.


> No lawyer, accountant, doctor, engineer, scientist or other professional stands up each day...

I think this has a lot to do with the office culture of software being figured out at the same time as the professionalization of Management and the trend of "methodologies" to give a veneer of objectivity to the managerial class. Combined with FOMO about the fortunes being made at certain other companies, whose whims and experiments are then followed.

Whereas the other professions you mention had their office cultures defined in very different times. Once defined, these tend to be sticky.

For example the extended hazing ritual by which we mint medical doctors in the USA is completely insane and would not be tolerated by doctors in most other countries, much less by non-doctors. Or for scientists, the amount of time many of them have to spend on acquiring funding for their employers instead of doing actual science, not to mention the very broken world of peer review.

The thing that makes our situation special -- and which these people were probably reacting to -- is how transparently ridiculous, toxic, and demeaning a lot of it is. From open-plan offices to stand-ups to pair programming and on and on.

I think we put up with this stuff because we are convinced we should be thankful for our fat paychecks and fear the Barbarian Hordes waiting to take our jobs for cheap in Elbonia.[0] That both of these things are objectively false has not made much difference in the last decade, but the salaries have gone up, so...

[0]: https://dilbert.fandom.com/wiki/Elbonia


> For example the extended hazing ritual by which we mint medical doctors in the USA is completely insane and would not be tolerated by doctors in most other countries

I married a doctor. The "hazing" is slowly going out of fashion as older doctors retire.

What they do have is a very nice licensing procedure. Yes, it's annoying, but it works a lot better when a doctor needs to get a job, because their interview process is much smoother than in the software field.


I have a wife who is a licensed nurse, and I was just blown away when I heard how the interview process goes for medical professionals. What, they trust your credentials at face value? You don't have to do procedures live in front of the interviewers? The interview is mostly _them_ trying to get you to work there?

What a concept.


> For example the extended hazing ritual by which we mint medical doctors in the USA is completely insane

In case anyone wants a glimpse of how bad medical school could be:

https://web.archive.org/web/20101218031844/http://www.medsch...

It's a pretty old site, no idea what it's like now.


> to pair programming and on and on.

Glad someone else is mentioning this, about 10 years ago it seemed like it was the hottest trend and that anyone (meaning any programmer) not willing to adopt it was either too lazy or had something to hide, or, most probably, both.


Would love to know which countries don’t actually haze their doctors.

Here in DR, docs get hazed out the wazoo - being told they don't have the status to look the head surgeon in the eye, or being tasked to get coffee.


Dominican Republic?

I'm sure there is a lot of status-related meanness everywhere but I was thinking in particular of the US practice of intentionally and radically overworking people (to a level dangerous to both them and their patients) in order to prove they can take it, which is pretty much classic hazing. Wouldn't be surprised if it's copied elsewhere but I know it's not done everywhere.


It sounds awful to me, too, but in casual conversations with medical professionals who have been through it (and in some cases while they were going through it), they very frequently defend the process. It doesn't sound like Stockholm Syndrome or gate-keeping justifications either. They basically say that it's the best way to see/learn the full progression of a case. Essentially it comes down to this: charts and records can only communicate so much, and actually being there is how the real learning happens. I don't know if I buy it, but I do know one thing - systems like this usually carry real value, or they wouldn't be so persistent.


I'm fairly sure that's the norm almost everywhere. It's not just to prove they can take it, hospitals like to use students as free labor. I know it happens in Western Europe.


Daily standups can be useful when it's a small group of people (3 or 4) working on the same project. In such meetings participants actually have the context to make meaningful suggestions based on others' updates: "oh.. have you considered this? you should try this!"

Standups with more people working on separate projects are useless*.

My most recent standup was with a group of 8 who were working on many separate projects spread across mobile, front-end and server. In this meeting, even if I did share some details about what I was doing, no-one else would have the context to help me. Also, it feels disrespectful to go into detail about something highly specific to my work while 6 other faces stare blankly as I waste their time. I've heard stories from a friend's company of a standup with 30+ people across product, engineering and management. Could you imagine going into detail about your choice between an abstract class and an interface on that call?

In such standups, the only sane action is to give a quick, high-level update, mute yourself and go back to work. Standups like this should be ruthlessly divided into several smaller meetings of people actually working together.

*Interestingly though, most people on my team love the standup for another reason - human interaction. Folks simply want to tune in and say hi to their team. In my opinion though, this isn't a good enough reason to keep unproductive standups, and the meeting would be better replaced with an optional 15 minute "coffee and chat" meeting. If folks want to tune in and chit chat, great - let them do it! But no need to waste people's time with mandatory "agile" ceremony.


That's pretty much the gist of it. The ideals of "agile" (or really just Scrum for the majority of people) aren't what most developers rail against. It's how tightly the "how" is defined and acclaimed as the "one true way", despite authors repeatedly writing it isn't, and despite proof of software projects having done just fine otherwise. Which then saps the energy of many participants, making them less likely to do what they would do otherwise.

I've never heard opponents go "communication doesn't matter", yet the moment we talk about taking down these meetings, that's the first kneejerk most proponents I spoke to reach for. As if being against standups is indicative of thinking poorly of communication. Same goes for specification of goals, cooperating to keep everyone going strong, and improving to make things less of a drag. Is there so little faith in basic human intelligence and cooperation, that this one method is the only way we silly developers can be redeemed?

Now we have a bunch of people who haven't touched code in multiple years parroting what the consultants sold them and trying to talk their way out of aforementioned proof. Or they make claims like "well but it is better now!", as if their previous methodology wasn't so horrible that they could've thrown a dart and very likely gotten a better result.


> No lawyer, accountant, doctor, engineer, scientist or other professional stands up each day to report *to their peers* on their progress

Doctors do something not far removed. It's really important to them to caucus on case management.

I think other professions may do the same.


> Doctors do something not far removed. It's really important to them to caucus on case management.

Maybe, but it is removed.

From my recollection, doctors don't stand up each day to report on what they have or have not done and their individual progress; they report purely on problematic patients that could do with input from a different expert, and it's not a daily progress report.

If software development followed that sort of practice, we'd have problematic issues added to a pool that gets discussed once a week. What we have is micro-management of each individual. Doctors most certainly aren't being micromanaged like that (well, not the one I was socialising with, and not any of the others I've socialised with).


I'm not entirely sure this is more than a matter of degree. I liked most of your point but I think you devalue aspects of this mode of work discussion. It's important to discuss work and holding yourself to account in a shared endeavour is not of itself bad. I have little doubt it's also used as passive aggressive bullying sometimes.


That's the problem. Anecdotally, for many programmers I spoke with over a 20-year career, it's not "can be used as passive-aggressive bullying sometimes"; it's "almost always". :(


Likewise. Pretty much everywhere I've worked uses standup as a way of bullying people into working more hours, with a couple of unusually relaxed exceptions. It is a far cry from the origins of agile as a means of programmers defending themselves against a hostile organisation. Now agile is the hostile organisation.


I think at this point it should be common wisdom that anything and everything can and will be adapted to yet again introduce a power imbalance. The people who supposedly rule people -- a-hem, sorry, they manage them (heh) -- don't like it when the worker has influence over the process, and especially over the costs and schedules.

So they'll co-opt any newfangled trend into plain old serfdom. And they do.

As another commenter said, we should take a page from the lawyers. A lot of them manage to convert bosses into paying, subscribing customers.


I’ve never seen bullying in stand-up? So I’m skeptical of this “almost always”.

In fact many of the teams I've worked on are fine with people skipping standup if they have any reason to. Currently I skip like half of my team's standups.


Everywhere I've been, ever, stand-ups weren't optional.


That's interesting. I've run into this a lot online. It's obvious in retrospect, but often taken for granted, that people can have such diverse experiences, and that these really influence our opinions and views of the world.


The nice thing about anecdotes is that they don't say anything. For example: I have heard this from none of the programmers I've spoken with over a career a little over half that length.


Anecdotes don't cancel each other out. They are all true at the same time.

So yes they do say a lot, maybe not what you believe but they still do say things.


> From my recollection, doctors don't stand up each day to report on what they have or have not done and their individual progress; they report purely on problematic patients that could do with input from a different expert, and it's not a daily progress report.

Isn't this what standups should be for developers too? It's not supposed to be a daily status report, but a discussion about blockers and an exchange of knowledge from other devs that might be able to unblock you.


At least here, doctors have a meeting every morning.

It could have actually inspired the daily standup. With the tradition of taking inspiration from surgical teams and everything...


It's interesting to consider the difference between project work and event-driven (ops) work.

Engineering project teams doing Scrum struggle when tasks need to be initiated and completed at a faster cadence than their sprint cycle - for instance an engineer doing development work planned monthly gets 3 calls a day asking him to reboot a server.

There are apparently better ways of managing operational work - for example kanban (with or without a board), ticket systems and queues, and processes and dashboards that let managers spot bottlenecks live and re-allocate workers to solve them.

I know (from TV docs) that ERs here generally have a "morning meeting" where the manager briefs people and talks through throughput statistics and demand estimates.

I don't know how to manage an ER department, but I understand a lot of work has been done in that area. What I can say is that it isn't project work you could plan on a monthly sprint cadence.

Burger bar workers will also generally be executing operational work according to a clear process that involves tickets and ownership of problems (tables). I've also seen IT help desks run like this.

What I'm saying is, Scrum is one possible way of running things, but it isn't one-size-fits-all for all kinds of work. But just because people are not doing Scrum does not mean they are not working within a detailed process where they may potentially be micromanaged to the same degree.


Operational safety focused professions (not just ER) have shift change meetings, where they tell the people from the next shift what they should be wary about.

Scrum dailies are very clearly based on those, despite the fact that development work is neither operational nor shift-based. It's cargo-culting all the way down.


>I don't know how to manage an ER department, but I'd understand a lot of work has been done in that area.

Even so, the long work hours of younger doctors are infamously sub-optimal. We shouldn't submit to the assumption that something is right because everyone does it and they operate in a free market.


Are you referring to patient rounds? It seems pretty different to me - that's (usually/when well-staffed) multiple doctors (at various stages of training) working on a single bug (pun very much intended) before moving onto the next one. And it's (for all patients considered together) a significant chunk of the day: what's left of it is used to do the work that arose from the discussion (with some of that left to whoever's on overnight).

If we need to analogise it, I'd say it's more like sprint planning - discussing what's to do on each ticket (patient). Except it's longer and more detailed, and the work that's left is much more like grunt work in a way - book the scan, take the blood, chase results, etc.


> Are you referring to patient rounds?

I think it’s a reference to the Mayo Clinic team-based model [1].

For simple stuff, a single doctor executing a textbook is fine. For complex cases, doctors of different disciplines meet to discuss, create and keep track of a care plan. The analogy being most problems a programmer is tackling are assumed to be novel cases.

[1] https://catedradecronicidad.es/wp-content/uploads/2020/06/Ma...


Sure. I still think I'd make the same argument: that's a team working on one problem, not discussing/managing the one of them working on each of multiple different problems.

Otherwise we could say that any team sport discusses tactics for a game, that's basically a daily stand-up, and it works for them, so why not software.


> that's a team working on one problem, not discussing/managing the one of them working on each of multiple different problems

Pursuing a common goal. But working on different problems. (Is it a blood pathology? Is the immune system misbehaving?) If a team doing a stand-up isn’t working on any sense of a common problem, they aren’t a team and aren’t working efficiently.


I mean, in every show I ever saw where more than one lawyer is on a case, they are constantly talking about how their part is progressing; they probably don't have a standup because they talk about it so much that they don't need 10 minutes a day to let everyone know what's going on.


You do it daily when you're managing life/death-impacting operations (ER, hospitals, factories with team shifts).

Software engineering/IT RUN and SUPPORT activities MAY fit there too, but that does not even mandate a daily review. BUILD activities definitely don't.

And worse, teams mixing RUN and BUILD roles... definitely don't.


Have you seen how lawyers, accountants etc actually work though? It's a shit show, there's little to no project management and stress everywhere. Talk to any lawyer under 40 about their work practices and your mouth will drop. There's no autonomy, just endless pressure, confusing directions and last minute reactivity.

Trello and a standup among teams would immeasurably improve things.

(Source: I know a bunch of big firm lawyers - and ex lawyers who said "Fuck this" and quit)


> There's no autonomy, just endless pressure, confusing directions and last minute reactivity.

So, software development in a nutshell? Maybe I only have seen bad examples, but I've never seen the agile/scrum/whatever-of-the-day stop these things from happening. You have these things and at some point they add the process on top cause "we sure do need a process".


> Every single one of them (all professionals, like I said) was adamant that they would leave any position that managed them no better than a burger-flipper.

Such entitled garbage. Restaurants manage people much more tightly, and often with very abusive practices and wage theft added in.

Stand-up is nothing compared to having an underpaid alcoholic yell at you because you leaned against the wall 2/3 of the way through your second double shift in a row.


Wage theft is endemic in programming. Think of all those colleagues who have worked late on a deadline because somebody pressured them into it. They're reducing their own effective hourly rate and contributing to a long hours culture because of the daily ritual of humiliation at standup. It is very different to the restaurant analogy but wage theft and humiliation are still baked in for many of us.


> but wage theft and humiliation are still baked in for many of us

It's harder to get much sympathy from others as well if/when you're paid much higher than many other averages. Using the term 'wage theft' - normally used in discussions around minimum wage job abuses - for someone who's making $80/hr, (and may get some stock/bonus) doesn't get the same sort of reaction. The term doesn't even feel quite right, but that may be just because I've heard it so often relating to the lower end. I'd imagine it's not even about money so much for many folks as much as it is about time.


But money is time. If my employer feels strongly enough about getting something finished tonight, they'll have to pay for it (e.g. paid overtime). This naturally limits how much overtime your company would like you to do, and puts some backpressure on those managers burning through budget like there is no tomorrow.


When you accept a salaried position, you understand that there are going to be weeks your "effective hourly" fluctuates. Some weeks I have lots of work. Some I don't. I have sympathy but that's not wage theft.

Actual wage theft is very common in food service. As in, you earn the money and someone else takes it.


Most software positions have a never-ending backlog of work that can easily fill 40 hours of any week. That means a salary set at 40hrs/wk is pure downside: your actual job expectations have a floor of 40-hour weeks, but sometimes become 50-60hr+ weeks during crunch times.

It's essentially legalized wage theft.

Thankfully I get paid hourly, but still, when I do overtime it's only paid at a 1x multiplier, because the dinosaurs in office deemed "computer workers" [1] exempt from fair labor regulations. Likely, those in power guessed that they could safely chip away at more workers' rights because most of the voting base won't feel solidarity with highly paid salaried positions and "computer workers".

[1]: https://www.dol.gov/agencies/whd/fact-sheets/17e-overtime-co...


Counterpoint: almost every tech worker I know works like 30 hours a week, in all honesty.


I've worked unpaid overtime at every tech job I've had. On the flip side, I never had any wage issues in 5 years in food service, other than that the pay isn't livable.


So because some professions have it worse developers need to have it worse as well? What a garbage point.


Huh? I was responding to someone saying that stand-ups === being managed "no better than a burger flipper".

I don't agree with that point, especially since food service has the most toxic management of any industry.


What restaurant(s) have you worked for or are you citing?


Seriously? Do you think working at a McJob is unicorns and rainbows? Getting paid minimum wage, being on your feet all day, entitled customers constantly up your ass, a management team that literally thinks of you as a fully replaceable work unit (and you are!!)…

Yes I’ve done it. It’s not the worst job I’ve had but my current job affords me the opportunity to type this from a vacation house on the beach in Hawaii. When I told the people making my lunch last week that I was headed to Hawaii they said they always wanted to go there and hoped to do it in a couple years, like it was a once in a lifetime trip for them. And it probably is.

People need to actually pay more attention to the world around them and have some conversations to realize how well we’ve got it.


I worked a McJob (actually the McJob) from 16-20, and it was probably my main motivation in getting into the CS department at UW (it helped that I was interested in computers and seemed to have a talent for the field). Sometimes I wonder if any of my colleagues had to go through that, but many did. Surely not all of us programmers came from stable upper middle class or better backgrounds.

I did have some really good managers though, people I couldn’t believe weren’t doing better things (and some real bad ones, too, of course).


I worked in various restaurants for about 12 years, but it's not just my experience.


So the programmers are on the level of restaurant workers? Not sure what your argument is.


Responding to the person claiming that stand-ups === being "managed no better than a burger flipper".


No, I've literally had software jobs like that. I've actually had restaurant jobs that were better than some of my worst software jobs.


Hmm. I think there are two characteristics of software development that make it a special case.

Firstly, the error rate. We make "mistakes" (bugs) all the time; how many other professions have a QA department dedicated, day in, day out, entirely to catching errors? Yes, there's audit, but most departments only even talk to them a few times a year.

Then there's the issue that the problem domain, and the design you're trying to implement, is often very poorly understood, or at least poorly documented and specified, even by the domain experts commissioning the project. This is the whole reason we have agile.

Between squashing bugs and constantly course-correcting on the direction of development, I think some of the micromanaging makes sense. There's a heck of a lot of very unstable micro that needs managing. As has been mentioned already, the nearest analogy I can come up with is medicine, particularly for critical patients with poorly understood conditions. I think a great, if completely fictional, example can be seen in the frequent discussions going on between the medics in 'House'. They're constantly putting their heads together and white-boarding cases, though in reality such discussions are, I'm sure, much rarer than the show makes out.


The way those professions work is, generally, completely different to how programmers work.

A lawyer, accountant, or doctor will typically have their own clients that they focus on. They may ask for advice on occasion, but generally one of those professionals will handle many customers' completely unique needs.

Let's look at an accountant, for example: they will have multiple clients, each of which takes up a small portion of their day / month. Each has individual needs, and rarely will they have any real relation or crossover in their work. It'll be rarer still that your client has anything to do with another client of your business, who may be handled by another accountant. The same is the case for doctors and lawyers as well. You have one professional serving a multitude of clients.

Programmers work differently: typically we have multiple programmers to a single client. That client may be internal or external. If your particular case needed 7 accountants or 7 lawyers, you'd be damn upset if they weren't in regular communication about the work they're doing towards your case. If they all came back with totally different suggestions that completely clashed, you'd be shouting at them for not communicating together properly.

Engineers are the only ones where I think there's a stronger relation. You might only work with one engineer for a small change to your house, but on big infrastructure projects there had better be regular communication between related groups, or you end up with your designs failing regulation.

Daily standups are for the programmers in the team to get together, to de-conflict, and to understand what's changing around them. If Janet is about to make a big change to the auth system and you're working on a new API endpoint, you probably want to know, and make sure you're not causing problems for each other.

If your daily standups are actually just 15 minutes each day where only 1-2 managers are benefiting from a status update from everyone, then sure, that's micro-management and probably wasting a hell of a lot of time. Suggest changing it to a weekly call, or having everyone email a summary each week. It's important to report status on a regular basis, just as you'd be pissed if your accountant didn't let you know once they'd filed your tax returns, or if your doctor had your blood test results back a week ago.

Conflating a daily standup with a daily status report though would be frustrating and probably does need challenging, but it's not easy to compare across professions like that. Their workloads are completely different to ours, and so need a different approach.


For me (and in the culture of the 3 companies where I worked), standups were about collaboration, not micromanagement. Far from it, btw.

Your conversation above just sounded arrogant to me.


> The response was both extreme and universal: How the hell do you all accept being micromanaged to such a degree? Don't any of you have any dignity?

And they're totally right. I don't tolerate being treated like that, and nobody treats me like that.


Not exactly. All professions have similar communication. Subordinates report up to their supervisors, and colleagues discuss laterally among themselves, especially when in a team focused on the same project.

> "I mentioned how stand-ups ... might not be a bad idea"

That's what the other professions are taking issue with: the way software teams communicate.

Stand-ups are a terrible idea. Imposed superficial questions with rarely any nuance or relevance while eating at productive time. We need proper measured processes rather than the agile/scrum/random voodoo of the day.


It seems like all the magic processes are a "Now you have n+1 problems" problem. Everyone who invented a new process thought they were fixing what was broken in the last process.


> No lawyer, accountant, doctor, engineer, scientist or other professional stands up each day to report to their peers on their progress.

From my experience when I was a tax accountant: we had weekly status checks for around 20 people which took 1 hour, and different seniors and managers breathing down your neck every day on the progress of their projects.

As a junior, I had to manage the expectations of these managers competing for my time to prioritize their clients. Sure, we didn't have daily standups, but I would choose daily standups over that.


What? In medicine it’s called “rounds” and it happens every shift.


As an engineer, I can verify this. I just send my boss a few bullet points in an email on Friday afternoon describing what I've been up to. I couldn't do the stand-ups that the dev groups do. I frequently need to coordinate what I'm doing with dozens of other engineers, but I never need to be micromanaged about it. Is software really that different?


> No lawyer, accountant, doctor, engineer, scientist or other professional stands up each day to report to their peers on their progress.

This just isn't true. Semiconductor process engineers and manufacturing technicians have stand-ups twice a day. I'm sure most if not all of those other professions have ~daily alignment meetings as well.


> We're still attending stand-ups every day with non-programmers telling us when we can and cannot refactor.

I highly disagree with this.

For some reason, many devs seem to think they are somehow better than everyone else, that they are the most important part of any team, that no one besides them can have any insight on timelines, or budget, or which features are or aren't important.

News flash - all of the above is nonsense. Development is a profession that provides a specific skill, a highly valued one, but one that is just a part of building things in most cases. There's nothing demeaning or wrong about devs having managers, whether or not those managers happen to also have been developers themselves. Some great software managers were never developers, and some great developers are horrible managers, just like in every other profession.

> It's nuts to me that a skilled profession - one that not many can do - lets itself get micro-managed like this.

Every skilled profession has people who are managed. Doctors, lawyers, they all have managers, and most of them interact with the public, who can override their expert opinions.


I think we're definitely better than non-developers at software development. Yet we literally have outside people dictating to us when they think it's appropriate for refactoring to happen. Or when it's time to update a dependency.

This is a lot different from "no one besides them can have any insight on timelines, or budget, or which features are or aren't important". I definitely value the input of other professionals when it comes to their field of expertise. All I want is the same in return.


Again, I'm not defending every single workplace here. Obviously some developers aren't treated well, and in those cases, they should find another job. I was commenting on the fairly blanket statement made.

But the way the conversation should go isn't the developer deciding when refactoring should happen. It's the developer giving an analysis of the cost and benefit of a refactoring (it will take X time, but will save Y work in the future). And the manager factoring that into all the other circumstances, and deciding whether it's worth the current cost.


> It's the developer giving an analysis of the cost and benefit of a refactoring (it will take X time, but will save Y work in the future). And the manager factoring that into all the other circumstances, and deciding whether it's worth the current cost.

This has been the ideal theory for decades, and I really wish people would finally stop to think for two minutes, look around, read some history, and understand that it's not working.

Is it really so hard to accept this reality? I am periodically mind-boggled by the amazingly resilient illusion that you can plan things this tidily. It never happened, not once, in my 20-year career.

Estimating these costs upfront is more or less impossible. It's usually a can of worms that opens other brand new and exciting sub-tasks, the same kind that gives managers nightmares.

IMO you might have projected a little bit here, arguing against some imaginary "us programmers are better than everyone else". Meh. Only kiddos think and say things like that. By and large, almost all experienced programmers I've ever met are quite moderate and humble.

Problem is, they are too moderate and humble, which leads to predatory businessmen micro-managing them into oblivion. I've personally witnessed high-profile programmers firmly saying "No!", putting their foot down, and making a thing their way with a bigger time budget. Their project and code were used, almost untouched, for 10+ years afterwards -- it was that good.

Many of us are expert artisans. You don't go to a professional blacksmith offering him wisdom on how he can accelerate a certain process.


I think the key point for me is that once you control for all common human flaws, the developer is still far better suited to make an executive decision regarding a technology concern than anyone else on the team.

I recently made the mistake of deferring to the anxieties and fears of non-developers for so long that I almost forgot how we got to where we are today or what my true capabilities are.

You know exactly what those assholes are thinking when you say the word "refactor" or "iterate". It starts to become a psychological horror experience if you let it get out of control for too long.

Now don't get me wrong. At the end of the day I love and care for our customer too. I just don't let their minute-by-minute feature request stream live rent free in my head the same way the rest of the team does.


> I recently made the mistake of deferring to the anxieties and fears of non-developers for so long that I almost forgot how we got to where we are today or what my true capabilities are.

Ouch, dude, you didn't have to stab me in the heart. But yeah, 100% same. I am just now, at the age of almost 42, starting to finally crawl out of that. I am ashamed that it took me most of my life and career but oh well.

> I think the key point for me is that once you control for all common human flaws, the developer is still far better suited to make an executive decision regarding a technology concern than anyone else on the team.

True, but then you get blamed for "not understanding business realities". I got super tired of the hints that I "don't get it" or whatever, especially bearing in mind that I launched products whose launch dates I was not consulted about, repeatedly, and still got treated like dirt. I started pushing back on micro-management, fiercely and sometimes rudely.

You know the funniest part? It works. People start looking at you very differently, and suddenly negotiations are an option. I guess it was possible all along, and the business people were like "push him, he's a tech nerd, he'll fold".

We should become aware of how others view us and integrate that knowledge into our negotiation strategy.


Pushing back, standing up for yourself, not being a doormat. All this is key to being treated with respect. The problem is that today, we've got a perfect storm. Developers do tend to be more naturally non-confrontational. Combine this with an abundance of opportunities, and developers will just leave rather than confront and push back.


> It's the developer giving an analysis of the cost and benefit of a refactoring (it will take X time, but will save Y work in the future). And the manager factoring that into all the other circumstances, and deciding whether it's worth the current cost.

I don't think either developers or managers can estimate future savings in most cases, but I still think it's necessary to refactor, just to avoid drowning in complexity and slowing down overall development speed. My approach is to reserve about 20% of capacity for refactoring and technical improvements and let the team decide internally what to use it on.
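
As a rough illustration of what that reservation means in practice (the team size and sprint length here are invented, not from the comment above):

    # Back-of-the-envelope sketch of a ~20% technical-improvement reservation.
    devs, days = 5, 10                       # 5 devs, 10 working days per two-week sprint
    capacity = devs * days                   # 50 dev-days of total capacity
    tech_budget = round(capacity * 0.20)     # ~10 dev-days for refactoring/tech debt
    feature_budget = capacity - tech_budget  # ~40 dev-days left for feature work
    print(capacity, tech_budget, feature_budget)  # -> 50 10 40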


Why 20%? Just curious how you arrived at that percentage.


That's why I said "about" 20% - it differs based on project and situation.

Enough to get useful stuff done, but small enough to keep most capacity for feature development. It also depends on the number of technical tickets deemed relevant by the team.


Pareto optimal


That assumes there are equal weights to the factors.


> It's the developer giving an analysis of the cost and benefit of a refactoring (it will take X time, but will save Y work in the future).

I think I just fundamentally disagree that this is the managers job. He tells me what needs doing, and I get that done as soon as possible within a given block of time (mostly sprints, so two weeks). They should be able to trust me to refactor as necessary within that block.


Well of course you can. But that sounds like a less powerful position. Being able to refactor a scope that fits into a sprint, without communicating it, is (or should be) a non-issue. If your manager cares, that's too bad and sad. Personally, nowhere I've worked has this been unacceptable or questionable. Hoping that's the norm, but maybe not.

I think the more interesting scenario is large tech debts that take real time, like the change itself is weeks, or months. These need to be justified and planned and coordinated with managers.


For non-tech people the code should be a black box; they shouldn't know the internal details.

In terms of refactoring there is no way to give a cost-benefit analysis, as that's too subjective. This should be a maintenance cost that's factored into the project at a high level. So you would say there are X days per year (normally about 20% of the working days of the year) spent on maintenance. The company I'm at has this, and if you're doing this type of work then you have timecodes to book your time against.

That includes refactoring, re-architecture, tech debt, updating deps ...etc. PMs can't override this for their own project, i.e. they have to allow for the 20% of off-project activities in their project plan. Devs shouldn't spend more than 20% of their time doing these activities over the FY.

An example here: in the UK we have something called an MOT, which you are legally required to get; it's like a minimal level of maintenance you have to keep up on your car. Most people spend a little more than the minimum, though.


> For non-tech people the code should be a black box; they shouldn't know the internal details.

Ideally, yes, though in practice this usually breaks in various ways, especially when problems occur. E.g. the President of a country probably doesn't need to be a firefighter, but if there's suddenly a massive fire, I assume he'd need to be briefed in far more detail to help call shots around the type of response.

> In terms of refactoring there is no way to give a cost-benefit analysis, as that's too subjective.

I mean, you probably won't get a very correct cost-benefit analysis, but you have to do something. The rest of your post is you describing that something - if you elect to spend 20% of your time on maintenance, that's effectively giving it a certain amount of weight compared to other things you can be doing with your time. That's certainly one way to work.

Personally, I think a more flexible approach is better. E.g. new startup-ish projects should usually spend a lot less time refactoring, for multiple reasons, whereas more established long-term projects should spend much closer to 20%, or maybe more.


> I assume he'd need to be briefed in far more detail to help call shots around the type of response.

I feel like this is something a lot of presidents (or VPs, or directors, etc.) think, but really, instead of taking time away from the professionals doing the fighting to have the minutiae of blowback explained to them, the best thing they can do is say "I trust you, put it out." And then leave the pros to do their job.


I think that is true when the goal is to get something well understood out quickly.

In other cases, it is more important to get things right and continual refactoring is the fastest way to find an acceptable architecture to a novel problem.

In either approach, when speed is of the essence, a limited scope helps avoid either producing insurmountable technical debt or an unterminating research rabbit hole, respectively.


An MOT is a bad example because they don't do any maintenance at all. An MOT is essentially a safety certificate to make sure you aren't driving around in a car that's due to fall apart and injure yourself and others. How you attain that certificate (by ongoing maintenance) doesn't matter.


No it's not. The MOT is a test of whether you have performed maintenance, so the example stands. You can do 0 maintenance and still get your MOT. Likewise, in software you can do zero maintenance and still have a good, well-working product that follows best practice.

In the case I gave, it's the hours booked against the maintenance project that are the test, but you could very well have other measures, and I've seen these in many projects: things like linters, dependency bots, unit tests, e2e tests, code coverage, cyclomatic complexity ...etc.


I would absolutely love to know what formula could possibly determine time saved in the future.


Perspective might be key here. "Yet we literally have outside people dictating to _you_ when they think it's appropriate for refactoring to happen. Or when it's time to update a dependency."

You work in a place where this happens. You could not.


I don't work anywhere right now :) But as a contractor I move around, and it feels pretty industry standard to me.

Where does it not happen?


> You work in a place where this happens. You could not.

You could, if lucky, but such sane places are becoming more and more rare.

In the 90s (and probably earlier, but I don't know) developers were viewed much more highly than now in all aspects except salary. There was a lot of autonomy and authority, and the role was respected as a professional one. With the rise of agile in the last ~15 years it has been transformed into a low-authority, replaceable-cogs type of job, to be micromanaged by everyone else involved in the product. Salaries have gone through the roof though, so I guess that's the tradeoff.

Maybe there are engineering-oriented companies still with the 90s culture?

(If anyone knows of any such companies, please share, I'll head over to interview there.)


I guess the closest feeling I've had to what you're describing is someone being condescending about matters that are considered non-technical? A PO or manager essentially treating devs like entitled children who can't focus on getting the job done.

My only counter, however, is that this often is the case, and the manager is right that the very overpaid engineer essentially sucks at self-management.

Or maybe management as a whole has simply realised that they need to gaslight engineers into thinking they're required, or maybe a generation of engineers has been promoted into a role they're unsuitable for. All possibilities, all likely.


Counterpoint... doctors do face something like this in most cases. If you work at a large hospital, you can't just do whatever you want to a patient whenever you want.

You follow guidelines and rules for the most part; if you performed surgery before a patient had their same-day scans, you'd be reprimanded by peers. That seems analogous to "don't do a major refactor on release day".


Doctors and lawyers have way more integrity and status, because they are well-rounded people as well as experts. They can put their knowledge to the best possible use.

In the software industry there is a lingering stereotype of the eccentric genius who is really good with computers, and really bad at everything else in life, and whose computer skills make up for lacking a lot of the basic skills of a functioning adult, such as judgement and perspective.

It's expected that software developers are odd and lopsided people who need someone else to babysit them and provide judgement and direction. By now this has become the norm in the industry and an established culture in workplaces, so it gets perpetuated, and it's really hard to even be a well-rounded software developer.


As someone who has managed developers and worked as a developer, I'm not sure that's completely untrue. I've seen bored devops people starting to build their own CI/CD system, massive meetings where people started building their own microservice framework, a dev in my startup who spent a month reworking things into sagas with no consideration for the fact that we have no product-market fit yet, people who insist on "contributing their thoughts" even though they're not remotely qualified or experienced... I've seen millions and millions of euros get wasted on things that don't matter.


> I’ve seen bored devops people starting to build their own ci/cd system

I've seen this too. It added tons of overhead before it was rightly cancelled.

> I’ve seen millions and millions of euros get wasted on things that don’t matter.

To be fair, I've also seen this from the business side: a great number of ill-thought-out, unsuccessful projects that are dispiriting to work on, because you know from the beginning that they will not be a success for concrete reasons, e.g. using an immature piece of software that doesn't support the features required.


This is not unique, at all, to developers.


> Every skilled profession has people who are managed. Doctors, lawyers, they all have managers, and most of them interact with the public, who can override their expert opinions.

Speaking as a lawyer, and as a programmer: your statement is utter falsehood.

In many countries lawyers are not 'managed' by non-lawyers. In fact, most non-lawyers have no say in how lawyers practice, and are there to assist only in clerical/administrative matters. In some jurisdictions, non-lawyers are even barred from taking any cut in the outcome of matters (e.g. possible champerty/barratry), and definitely cannot override expert opinions.

We even have a derogatory word for those who 'manage' lawyers: touts.

In a similar manner, the non-professionals who think they can 'project manage' professionals often over-estimate their own ability at management; I've seen them make plans that have no basis in engineering reality or in technical compliance/requirements, especially when it comes to development timelines. Your defensive posture on behalf of non-programmers is very indicative of the kind of mentality that true professionals must guard against. Professionals must do a job well, and not just be told to rush out buggy, unsafe products by non-programmers, the majority of whom cannot even program to save their lives.


Interesting! So you're saying e.g. a legal practice can never be run by non-lawyers? That does seem to make sense.

Still, what about cases like a legal counsel for a firm? Surely in those cases, the manager of the lawyer is the CEO, who might not be a lawyer himself?


Generally, lawyers can only be managed by other lawyers. Exceptions apply in limited circumstances (e.g. see Australia).

As to your reference to the word 'legal counsel': a distinction must first be made.

The use of the word 'counsel' is often confused by non-lawyers. 'Counsel' is a specific term used to refer to lawyers appearing in court ('counsel of the courts', or 'court practitioners', or 'barristers').

Whereas 'legal counsel' is more specifically used in reference to in-house employees of companies, who handle regulatory compliance, general legal affairs of their companies, and so on. Such legal counsel may be legally trained/qualified, and may have had prior experience in legal practice. But they are not considered 'lawyers', insofar as they are not active/practising members of the bar, and in some instances are referred to as 'non-practising' lawyers. This is about the same distinction as, e.g., a graduate of civil engineering who has decided not to practice as an engineer and has gone on to some other field.

Lawyering is, however, a very traditional profession; the distinction between practising and non-practising lawyers is very marked, and generally 'legal counsel' (as described above) are not treated as lawyers at all. It is not uncommon to find an in-house 'legal counsel' who has absolutely no idea what they're doing in practical/commercial reality, as they've never handled proper disputes or difficult contracts before except in non-contentious settings.

Thus in certain jurisdictions legal counsel are often prevented from performing any acts that must and can only properly be done by actual practising lawyers - e.g. 'legal counsel' cannot represent other companies/people/entities in courts (as they have no right of audience or locus to represent third parties), they cannot give legal advice as lawyers, etc. 'Legal counsel' as referred to in such context may be 'managed' by a CEO, but that's only because they are staff with legal knowledge, and not lawyers per se.


And yet I don't tell my plumbers to give me a standup for the day.

I tell them what I want achieved, and I pay. It gets done. If there's a delay, I expect the plumber to tell me promptly, and I re-evaluate the work.

Some programmers do get treated this way - consultants and such. But the majority are not - they are considered cogs in a machine.


Plumbers that you personally hire to your house are similar to software consultants. They're often more than "just" plumbers - they're small business owners, so they do a lot of things besides plumbing (e.g. marketing, business management, etc). Again, similar to software contractors.

But take plumbers working construction on a 30-story building. They're usually part of a team, and work together with dozens if not hundreds of other professionals. Do you think they don't have meetings where they get together with the other teams, discuss problems, progress etc?

I'm not defending stand-ups here, btw. Sometimes they make sense, sometimes they don't. But if your measure of "developers are treated badly" is "developers are asked to be in a 10-minute meeting to talk about their work progress", I think you're missing the forest for the trees in terms of what are and aren't good working conditions.


Yeah, but it's because your plumbers have judgement and will not start to demolish a whole house because they discovered a suboptimal detail in the initial construction of the plumbing. Or install a high-pressure car wash in someone's bathroom that they obviously don't need.


But the boss of your plumber might demand a daily standup with him or her if he is so inclined.

If you want to be treated as a paid consultant, become a paid consultant, with all the disadvantages and advantages which follow from that.


The plumber at your home is exactly like a freelance developer though? The plumbing equivalent of working as a dev for a bigco would probably be assembling engines in a factory or something like that, and factories most definitely have start-of-shift meetings. Even if it's just "pay extra attention to workplace safety y'all, the inspection might come by today".


> and most of them interact with the public, who can override their expert opinions.

Does the law office receptionist get to tell a lawyer which pieces of evidence should be brought to trial, and which shouldn't? What are the parameters and limits around 'managing' attorneys? I worked in a couple law offices (office stuff and deliveries) and while there was some 'management', it was generally 'top down', and had to do with billable time and court deadlines. Coordination between lawyers to help each other or work on a specific case together happened, but even then the deadlines were generally based on external dates, and were generally not company-imposed ("gotta get all these items done by Q3 for the Q4 launch!")


Sometimes it's a matter of hygiene and humane working conditions. You might be surprised at the amount of codebases that are just the equivalent of working in filth.

If your workspace has shit smeared on the walls, you don't ask if you can clean it up. You just do it.


Not only are devs micromanaged by any figure of authority with little tech knowledge - and sometimes not even business knowledge, having freshly landed from a different industry, if not from school - but devs are also happy to be, quite frankly, treated as mere children.

Other skilled professions don't communicate with memes, or have slides and swings in the office (as seen in some Google offices), etc. If part of geek culture is being infantile, it's no wonder that devs get told what to do.

I don't know if the gamification of dev work went so far because of this, or if the causality goes the other way around.


I've said it before that these "copy paste" and "stack overflow" memes reflect badly on us.

People don't realise that others are forming real life opinions on these memes.


> People don't realise that others are forming real life opinions on these memes.

Maybe, but if any non-developer internalizes a worldview from those memes...that really just means they're blind to the usually worse offenses in other careers. While not perfect, measuring programmer productivity is infinitely easier than a long list of other job roles. At the top of that list is "middle manager".

We take hits on the chin about elitism, sexism, whatever other low-point of human nature. Meanwhile, whoever is spewing the critique fails to notice that in practically any other role, the *-isms are worse and harder to prove.


Forgot to mention the most obvious childhood markers of all: toys and treats! Again, I'm looking at you, Google, whose offices are (were?) loaded with nerf guns, stuffed animals and treats... at least in the engineering buildings.

To some degree I believe management do this on purpose.


I've always worked in small companies and been responsible for getting the work done; as a solo developer, in a lot of cases there was nobody to blame if things went wrong. I'm in a larger company now, but it's still a smaller place, and I have the same attitude to work even now.

Maybe it's this environment that has made me a more serious person than some of the other developers I know. I don't use memes in conversations with colleagues and certainly have no interest in toys and games in the office.

I would love to not be responsible and to treat work as a fun place, but I don't see how that's possible if you care about your job and the work that you produce.


The problem is programmers are relatively immature regarding business, and confuse hobby programming with professional programming. Lawyers might like reading laws, but they are hired to win cases. Programmers will continue to need non-programmers to tell them what to do until they truly accept that code is just a means to an end and that they are really hired to add business value. The test is simple: are they more concerned about the tech, or about the core business of their customers and how they can help them? Being paid for their hobby was nice, but that golden age is over.


I see software engineering more like architecture or civil engineering: even though you are skilled and have your own opinions, you still need to listen to your clients and have managers take some decisions. I agree that the management relationship in the building sector is very different though; micromanagement is the worst.


Lawyers aren't just hired to win cases, just as programmers aren't just hired to "make things work".

They're hired to advise, to negotiate contracts and to ensure compliance with existing and upcoming laws.

It has a lot in common with programming on a systemic level, I think.


Typically, if you don't have a development team that really works directly with the customer (being mostly silent on random calls doesn't count), then without some kind of project manager you'll end up with a lot of work done by the dev team which they falsely assume is necessary (unrealistic edge cases, premature optimization, misunderstood requirements).

Most of the teams I worked with found direct work with the customer way worse than some internal alignment meetings.

That being said, the most effective teams I worked with (as dev and as supervisor) were managed by ex or active senior developers who really put effort into the business side of the work, so they had a full perspective of both sides.

The problem is:

* First and foremost, most developers are not interested in this type of work.

* Not all are capable.

* Most of those who do it still use dailies internally, unless everyone on their team is at the same level (which is extremely rare due to point one).


I'm interested and capable, and yet there are no opportunities to transition. I tried for years to make it happen. People aren't interested in ex-devs for anything business-related, in mid-sized UK companies.

I've gone off to found a company instead.


Maybe because programming in itself is not valued? Changing things with automation is, thus you need the right mix of domain knowledge and solution skills to be successful. As these are often not found in one person (exceptions confirm the rule, of course ;-}), you need a number of people collaborating. Appreciation for the "other" side takes time and can only grow with mutual trust backed up by joint success.

Only pretending to know the domain and micromanaging the "other side" is real and not helpful, as you wrote. This applies to programmers as well: they often mouth off about how they know what would be best for the company when in reality they just don't know all the constraints and background on the business side. Let alone how much of a social game creating software for others really is.


> I feel like we have so much leverage and don't use it at all...We're still attending stand-ups every day with non programmers telling us when we can and cannot refactor.

100%. It is completely absurd to watch. This sort of subordinate attitude pulls us all down. I think the issue boils down to:

* new devs are not going to rock the boat

* lots of people see software dev as a way to be successful within existing corporate hierarchies

* a lot of devs have very little experience of what life looks like with engineers at the helm

There's also the thing of most people seem to need hierarchies to feel secure. I don't quite understand it but I'm probably just weird.

Personally, I will succeed in partially decoupling time for money in my labor. It's just a matter of when.


> There's also the thing of most people seem to need hierarchies to feel secure. I don't quite understand it but I'm probably just weird.

Social organisms usually have some sort of ordering hierarchy. While most of the 'alpha' and 'sigma-grindset' nonsense is just that -- nonsense -- people are hierarchical animals.

It also means you don't need to give a damn about stuff that's above or below your paygrade. You do your hours on X, and other teams can worry about Y.

> a lot of devs have very little experience of what life looks like with engineers at the helm

A lot of devs have very little experience in a lot of fields. The best programmers I've met started coding at 15, went to MIT, and have been in software their whole lives; they have no paradigm outside of what they've been exposed to during their college-intern-jr.dev phases. Most of em that I know haven't ever worked a job flipping burgers or washing cars, either.

Makes it hard to break out of that mentality, and also means that a lot of very smart coders do piss-poor jobs managing & leading. Inevitably the shitty engineer-managers fail, and the business types are dragged back in.


>new devs are not going to rock the boat

I've done this as a new dev. In the end, a few new devs aren't going to change things against stubborn old talent and a large swath of new devs passively standing by, until the proof is there. But that's the exact problem: these systems are so large that you need people willing to risk extrapolating from a PoC, or willing to invest until the proof is there right in front of their eyes. With how risk-averse most companies are, the odds of this happening are minuscule. Unless you wish to invest your own time, effort and risk (which I don't think should be the answer, despite how often it is given).


So, my guess is that non-developers would disagree, having seen Bill Gates, Mark Zuckerberg, Brin & Page, and other developers start businesses that then rewrote large parts of how society functioned, without much caring about what non-developers thought of this. Developers are already perceived by many as acting in a high-handed manner to reshape society without caring about the opinion of that society. More use of "leverage" is not what seems called for.

Now if you're the dev stuck in a bad corporate hierarchy, sure, it looks very different. But programmers as a group have used their leverage plenty, just maybe not in the ways that would always be helpful to other members of their group.


There are such partnerships, they are called dev agencies. Usually good for partners themselves, less so for rank and file employees.


That is exactly the problem with the "Dev Hegemony" book; it spends the first half showing how the current system naturally evolves from very human flaws like ambition and pride, then spends the second half going "Yeah, but what if we all start treating each other as equals (except non-programmers, who get to be non-owner employees only)". World peace would also be an easy problem to solve if you could get everyone to just get along, but without a practical way of getting there it's just a pipe dream.


I mean, I figure all I need to do is find one dev I get on well with to become a partner. In fact, someone floated it with me a few years ago and I was lukewarm because I didn't think I was ready (I kind of regret it).


Starting an agency is the easiest part. Then after that comes the never-ending grind to find and retain customers, who will never pay on time, will refuse to budget time for frivolities like "testing" and "refactoring", and who will mercilessly shop out the work to as many of your competitors as they can find. Developer agencies have existed for decades, and the reason not everybody is rushing to work for them is that they are generally very poor places to work compared to regular companies, even as a partner. They're often derogatorily called "body shops" for a reason.


As an independent contractor for the past two years, I haven't found it much of a grind finding and retaining customers. You have to put some work in, sure, but I don't feel it's particularly onerous. I've never had to take someone out to dinner or cold call. People need software development skills and generally like it when you find them.

As for never being paid on time - I think the latest it's ever been is 2 weeks. Pretty annoying, but not the biggest deal in the world.


> As an independent contractor for the past two years, It hasn't been much of a grind finding and retaining customers....I think the most late it's ever been has been 2 weeks.

When it's you who's getting paid a bit late... it's manageable. When those late payments affect people you have to pay... it's a whole different game. I've been on both sides of this; it's great when it works, and bad when it doesn't. Building up a large buffer helps, but it can take some time to get to an effective place that gives you more than a couple weeks of runway.


Right. I mean I literally want to be co-owners of a business with one other person. Not hire a bunch of subordinates. A literal business partnership.


I'm also an independent contractor trying to find one or two trustworthy partners to go to the next step. I haven't found sales, collections, etc. to be so onerous. In fact usually I'm turning away work.

A few years back I tried forming an LLC with a couple friends, but they both have full-time jobs and they keep getting promoted, so it's never gone anywhere, and we're talking about winding it down. I'd love to try again with someone who is already independent and is ready to try something bigger.

I haven't read Developer Hegemony, but another great book about the dynamics of such a partnership is Managing the Professional Service Firm by David Maister. It's pretty dense, but it's full of great ideas.

I went into freelancing in part to avoid becoming a manager, and I think with the right kind of partnership you could still practice your craft while helping to lead a small team of people, or maybe even just mostly collaborating with other partners. The Maister book has helpful things to say here about whether your firm focuses on work that is repeatable & leveragable (e.g. making Wordpress sites) vs unique, challenging, and done mostly by partners (which sounds more fun to me).

I'm in Portland, Oregon, if anyone is interested in talking about something like that. :-)


Maybe I misunderstand what you hope to get out of the partnership then? The Dev Hegemony book describes a situation where everyone (every dev at least) magically becomes an "opportunist" in the books terms, without going into what will happen to all the pragmatists and idealists. Nor does it have an answer for why the opportunist/idealist/pragmatist divide will not reappear inside the agency.


The scale of what I can deliver is somewhat limited as a single person.


There is a big difference between finding work for yourself and finding enough work to keep a whole team employed, as you would be doing if you ran an agency.


Exactly like any law firm.


> We're still attending stand-ups every day with non programmers telling us when we can and cannot refactor. It's nuts to me that a skilled profession - that not many can do - lets themselves get micro-managed like this.

I mean, that's really doing it wrong. (Yes, I know people call everything "standup" even if, as here, it has nothing to do with the idea of what it's supposed to be, but still...) This is not universal in software.


That is why working under a non-technical manager is a strong deal-breaker for me. Fortunately at Zoho, where I work, the management culture is rooted in an engineering background. Even the CEO himself works on hard problems like compilers and dev-friendly formal verification.

I don't think I'll survive a single sprint under a non-technical decision maker.


> we start operating like lawyers with partnerships

One analogy: lawyering is like singing a never-ending song, whereas software is like a self-playing piano.

The lawyer is needed until the song is finished, which is partly up to him.

A software consultant delivers a self-playing piano and must come back, cap in hand, asking for more work to improve it.

When software is working, it is assumed to be "finished" or "correct". Additional work is assumed either to fix bugs ("earlier mistakes") or to implement new features and improvements (which are optional and involve a sales process).


This heavily assumes that maximizing technical delivery at all times also maximizes value. That's just not the case outside of a component team or very large technology org structure.


Would programmers actually get anything done without micromanagement?

Programmers like the search for beauty and solving interesting problems. They like simplicity, understanding, and privacy.

Users and business people like features that just work and don't care about code quality, and are perfectly happy with spyware business models.

Managers would probably run things into the ground with tech debt if they had full control... but there might be a fear that programmers would turn every project into something like KiCad and Vim put together, written in Haskell, with core use-case features handled by "a simple 10-line user script".

Firefox's market share is probably a good example of what happens when developers make decisions (except web devs and the like, who seem to have different personalities).

Developers are always going to feel like they are being puppeted by morons, because most developers don't seem to like the actual work they're doing and sometimes even wish the product they make didn't exist.

I never see programmers saying things like "I fucking love the IoT! Check out this awesome new API Google invented! Let's replace every kerosene lamp in the world with LEDs".

Instead I see them telling me I should use dd instead of Etcher because Etcher is 80MB.

As moronic as managers can be, at least they somewhat see the value in what they build.


As a non-programmer who has sometimes joined standups as a product owner, this seems like a case of the developers not standing up for their viewpoints in the process. On the teams I have worked with, we have an agreement that some percentage of the time should be devoted to refactoring and similar technical tasks. There may be a specific designation or it may be just a rough approximation, but we respect each other enough to have a conversation about it and come to an agreement.

My role is to ensure that the overall project and deliverable is on track. I will ask questions to clarify statements made in standup. I expect questions coming to me about requirements, intent, and timing. If there are problems with external teams, I will do my best to work out that blocker. I don’t dictate how the devs do their work as long as they are able to deliver the features that were agreed on. If that is not going to happen, we can have a discussion about why and what adjustment needs to be made.

If more refactoring needs to be done, we can have that conversation. Talk to me about the cost and benefits of the refactoring and the impact on deliverables. It may be worthwhile accepting a delay in deliverables now. It may also be that those particular deliverables are important enough that the refactoring may need to be deferred even if the ultimate cost is higher. Those kinds of trade-offs cannot be done only by one part of the team but need to be weighed by the whole team.

This kind of open and respectful discussion is part of a well functioning team. If you are not getting that then raise this as a problem in retrospectives because it needs to be addressed.


This doesn't sound like a correct stand-up to me.

My understanding, and how we are actually doing it: you say what you did and what you plan on doing; no comments are made, it's just an FYI for everyone else. And/or you have the chance to present a problem you are having and arrange a "let's talk later" with someone who can help you.

What you are describing sounds like a daily mini breakdown / refinement.


"I feel like we have so much leverage and don't use it at all."

Devs might have leverage. But in terms of actually contributing to the success of their company, the data seems to suggest otherwise, as far as their importance is concerned.

It used to be all about technical talent and product excellence. But dive deeper into the reasons why tech companies fail today, and you will see it all points to Marketing. Not really in the sense of Advertising/Acquisition, but Marketing and Business Strategy as a whole: Research, Positioning, Model, Pricing, etc.

Basically, more than half of startups fail because they are building the wrong product for the wrong market (Marketing). Less than 10% of companies fail because of issues with the tech side. I am not saying devs are not important, or that there aren't big differences in the quality and speed of their work, but I wouldn't go as far as claiming some sort of god-like status when dev work is not really where the future of the business is decided (unless it's a product involving some extremely unique technology, and most aren't). As Brian Balfour says, the game has changed; it's all about distribution now.

That being said, I wouldn't tell a dev how they should do their job. I can only lead them in terms of what we should be working on - what part of the product or what feature - but how exactly to code it? Well, that's of course up to them. We don't really micromanage in our company. We just all sync on priorities and then trust each other to execute as per everyone's best judgment. But yeah, we're a tiny company.

That, in a sense, could be advice: go work for a startup. Corporate micromanagement is a choice. To be honest, in a larger organization I would probably be micromanaged too, even as a non-dev, and I would fu*king hate it. Don't think devs are special in this. It's not like managers understand Growth and Product work either.


I totally see your point of view. However, in my experience there is not enough developer competency for this to fully work. I can imagine lots of developers going off-task if things worked this way - there would be too much freedom, and with that, temptation to over-engineer and re-engineer.


Amen - I tried to convince a group of people I'd been contracting with to set up a partnership model along these lines, specifically so we could carve out that kind of control. No such luck; people were happy because day rates in the UK were (and are) ludicrously high.


Why does the height of day rates affect it? (Honest question.)


Your willingness to try new ways is probably inversely proportional to your current wage.


Ah, sure, fair enough. Thanks.


> turn bosses into customers

Congratulations, you just invented software consulting agencies.

But jokes aside, what you want sounds a lot like developer anarchy (which in practical terms means replacing technical management with direct communication between all developers in a team), but there are two problems that are apparently easy to overlook with this approach:

1. Teams still exist in a company that is likely larger than just a single development team. You can use direct democracy to appoint representatives and delegate political power, but unless the entire company is a worker coop, there are limitations to the team's autonomy and there is ultimately a monopoly of violence at the top (i.e. the CEO can fire you, you can't fire the CEO) crushing down on you when a conflict can't be settled.

2. Companies still exist within capitalism and need to make a profit to be sustainable. The ugly truth about turning bosses into customers (i.e. going freelance or becoming an agency) is that customers are still your boss, there's just a chance you'll eventually have enough of them to be able to fire them and not kill your company. But to have any shot at getting there you'll need to rake in cash, not just to cover your expenses but also to build up a runway.


Just because you're standing up, and it's daily, doesn't mean you're having a standup. You may just be part of a cargo cult.

A standup is a communication meeting, where only those actually doing the work should be communicating. Other people should listen in and ask questions afterwards. There are plenty of other opportunities for project managers and others to have an impact. Impeding the team's communication, even if well meaning, is counterproductive.

But, like most things, those who have power will always find a way to subvert available mechanisms to enhance their power, even when it costs more, takes longer, and pisses people off.


None of this will happen until we have an objective metric for determining the tangible value provided by a developer.

So, so many devs are great at building interesting tech, refactoring to more interesting tech, and so on, but entirely miss the mark on optimizing actual, upstream value delivery.

I'm sure other industries have their means of demonstrating that this or that practitioner can both deliver and deliver quality. We don't have this at all in the software industry. Furthermore the overly common team processes and team structures in use don't even allow for demonstrating this.


> we start operating like lawyers with partnerships, and turn bosses into customers.

I've definitely seen a number of consulting companies follow this pattern. This is one successful business pattern, no reason why more devs can't do this.

However, someone needs to be the salesperson and move away from programming. Does that happen in law offices as well? (I believe so, but that is mostly from reading, not first-hand experience. See also https://www.merriam-webster.com/dictionary/rainmaker )


Are you honestly telling me that developers know best? All that leverage and you just want to not listen to anyone? Honestly you'd get further not leveraging anything than doing that.

I agree with your last paragraph and your first, but that second one is a doozy.


YMMV. All of the companies I've worked for have been developer run besides one, which was basically in a permanent state of collapse as everyone left a few months in.


>If anyone has read Developer Hegemony, I'm fully on board with that general premise - we start operating like lawyers with partnerships, and turn bosses into customers. Though that does require us to think of ourselves as professionals not nerds who are too smart for business.

I own a copy of Democracy at Work and know there are a number of software and IT cooperatives out there...


When your bonus is tied to making your incompetent boss happy, I'm sure you'd sit quietly through a horribly run standup too.

I don't see what is preventing anyone from operating like that today. I do that, and I know many others who do. Yes, sometimes there is cat-herding to be done, but boundaries are needed or else you become a doormat. Stand up for yourself; time is limited in our lives.


I recently turned down a promotion to "Lead" because I witnessed the dynamics between the previous leads and the Project Manager.

A Lead pretty much has to bow his head even to junior PMs.

It's like you're there to solve problems, and your opinion is only needed in moments of crisis. Even then, people only defer to you in an almost disdainful way.


One of the best standup habits we had was just posting quickly in a slack channel at a certain time saying, "no update", or "blocked", it was low touch.

I have always thought that a standup is just a way of being micromanaged. If an engineer is blocked, why would they not just figure out how to get unblocked? Isn't that their job?
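
For what it's worth, that async habit is also trivial to automate. A minimal sketch, assuming the team uses a Slack incoming webhook (the URL and the name below are placeholders, not anything from the actual setup described above); run it from a cron job or a reminder at the agreed time:

    # async_standup.py - post a one-line status to a Slack channel
    # instead of holding a synchronous meeting.
    import requests

    WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

    def post_status(author, status):
        # Slack incoming webhooks accept a minimal JSON payload.
        resp = requests.post(WEBHOOK_URL, json={"text": f"{author}: {status}"})
        resp.raise_for_status()

    post_status("alice", "no update")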


The blocker-identifying part of standup is there to figure out when someone doesn't realize they're blocked and/or is spinning their wheels a bit. If someone knows they're blocked, they should reach out immediately.

Instead, standups provide the team the ability to identify roadblocks the developer might not realize. Perhaps you're working on something and I've been down that path before, and know your solution won't work for reasons you're not aware of. I can tell you that now and save you several hours of banging your head against the wall.


Has this ever actually happened though?


Frequently, especially with junior developers.


Your point on leverage feels true. I just wish we’d use it to finally move away from the Google model of interviewing. Rather than having people do useless exercises on whiteboards we could do a 30m “are you a jerk” interview and then let the first year of job performance speak for itself.


> It's nuts to me that a skilled profession - that not many can do - lets themselves get micro-managed like this.

Physicians at hospitals know your pain, as their product-manager equivalent is hospital administration micromanaging how they can practice medicine.


That book really resonated with me; as in, it vibrated at exactly my resonant frequency and, like a wine glass, broke me. I couldn't enjoy professional programming in the same way once the veil had been lifted.


The last part is why. The general stereotype of developers on the biz or design side is that they are smart but incredibly narrowly focused and refuse to comprehend anything outside their domain.


Well, the customers must have a say in the planning and the tradeoffs that are being made :)

But what grinds my gears is when managers do not apply the same planning principles for their own tasks and deliverables.


Erik Meijer has a great talk addressing this https://youtu.be/2u0sNRO-QKQ


That's such an awesome idea and would do wonders for the dignity of this profession. I haven't read Developer Hegemony but I absolutely will now.


Non-developers are losing ground and they know it.

So they try to pigeonhole devs into non-business directions and infantilize them with stuff like ping-pong tables.


I feel like lawyers enjoy this relationship because they control how many lawyers are admitted to the bar.


This comment kicked off one of the most interesting internet threads I've read in my life


I have hopes that remote and DAOs will be the tools to do this.


> non programmers telling us when we can and cannot refactor.

If these people weren't there, half of the programmer population would be stuck in an endless refactoring loop.

Programmers are tools, tools need people to use them


I don't care for the dehumanization in this comment.

Sounds like the thinking that calls employees "resources" and considers them fungible cogs.


There is no dehumanisation intended. Everyone has a place and a function. You don't let a car mechanic drive during the race, or a jet pilot do the maintenance on his plane; well, same for a programmer - you don't let them make project/business decisions, because it's neither their job nor their responsibility, and most of them are inept at making decisions outside of their little world. That's why we have project managers, executives, etc.

I've worked in companies in which programmers were gods and decided pretty much everything; I felt like I was back in uni projects, and that's not a compliment. Tech and codebases are virtually irrelevant in most companies compared to project management and business decisions.


Your analogies are not helping your point :) Programmers are not the ones driving the car or flying the plane - they are the mechanics and engineers.


I think it ends with everyone becoming a developer.

Making machines do things will never end. However, the tools for making machines do things are evolving very quickly.

The general art of being a developer will continue to get specialized, and at some point specialists in any field will be expected to be able to program their machines.

Very few people are learning how to run their own servers; many more are moving onto AWS-type infrastructure, and "cloud literacy" has become a thing.

I would expect layers of abstraction to continue to build up and programming to become something like using spreadsheets to do your actual task.

Eventually, "true programming" will scale back to building these tools. It would be rather niche, very high-skill, serious engineering - an elite, hard-to-get-into profession that continues to fetch high salaries. These programmers will probably have some kind of association similar to lawyers/doctors; the code they write will no longer tolerate bugs as a fact of life, and they will be liable like any other licensed professionals. Skipping unit tests will land you in jail, or a very high fine for malpractice, if something goes wrong.


I think this tends to overestimate the amount of time folks who aren't developers want to spend developing. Most spreadsheets, even of the complex variety, exist as steps in a chain of human decisions -- they basically replace notepaper and calculators. (Spreadsheets that act as inputs to software chains are usually created by developers.)

Having previously been involved in architecting and building enterprise low-code/no-code solutions, and then working as a strategy consultant thereafter (including partnering with other major low-code/no-code vendors), I can say that what the business wants to do is sketch out the rough plan of how a process works (with a couple of exception states) and have someone else go through not only the implementation, but the hard work of dealing with all the error states, data cleansing and ETL, deployment, and monitoring, while they get on with the hard work of actually doing their jobs. There just isn't an appetite for people to do two jobs at once.

What the article seems to miss is that companies are currently being much more aggressive in nearshoring work -- there's a huge boom in Mexico and LATAM development that's pushing up salaries there (as has happened in Eastern and Central European states). One thing that's distinct from the offshore boom is that, if you weren't a major consulting org, your offshore folks probably weren't colleagues, whereas with nearshore you're much more likely to see them integrated into the organization (again, due to TZ overlap and the ability to fly someone in at the last minute for meetings). It'll be interesting to see what this does to the Mexican/LATAM economies -- Costa Rica, for example, with only ~5 million people, punches way above its weight in software services exports.


Couldn’t agree more with your second paragraph. A phrase that’s been drilled into me in the past year is “undifferentiated heavy lifting that’s not core to your business”.

The things you listed just don’t matter from the business point of view. They don’t care how the app works, who runs it, what cloud provider is hosting it, what tools were used to build it, only that it works when it needs to and they will happily pay the bill so their employees can focus on making the company money doing what they are good at. Zack Kanter from Stedi talks a lot about this.


We humans reject the concept of exponential growth, especially when it can kill our way of life, and in that sense developers don't realize that 80% of our work could be automated in the next few years.

I've been in the industry for more than 20 years now, and what's clear to me is that a lot of time is spent coding and debugging things that have been done plenty of times before (Not Invented Here syndrome) and that in most cases are not core to our companies.

What I expect is not so much things like Copilot or AlphaCode (though of course they will be used) but serverless services that we'd plug into our solution (how many more login services do you want to implement AGAIN?), like we do right now with APIs, but at a higher level, with a standardized communication protocol between these services.

The same will happen at the infra level, with only a few people creating and maintaining "low level" solutions while the rest of us use abstracted services on top of them, like using Cloud Run instead of learning a massive API like k8s (I'm not saying they're comparable right now, but at some point something evolved from Cloud Run will make learning k8s unnecessary).

What will happen once developers are freed from the most time-intensive aspects of their job? Probably, for a few years, the unmet demand for developers will absorb our increase in productivity, but at some point I expect this job will face some of the problems you can see in other sectors.


While this seems intuitively true, it seems to me like the opposite is happening. Every advance or increase in technological efficiency creates exponentially _more_ demand for developers. Think about every tool and library that comes out; all the JavaScript front-end tooling which made things "easy" has created literal armies of front-end engineers. Going even farther back, think about Java, PHP, C++: all of these came out to make developers' lives easier and ended up creating more and more developers. We can even see it in the invention of high-level languages over machine code; that's also a human simplification.

Personally, I don’t see the train suddenly stopping and going in reverse anytime soon.


We are used to Jevons paradox. It must end at some point, but AFAIK there is no way to tell whether automating 80% of our work will put us out of jobs or increase our salaries.


You don't need fancy AI to make us unemployed. Just going back to basics would do. You list complex solutions which would just need even more programmers to maintain.


Was coming here to say this. Every time a more sophisticated system has been built to replace a simpler one, it has required MORE programmers, not fewer.


Yeah. I think there is way too much "automation" nowadays.

It's like all these business systems corporations use. They would probably be better off with some secretaries with typewriters sending internal paper mail.

Systems that try to automate too much are too rigid to use in a sane way.

The nice thing about a no-computer system is that if you want to do something, you just do it. An analogue would be writing math notation on paper versus on a computer. I can hardly imagine even Knuth preferring the latter.

Computers should be used for well-defined tasks they are good at, with rather simple programs, or the whole business needs to adjust to what the computer allows.


Well, I think the demand for developers is not so much because of the increased complexity of systems, but because a lot of companies think they need developers when they only need technology, and a lot of others think that just adding more devs is the solution (the hyper-inefficiency of the hypergrowth companies is at this point legendary, as the guy earning $1.5M/y by getting and losing - but getting paid for nonetheless - multiple tech jobs at the same time proved).


    I would expect layers of abstraction to continue 
    to build up and programming to become something 
    like using spreadsheets to do your actual task.
But in general, people have been saying this for 50+ years.

It hasn't happened yet for (at least) two reasons.

1. Increased demands on programmers have increased at least as quickly as their productivity has skyrocketed.

2. Non-trivial systems will require some non-trivial encoding of business and data logic. You can simplify the tools all you like, but at some point the tools are no longer the limiting factor and some complex logic has got to be dictated to the dang computer, somehow, by somebody. It is then going to need to be tested and deployed. It is also going to have to be built in ways that are maintainable. Most programmers struggle with these things in 2022; we are nowhere near making this easy for laypeople. If anything I feel that infrastructure has gotten way more complicated in the last 10 years.

Something does need to change. The current situation is unsustainable, with programmers commanding insane salaries. At some point it becomes a limiting factor on the economy.

But our tools are decades away from being usable by non-programmers. I see no possible solution besides increasing the supply of programmers.


Just because it has not completed doesn't mean it isn't happening. These days a very small number of people know how computers actually work (how many know assembly or even C?), but there's a huge number of people building their businesses by programming machines to do things.

In the last 20 years, people have been using languages close to English to express their ideas to a computer and make it do work. You touch computer science stuff only when you dive deep into something that hasn't been commercialised completely, and when the market is large enough it quickly develops into services where you tell the computer to do things in "plain English". The cloud is a good example: 20 years ago you would have had to understand the architecture of server systems and networking; these days it's just a form you have to fill in, expressing some rules about accessing the resources in "plain English".

Similar abstractions have been happening everywhere; from AI to game development to graphics generation, everything is so abstracted that all you need is to learn the tool through a few tutorials to be productive.

Many people have been starting from the high-level abstractions and diving deeper later on for many years now. You first build your multiplayer game, make some money, and then maybe you learn about stuff like DNS and TCP/IP. That's only possible because networking and server management are completely abstracted, and the game developers only interact with the cloud provider's SDK, which handles all kinds of edge cases and optimisations.


This seems perhaps an idealistic take on how effective the layers of abstraction are.

To examine just one of the examples: >> these days it's just a form you have to fill in, expressing some rules about accessing the resources in "plain English".

I'm of the opinion that getting the cloud to run your systems effectively, and without blowing a large hole in your budget, may be a little harder than that sentence makes it seem.


Because when you configure your AWS like this, you get public buckets and so on. With private datacenters, Dev Ops can mean Dev can be responsible for their own stuff; with AWS, DevOps is there to keep dev from sending all the company data to the whole world.
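
To make the public-bucket point concrete: this is the kind of guardrail DevOps ends up owning. A hedged sketch using boto3; the bucket name is a placeholder, and this is just one control, not a full account policy:

    # Turn on S3 "Block Public Access" for one bucket, so a fat-fingered
    # ACL or bucket policy can't expose the data to the whole world.
    import boto3

    s3 = boto3.client("s3")
    s3.put_public_access_block(
        Bucket="example-company-data",  # placeholder name
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )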


As far as expressing things in plain English goes, I'd think of the various voice boxes. I can get the computer to turn my lights on and off using plain English, along with some other things.


Those are simple states - binary on/off states for the lights, etc.

It falls apart fast when you start trying to explain actual engineering stuff in plain English: iterations, complex nested conditional logic, error handling, massaging and sanitizing data, etc.

You could describe each of those steps in English, but what have you gained at that point? You've still had to express all of the same exact complexity, but you've done it in a language that is far more clunky than a programming language designed for the task.

It's like trying to tell a musician how to play one of Beethoven's symphonies using English instead of sheet music. Even if you could, why? You've expressed the same exact thing, but in language that is orders of magnitude clunkier.
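
To make the clunkiness concrete, here's a minimal sketch (the field name is invented for illustration). One small rule in English: "for each record, if it has an email, strip the whitespace and lowercase it; drop any record whose email is still empty or has no @ in it." The same rule as code:

    def clean(records):
        # Implements exactly the English sentence above - shorter, unambiguous.
        out = []
        for r in records:
            email = (r.get("email") or "").strip().lower()
            if email and "@" in email:
                out.append({**r, "email": email})
        return out

And that's one trivial rule; real pipelines stack dozens of them, which is where the prose version collapses.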


Oh yeah, I remember people saying that about SQL a long time ago. People wouldn't need database engineers any more.

Same with Visual Basic. No more application developers. Just drag and drop!

You might ask yourself why VB6 or Delphi didn't simply rule the world forever. We had those ~25 years ago and with those tools, non-engineers could drag and drop their client/server database-CRUD apps together. Hell, you could do most of that stuff in FoxPro or dBase even earlier, minus the GUI.

    You touch computer science stuff only when you 
    dive deep in something that hasn't been commercialised 
    completely and when the market is large enough it 
    quickly develops into services where you tell 
    the computer do things in "plain English".
You're fantasizing away the enormous middle ground of development work that lies between "hardcore computer science-y stuff" and "things non-engineers are able and willing to accomplish."

The "plain English" bit is hilarious. Human languages are extremely imprecise. That's why we have specialized "languages" for lots of things: math, music, etc. Maybe you are living in another universe where AppleScript took over, I don't know.

Ask yourself if "plain English" tools have replaced engineers in other specialized disciplines. Can I tell my keyboard to write me a song in "plain English?" Can I design an airplane in "plain English?" Can I build a bridge in "plain English?"

    These days a very small number of people know 
    how computers actually work (how many know 
    assembly or even C?)
Sure, only a small % of work requires low-level languages, or knowledge of how a CPU works.

But the bottom line is, somebody needs to describe complex processes to the computer in a very precise way that handles lots of edge cases, filthy data, errors, etc.

At some point, somebody has to write that logic and if there's more than one person doing it they'll need a development pipeline of some sort. And they'll need to do it in a maintainable and scalable way.

The tools are generally not the limiting factors here. You could replace Javascript and Python and Ruby with world's friendliest, code-free GUIs but you'd still have to describe those processes in an equal amount of detail and do it in equally maintainable and scalable ways, and I doubt it would be as terse or as expressive as the aforementioned languages.

    Similar abstractions have been happening 
    everywhere, from AI to game development 
    or graphics generation everything is so 
    abstracted that all you need is to learn 
    the tool through a few tutorials to be 
    productive.
I am not denying the power of these tools. However, I see these more as "opening up new avenues" as opposed to "eliminating the need for all kinds of nitty gritty development work that actually makes businesses go."

Drag and drop all the sweet zero-code shit together you want in Unreal Engine; somebody has still got to sanitize and massage the weirdo sometimes-broken JSON that our partner's API likes to return, and handle the error states when it can't be fixed.
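
That last chore looks roughly like this in practice. A hedged sketch (the payload shape and field names are invented; no real partner API is implied):

    import json

    def parse_partner_payload(raw):
        # Returns (orders, warning); never raises on the partner's bad JSON.
        try:
            data = json.loads(raw)
        except (json.JSONDecodeError, UnicodeDecodeError) as e:
            return [], f"unparseable payload: {e}"
        # Sometimes it's a bare list, sometimes wrapped in {"orders": [...]}.
        if isinstance(data, dict):
            items = data.get("orders", [])
        elif isinstance(data, list):
            items = data
        else:
            items = []
        # Keep only dicts that have the one field we can't live without.
        orders = [i for i in items if isinstance(i, dict) and "id" in i]
        dropped = len(items) - len(orders)
        return orders, (f"dropped {dropped} malformed items" if dropped else None)

No drag-and-drop tool writes the "sometimes it's a bare list" branch for you; someone has to know the partner does that.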


I very much agree that complexity is the limiting factor, and you will probably need specialized people to model and design not only what the computer does but how the humans and the computer co-operate. When you're programming, you are not only programming the computer but also the people who use your application(s).

Describing and designing and validating asynchronous processes is especially complicated.


Yes, at least at scale, BASIC is hard!

It's a wolf in disguise due to all the "easy" English words - words that of course need to be put in the correct, often grammar-violating order for the compiler to make sense of the instructions and intent, so that nothing is won anyway. You end up with a language design that places more importance on a bastardized form of English, like a text-based adventure game, than on being a good map to the problem space that doesn't get in the way like a bumbling fool.


Yeah. I cannot believe that the "describe things to the computer in English" fantasy exists in 2022. At least from folks within the industry; it makes sense for laypeople to ponder it.


Beautiful summary!


Something that I think about is that we live in the greatest tech-driven economy in the history of humanity, and it grows at a rate of about 2% per year. What I don't believe is that every programming activity contributes equally to that growth. By "activity" I loosely mean anything from a solo programmer to a large software business. Some are assets to the economy, others are liabilities, yet they pay the same salaries either way.

Economic growth is close enough to 0% that a decent rule of thumb is: whether any specific activity is an asset or a liability to the business that engages in it is a coin toss.

This is reflected in comments made by developers themselves, to the effect of, "I don't know why we're doing this, it's going to cost a lot of money and make things worse." These kinds of comments are about projects that they're working on themselves.


It's hard to say, because so much of the work of keeping a company running is so far removed from income generation or anything else.

Does cleaning the company toilets contribute to the company's growth? I mean, not really, but on the other hand if nobody ever does it then it becomes an impediment.

A lot of programming work is like that.


With work from home, everyone's responsible for cleaning the toilets, rather than having a dedicated cleaner.


> "I don't know why we're doing this, it's going to cost a lot of money and make things worse."

I'm a step outside the domain so not totally confident in my own cynicism, but I feel like this phrase could be applied to practically every B2B startup I hear about these days. Selling CRM, project management and time tracking tools to (mostly) other VC-funded companies is just a way to skim more cream out of the economy without adding any true value.

I try not to judge individuals for working in these places; working at AWS I'm hardly in a position to throw stones. Increasingly though I am dismayed by the lack of real, society-enhancing jobs available to highly skilled tech workers in the sea of mediocrity and rent-seeking.


   every B2B startup I hear about these days. Selling CRM, 
   project management and time tracking tools to (mostly) 
   other VC-funded companies is just a way to skim more 
   cream out of the economy
I think it's OK to an extent. Not all of those apps are good, but the best of them do enable businesses.

   I am dismayed by the lack of real, society-enhancing 
   jobs available to highly skilled tech workers

I think the root problem is a lack of developers. The resulting crazy high developer pay scale prices smaller orgs completely out of the game.

We need to increase the supply of capable developers.


You better not look too closely at the rest of the economy then because most of it looks like that.


You mean we don't need 5 different grocery chains at every highway crossing selling the same 100 products under 500 different brands from 5 companies?


Competition between profitable businesses is one thing, loss-making startups paying 6-figure salaries for shitty tools is quite different - particularly when the latter company ends up with a billion-dollar valuation for reasons totally divorced from actual worth.


Could also be that much of the work is zero-sum. For example, ads to some extent just shift consumers from one company to another without actually creating anything (although some might create new demand). The stock market exists to allocate resources more efficiently, but some of the shorter-term stuff looks more like pouring a ton of math and software work into coming up with better gambling strategies...


> coming up with better gambling strategies...

"better gambling" means better predictions of the future. this means making use of all available information, which means higher market efficiency.

It's not useless to do better in the stock market.


> Economic growth is close enough to 0%, that a decent rule of thumb is that whether any specific activity is an asset or a liability to the business that engages in it, is a coin toss.

Well, you completely forgot about depreciation.


> 1. Increased demands on programmers have increased at least as quickly as their productivity has skyrocketed.

So does anyone know what the most productive programming tool is then?


I think it's the code library. You import stuff and use it right away, without needing to know 99% of the internal details. That has empowered us to do amazing things quickly. But having to deal with 100,000 libraries has become its own problem.
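To make that concrete, a minimal Python sketch (the third-party `requests` library is just one example of such an import):

    # One import hides sockets, TLS, redirects and text encodings.
    import requests

    page = requests.get("https://example.com")
    print(page.status_code)  # e.g. 200, with no networking internals in sight

One line of someone else's library replaces what would otherwise be hundreds of lines of our own plumbing; the flip side is that every project now drags in a dependency tree.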


I think it ends with everyone becoming a developer

Not in a million years. When I was 10 I got my first computer and spent 12-16 hours per day on it every day learning BASIC, 6502 Assembly and many other languages.

All the adults at the time thought, “this is how all kids will be from now on!”

Except none of my peers did it. The generation after me only had a small percentage of people truly interested in computers and it hasn’t grown much since then.

I’ve interviewed a couple hundred computer science graduates from top schools for jobs and the vast majority of them didn’t touch their first programming language until college.

And these are computer science graduates! A large percentage of the total population is outright technology hostile and proud of it.


> Except none of my peers did it

That's a good point. I still remember how excited I was writing my first app in BASIC at school:

    What is your name?
    > John
    Hello John!
I was so excited I tried to copy the program onto a floppy disk to play with at home, but I could not figure out why it didn't work. I now realize the school used Apple IIs and my parents had a PC at home.

I've always thought that because today's generation grew up with smartphones, tablets, and ultra-realistic games, no one would be interested in getting started with coding. But reading your comment I realize the whole school was exposed to the same class and very few of my classmates went on to major in CS.


How is not touching a programming language until college a bad thing? Some people just want to make a living out of software engineering and then go home and they aren't sweating over Web 4.0 or whatever we are on now.


There are exceptions to the rule. I know good engineers who started quite late in life.

But in general, I don't see a deluge of kids eager to become developers. Still seems very niche just like it always has been.


The other thing that gets missed is that people who start young are so interested in computing that they have lots of adjacent knowledge.


> I know good engineers who started quite late in life.

My experience matches yours exactly. Almost all of the good engineers that started late in life in my experience are just plain extremely smart and would excel at anything they did.

One friend who comes to mind wrote his PhD thesis in chemical engineering and then switched to being a software engineer. He is too motivated and too smart to fail at it, even though he doesn't have what I consider that intrinsic spark that leads someone to pursue computers at an early age.


Yeah: the vast majority of software engineering work is far from rocket science, and any reasonably intelligent (upper 50%? upper 25%? upper 10%?) person can do it.

To me the real test is mentality. The spark. You have to enjoy it enough to do it for 2,000+ hours a year. That seems rare.


18 is awfully old to start learning something for the first time. Imagine saying the same thing about math, reading, or history. It's harder to pick things up when you're older, and it leaves fewer people capable of it. See also: learning human languages.


This is the saddest thing I've read on this website. I hope for your own sake you allow yourself to learn new things in the future.


It's not about what you're allowed to do, it's about what professionals in a field need to do to get to the top.

I've been programming since I was 8. I've been drawing since I was 29. I will probably never reach the same percentile of drawing ability that I have with programming ability. That's okay for me because I enjoy it anyway, but from the perspective of training people to do their best work, starting younger can lead to higher ability.


There's a huge gap between what you said, which I think is totally reasonable, and saying "18 is awfully old to learn something new". Of course learning something when you're younger likely makes you better at it compared to starting when you're older, but an 18 year old is statistically not even through 1/3 of their life.

Not even considering the number of people who get into their trade at such a young age. I'll honestly be surprised if they make up any significant share of the population, and I'm saying this as someone in that boat.


People don't generally tend to feel this way about training to work in medicine or HVAC or law. Or even language, really. Obviously children learn how to use language, but training for a professional job as a teacher or a translator (or whatever) doesn't start at age 8.


Kids learn how to read, write, and speak at least one language. Kids who learn two or more presumably turn into better translators. Potential lawyers have access to debate clubs and are trained in public speaking. I know one HVAC technician, and he's been working on cars and bikes with his dad since early childhood. Even doctors get taught biology.

I pretty firmly believe nearly anyone can learn to contribute in any area given enough time, but there's a lot of time between 0 and 18, and it's prime learning time when brain plasticity is high and responsibilities are low.

Is it really that controversial to say childhood shapes a kid's future?


No, the controversial statement in question was the claim that "18 is awfully old to start learning something for the first time" wrt programming.


It's late only if you want to become a top programmer. Top programmers are like top pianists, they start early and are very motivated.


I started at around 10, but there are plenty of programmers I look up to that didn’t. It’s an indication of intrinsic motivation and genuine interest, but there are plenty of people who find that much later.

Even today, 25 years later I feel like I learn things more effectively than when I was a kid. Certain types of skills benefit from experience and networked thought. Time per day/week is a much more limiting factor though.


It absolutely isn't in this context.

You start to learn maths in kindergarten. You don't start to program computers there, usually. There are very good reasons not to do that (screen time!), but doing so would make learning to program easier later in life.


That learning to program early makes later programming easier is an uncontroversial, anodyne statement. Saying that 18 is too late to start learning is controversial, because it discounts everyone who learns at that age or later.


I'd argue learning programming is a different skill compared to memorising laws or biology, it's math more than anything else (especially logic, set theory).

Lawyers and doctors basically need to memorise facts, apply rules and make deductions. Developers need to do that plus math, which is why starting early helps, exactly as kids who did math or chess early on are better than peers who started later.


Of course we try new things. The things you learn as a child get internalized very differently than the things you learn as an adult though.


This is absolutely false. I became fluent in Japanese in my 30s. Learned guitar in my late 20s. This is so wrong. You can continue learning until you are dead.


(edit: I started learning Japanese at 18. I'd say) I became fluent in Japanese in my late 20s. I'm close to 44. My wife is Japanese. I live in Japan. Japanese middle-schoolers (edit: ok, maybe high-schoolers) are more fluent than I am.

GP is not saying you can't learn. GP is saying you're unlikely to be more skilled than those who started earlier. Also, past a certain age, there's a more limited amount of time you can dedicate to learning new stuff.


I think the time factor is likely the most dominant when it comes to learning new things. The beauty of being a young learner is having much more time to do so due to having less responsibilities.


Of course, but you are not likely to speak like a Japanese person - maybe you will, after a lot of practice and assuming your native language has sounds matching Japanese (I don't know about Japanese, but there are some guttural sounds in Greek you won't be able to learn as an adult unless you were exposed to them as a child). Meanwhile, a newborn will learn to speak like a native Japanese speaker in 3-5 years.


> I became fluent in Japanese in my 30s.

Me too, but I will never reach close to native level of proficiency. It's fun but just that.


I started learning to program in college. I am an EE, so maybe CS is a bit different. But I remember how intimidating it was to be in my first programming class with the kids that already knew "everything". So I bought a C++ book and worked through it cover to cover. Honestly, I had caught up with most of the kids that already knew how to program after a couple of semesters or so, and helped a bunch of them with our programming lab. I am sure there are some exceptional prodigies that are way better than I am, but it is quite impressive what one can learn even at an older age if one puts time and effort in.


Most professions are learned and mastered after people finish high school, so that's clearly not true.


To add, some professions even require a certain age or a previous (unrelated) trade, like paramedics. They don't teach teenagers to do certain jobs.


18 or any age is alright. In college we had a class coded IS100, which stands for "Information Systems 100". It was so basic that, unlike the other first-year courses coded 101, this one was coded 100. In this class people would learn how to operate a mouse, shut down a computer, navigate to a website, etc.

Extremely basic stuff that was completely new to kids who grew up in poor families that couldn't afford to buy them a computer. This is in a country where college education is free or close to free; you just need to score high to get into a college, and as a result many smart students who later became engineers first learned to use a computer there.


Hocam? ("my teacher?" in Turkish)


One can learn a human language later in life; you just need to have a reason to do it. Kids are better at language acquisition because they have a strong desire to communicate with people in their environment. There is no magical LAD (language acquisition device) that makes it easier for children; they are just extremely motivated.

Adults can be motivated also, but the situations producing such motivation necessarily decrease as they become able to do what they want without learning new things. They aren't less capable of learning, just less motivated to learn. That is easy enough to fix (extreme interest is a good motivator, for example).


That's just incorrect. Children have increased neuroplasticity and will always outperform an adult in terms of language learning - not to mention you lose the ability to learn to make certain sounds after a certain age.


The critical period hypothesis is somewhat debated in linguistics. Children and adults just learn differently: children have far fewer inhibitions and more time. The second part of your comment sounds like pseudoscience. I learned German in my late 20s and speak to native speakers every day; where I live, bilingualism is the norm and a whole lot of people acquire second, third or fourth languages as adults.


I think about the first "generation" of computer programmers, many of whom (e.g. Knuth, Ritchie) wrote software we still use today. At best you could say that starting late lends itself to a different set of skills. For example, scholars of a second language may know a lot more about its grammar and linguistics than even native professional writers, while the native writer might never be able to translate their own work well after learning to speak a second language. There are probably benefits to growing up "native" in programming, but there are plenty of examples of people starting later and doing well.

I do know of too many people who have changed careers into programming to write off the likelihood of learning it after 18.


> How is not touching a programming language until college a bad thing?

I am the OP you're asking this of and I don't think first touching a programming language in college is a bad thing. And I also don't subscribe to the belief that learning has to become more difficult as you age.

What I was doing however was responding to the statement, "I think it ends with everyone becoming a developer". The reality is that almost all people have no interest in becoming a software developer. Most software developers have no idea this is true, but it is. This doesn't make most people's lives less valid or make them "dumber" or anything else pejorative - they're just different.

On the one hand you have people like me, who stared, fascinated, as my 5th grade teacher programmed a VIC-20 in Commodore BASIC. That led my parents to invest a giant sum of money in a Commodore 64 for Christmas when I was 10, and to give me a 300 baud modem when I was 13. I couldn't afford to buy computer books in the mall bookstores, so I would go to a "book warehouse" near me as often as I could, and I tell you I would literally get butterflies in my stomach as I approached the computer book section. I wrote my own BBS program at 14. In 1988 I read the Waite Group's Turbo-C Bible cover to cover repeatedly.

There are people out there who get just as excited about and have as much proficiency with bass guitars. I have a good friend that is like that. I respect his ability and interests as much as I hope people respect mine.


It is not bad for the individual (one can become a decent programmer without playing around with programming in childhood), but it is bad for society and our education system in general. Programming early can add a lot of color and value to a child’s development.


This is something I've realized as well. There are only a few of us who are excited to do this work, and that's what you need to do it long-term.


>A large percentage of the total population is outright technology hostile and proud of it.

Can you expand on this? I was one of the "CompSci because I loved it" people, so I find it hard to fathom this and could use some insight.


I'm not the parent poster, but I see this as well.

I think the majority of folks are comfortable with Netflix and smartphones and social media, but these are dead-simple and mostly passive experiences explicitly designed for the lowest common denominator in terms of technical proficiency.

Beyond that, a lot of people kind of hate technology.

And why shouldn't they? It's fucking horrible. Stuff breaks easily and gets obsoleted. Social media is a hellscape that actively makes them feel bad, a steep price to pay for seeing pics from their nieces and grandsons. Everybody hates the shitty touch controls in their cars that require you to click through a bunch of screens instead of just twisting a physical knob. Tons of consumer tech is literally spying on you. Most of it is user-hostile in some way or another; at a minimum, everything is trying to get you to sign up for some subscription service, and it will stop working as soon as the manufacturer loses interest in keeping those servers running, so you never feel like you actually own anything. You can't even enjoy Christmas morning any more; it's five hours of software updates.

Oh, and Slack and other communication tools are the thing their manager uses to harass them every five minutes. I bet a significant portion of the American population experiences a literal spike in blood pressure and cortisol levels when they hear that Slack notification SFX.

There's a lot of cool tech out there, but consumer technology is a fucking nightmare.


> I’ve interviewed a couple hundred computer science graduates from top schools for jobs and the vast majority of them didn’t touch their first programming language until college

In the US?!


The problem is, at the end of the day, business logic needs to get encoded in the program. All the abstractions can't abstract away the domain-specific "if else" statements. Infrastructure itself will be a solved problem, and we will implement interfaces like serverless lambdas, but the logic of the business still needs to go in the code. OOP itself is meant for building relationships between sets of business logic, and coding that, as has been discussed, always ends in complexity and needs thought.


Fred Brooks' classic "No Silver Bullet" outlines this fundamental principle of software engineering. Written in 1986, the paper has its own Wikipedia page: https://en.wikipedia.org/wiki/No_Silver_Bullet


It's not only that we need to code the business logic in a programming language. To a degree we must also design those business processes. That is programming too, programming your people.


In a few thousand years, programming will be a forgotten profession and only programmer-archaeologists will remain, scavenging code from the past for uses in the present. At least that's what Vernor Vinge wrote in one of his science fiction books.


> only program archeologists will remain to scavenge code from the past into uses for the present

You mean Codex-like AIs?


Re: cloud stuff, sure, I do less management of physical machines. But because I understand how stuff actually works, when I worked at an AWS shop I was the go-to person for debugging and fixing compute or network problems. And because I can write understandable code, I was able to move the infrastructure from point-and-click to repeatable Terraform deploys that actually have a chance of fulfilling the promise of cheap and scalable infrastructure (rather than rebuilding "every CPU is sacred", but in the cloud). Clear thinking, applying the power of "make the computer do it", and knowing how stuff works is always going to be super useful as everything turns into software.


No-code tooling available today is already extremely powerful.

That doesn't mean non-technical users can really leverage that abstraction well.

What I've seen is that the abstractions give the most leverage to developers and developer adjacent people... Think people who started in comp sci, but don't really enjoy writing code.


> I think it ends with everyone becoming a developer.

Unless we get genetic modification tech to improve abstract thinking skills and other traits, it's as likely as everyone being able to become a painter, a musician, a diplomat or a mathematician.


The question is, how? Computer literacy has already peaked and the body of knowledge one needs to understand to make a difference is steadily going up.


I have more hope in education improvement than tooling improvement. There's that famous study that found most people can't articulate the difference between xor and or. If you can't do that, then you can't read or write specs, even for yourself.
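To spell out the distinction being tested, a minimal Python sketch:

    # 'or' is true when at least one side is true;
    # 'xor' is true when exactly one side is true.
    for a in (False, True):
        for b in (False, True):
            print(a, b, "or:", a or b, "xor:", a != b)  # '!=' on booleans behaves as xor

The (True, True) row is the whole difference, and it's exactly the case a vague spec glosses over.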


Programmers are increasingly needed for *everything*.

I mean, we've entered an age where a farmer can't milk their cows if the software of the milking robot has a hitch, and can't plow their fields when the GPS software that drives the tractor isn't working.

Apart from that, programming is becoming every other job - as far as domain knowledge is concerned.

I've personally experienced (multiple times!) professional accounting companies complaining about not being able to just modify official invoices (totally illegal) and having to cancel and re-issue them with a new invoice number instead (how it MUST be done). -> It's now the responsibility of the programmers to understand how accounting works. Professional accountants just click a button and expect the software to "do the right thing". -> As in this case, the professional accountant no longer adds much value; they'll probably be entirely replaced by software at some point.

A more relatable example might be taxi drivers. It might take years still, but at some point all taxis will be self-driving vehicles, created by robots in a fully automated factory. Programmers will be needed for the driving software and the factory software. You might still need some mechanics to repair the factory robots, until that job is done by robots as well (which have to be programmed).

It currently seems more likely to me, that we are headed towards a future, where programming is the *only* job - and everything else is done by software.


For taxi drivers, the most important change already happened, and it took decades: it wasn't replacing them with self-driving vehicles. It was replacing very competent drivers with perfect knowledge of the town and of the optimal paths with drivers who need no particular competence.

We don't have software driven vehicles, we have software driven humans.


And with their knowledge moat they got lazy and built an industry where a lot of people dreaded getting taxis because of meters, getting ripped off, lack of accountability, etc.


> We don't have software driven vehicles, we have software driven humans.

Wow, this is a very prescient quote. I think we will hear this repeated often in the years to come.


If someone following satnav is a software driven human, what's someone following a paper map?

The quote takes a very dystopian view of the augmenting of humans with technology. Sure, there are people who will blindly follow the directions/"knowledge" of a machine, but just as many will be consciously and intelligently using the machine as a tool to further their own goals.

As above, so below. The comment mentioning people above/below the API forgets how much background stuff we developers use every day to be capable of so much more (and also less) than the developers of yore.


> The quote takes a very dystopian view of the augmenting of humans with technology…

And it really expands the possibilities of the Chinese Room argument.


Another similar formulation: Venkatesh Rao says people are now divided into those living above the API and those living below the API. Those below the API are the software-driven humans.


There are two ways to look at it, though:

One is to lament how previously specialized professions like taxi driving get filled by low-skilled laborers who just need to drive and use an app.

The other way is to appreciate how technology augments what humans can do and makes them more productive. It's not like all accountants got fired when Excel started being a thing: it's that they got way more work done!

There is no sign of a decrease in demand for labour in countries like the US or Germany: Quite the opposite, we have to get a lot more efficient in order to sustain an aging population!


>There is no sign of a decrease in demand for labour in countries like the US or Germany:

In fact there is growing evidence that increased automation actually increases demand for workers in the economy. It might reduce employment in the actual factory itself, but the remaining jobs there are more skilled and technical, the higher efficiency draws investment and allows companies to scale up, and there's a significant trickle-down to the surrounding economy.

https://www.forbes.com/sites/adigaskell/2020/06/18/why-deep-...

This makes sense when you look at the history of automation, but it's only recently that the processes by which automation actually stimulates and enhances an economy have been better understood.

There is an argument that AI is different because it will substitute humans more completely, but we are a long, long way off from anything like that. The kinds of automation we have now, in terms of the effects on the economy and employment, are much more like automation over the last century or two than strong AI that renders humans fundamentally obsolete.


Yes, exactly. Arntz et al. [2019] also put the "we'll lose our jobs due to software and AI" narrative into a much more reasonable perspective:

https://www.researchgate.net/publication/334386191_Digitaliz...

They also conclude that only 9% of all US jobs are actually at risk due to automation, using a much more reasonable approach than Frey and Osborne [2017], who conclude that 47% of jobs face a risk > 70%:

https://www.sciencedirect.com/science/article/abs/pii/S00401...


Example relevant to what I do: the abundance of affordable software, which lets people make fully and professionally mixed and mastered albums in their bedroom, didn't lead to all the studios closing. A mix engineer will always blow away anything I can do with the same tools as a generalist. What those tools did was enable people who previously couldn't afford to hire a studio to make music at a reasonable standard.


I remember this free book making the same argument (I read it many - maybe 20? - years ago, but I recall enjoying it at the time):

https://marshallbrain.com/manna1


Sounds quite similar to my accountants-example, doesn't it?


Indeed it does.


> It was replacing very competent drivers having a perfect knowledge of the town and of the optimal paths with drivers with absolutely no competence needed

So the drivers were replaced with organic hardware (other, non-knowledgable humans) and their knowledge was replaced with software (google maps, or whatever Lyft/Uber use for maps)?


Wait a few years, we might have software driven humans. And that's a much bigger deal. Replacing the driver versus replacing the map.


> I mean, we've entered an age, where a farmer can't milk their cows, if the software of the milking robot has a hitch - can't plow their fields, when the gps-software that drives the tractor isn't working.

Seems like quite a dangerous situation, if something like the Carrington Event knocks the grid offline for an extended period of time.

https://en.wikipedia.org/wiki/Carrington_Event


Seeing how we coped with Covid has made me wonder how well we would manage an event like that. Supposedly we are well prepared in the UK, but that was also the case for Covid, and we know how that turned out.


The average age of death for Covid is higher than the average life expectancy. You’re going to have a hard time convincing people that is a true crisis.


Wow...

"It's not a crisis, because (mostly) only old people die."

I'm speechless.


What about the state of the health care system? Hospital occupancy?


Keep in mind, accounting is a broad field. I don't see accountants being replaced. There are three main branches of accounting:

Audit, Tax, and Corporate.

Corporate is the cushiest and most likely to be automated. Auditing is much like consulting and requires human conversations and complex analysis. Maybe one day with sophisticated AI, but not any time soon. Tax constantly changes and needs accountants to understand it and translate it for the programmers on a continual basis, and there will always be adept accountants with better interpretations for hire at $$$. Could they be automated? Sure; the easiest way would be to have the govt automatically file, but they won't, because the industry doesn't want that.


I agree, and I don't think many people understand the implications. They complain that AI driving is not good enough (it's not), but they hardly consider the consequences if it were (millions would need to pivot to something else).

I have a much better example: Invisalign -> https://www.invisalign.com

This is straight into doctors' (orthodontists') territory. To explain the concept: a scanner scans your jaw and then Invisalign produces a series of aligners that you wear in order. The orthodontist's job? Just to follow up with you in case something goes wrong. As they get better at this, the orthodontist's job becomes less important and they can take on more patients. The guy I consulted with told me that he can do some of the follow-up remotely thanks to the iPhone camera.

It's these small things that are going to drive massive demand for software developers in the future. Software was largely limited to accounting/communication in the last decade. Now we are entering a decade where software will apply to lots of other fields: transportation, medical, tourism, food, banking, etc. The demand is about to go up by a LOT. The demand for skilled software developers is going to heat up beyond our wildest imagination.

ps: just my opinion.


While the need for orthodontists is going to go DOWN.

I see the same thing with the cloud. I’ve racked a server or two in my career, run a cable or four to a managed switch, and deployed PCs to desktops…

ALL of that is going away. There's going to be a small subset of people managing datacenters, rather than populations of people in metro areas managing datacenters for companies of any size. Laptops are deployed pre-configured and just hook up to the available Wi-Fi… that labor is going away.

Good? Bad? I dunno, it just is… the buggy-whip manufacturers went and did something else.


>we've entered an age, where a farmer can't milk their cows, if the software of the milking robot has a hitch...

They're still perfectly capable of milking their cows the old way, it just doesn't make any sense to do so. I think this is a non-problem.


The individual farmer could do it, and could fill a milk pail for his family's needs for the day, but he couldn't fill a tanker truck for Kroger. He could walk the fields and sow seed by the fistful from a bag over his shoulder, but he couldn't drill 30 acres per hour like he can with a modern tractor and planter.

Technology in agriculture is a force multiplier that has allowed the fraction of the population employed as farmers to crash from 90% in colonial times, 83% in the year 1800, 55% in 1850, 31% in 1900, 5% in 1950, and down to less than 1% today.

That 1% could not feed the world without automated milking parlors and high-speed tractors. If a Y22k bug hit John Deere's DRM this spring, some 50% of fields would lie fallow and we'd have 50% less food in the fall... it would be devastating.


>If Y22k hit John Deere's DRM this spring, some 50% of fields would lay fallow, and we'd have 50% less food in the fall....it would be devastating.

I actually doubt it; those aren't the only tractors in the world, or even in the US. It would be annoying, and expensive, but hardly the end of the world. Also consider that the fact these things are automated makes us less susceptible to other forms of supply-chain disruption that might affect a highly manual process, like a pandemic, for example.


John Deere has 53% of the US tractor market and 60% of the combine market. They're not the only tractors in the US, true, just most of them.

The machines are run late into the night, 7 days a week, during the narrow window of weather appropriate for planting and harvesting seasons. Prices would skyrocket, which would reduce demand, and yes, consumption would change, and yes, food would be imported...but at the margins, millions would starve.


Crops have to be harvested in a fairly narrow window. You can't just let the crop sit there for a month while you wait for other people to finish using their tractors.


Man, that's terrifying from a bird's-eye view, especially given how terrible lots of us are at writing good code.


Right now developer salaries have greatly risen at the top end - the max has risen much faster than the median. Much of the top end growth has been at companies benefitting from easy investor capital.

I think this will follow a typical business cycle. When the next recession comes, investors and companies will focus on shorter term ROI, and many high end developers and long term projects will seem too expensive. Reduced competition will dramatically affect the top end, and non-salary benefits may drop quite a bit. The median salary will probably remain sticky but lose a bit when adjusted for inflation. Eventually the next expansion starts but it may take a very long time for the top end competition to heat up to this degree again.

I don't expect low-code, AI, or developer productivity jumps to have any real impact anytime soon; there's no real evidence of a true jump in the productivity of the overall project.


> I don't expect low-code, AI, or developer productivity jumps to have any real impact anytime soon; there's no real evidence of a true jump in the productivity of the overall project

I respectfully disagree. I don’t think those productivity gains will be evenly distributed throughout the industry, but I think AI and developer tooling will evolve radically over the next decade and programming in 2032 will look quite different for some. Things like Github Copilot are just the beginning. Once programming becomes a matter of operating the AI, there will probably be very different hiring strategies and incentive structures for developers.


It's certainly possible, I just don't see any real evidence for it yet. AI has a history of promising demos and then very long (decades) stalls in progress. I could be wrong, but I'm skeptical of the current round of AI as well. It will generate code in the same way as AI generates online articles or music - not very well.


I think that by 2030 we should have conversations that produce most business software. Think CLI + AI. There are already people working on this. My bet is on Microsoft, or on Microsoft buying the winning startup.


My experience is that the problem with producing business software is not the lack of programming knowledge. Rather it's a lack of people who understand the business and are capable of abstract thought and formalizing processes. I don't see AI helping with that any time soon.


But who will verify the results of this conversation? Who will be the one to make changes, revise changes, and verify that the changes are correct? Who will take the blame if an error happens?

The customer or the user of this software?


Agreed, I see junior devs in their first job every semester. Some people with CS backgrounds, others with engineering backgrounds.

Software development sucks so much; folks waste ungodly amounts of time on trivial stuff like setting up their programming language correctly* or figuring out how to use undocumented libs.

I'm confident AI and investment in dev tools will make programming much more accessible.

However, senior devs are another story. They are generally paid to make architectural decisions and solve weird problems. I don't see their salaries being affected too much by said technologies.

* Using dev containers made that A LOT better


Author overlooks three important things.

First, with the number of software engineers doubling every ~5 years, the shortage is mostly of software developers with experience. Someone has to teach. Someone has to steer and make decisions based on knowledge; someone has to design architectures, know which coupling matters, which patterns apply, which types of solutions will haunt you later on, etc. etc. etc. No junior can do these things, because juniors lack the experience of failing several times and of maintaining or replacing legacy over years.

Secondly, software development is about so much more than writing code. In fact, I spend the least of my time typing in code. If your job really is typing code that others told you exactly how to write, then yes, fear for your job. Not because of Copilot or better IDEs, but because what you do is easily automated away. If people doubt me, I urge them to look at their git log over the last few weeks. I daresay that most will hardly write/edit more than 100 lines of code a day. Over an eight-hour day, that's about one line every five minutes.

Nearly all software development is about understanding the domain and applying software to solve problems in that domain. About researching what problems are most urgent and what solutions have which tradeoffs. Which patterns apply, how to decouple, where to draw lines, how to keep the software agile and maintainable over decades, and so on. Hell, even "where to put my code" is a hard problem that copilot and IDEs fail miserably at.

Thirdly, the no-code movement is a forever-September thing. It has a niche, but that niche is not as big as the no-code salesmen would like you to believe. I've seen "lo-code" come and go for decades now. Every time, it promised to take over all of software development, only to carve out a niche and be very good only there. More practically: most such no/lo-code systems lack a lot of essential paradigms (source control, accountability, copyability, testability). And they are either far too generic, and therefore just as complex and hard to manage as "real software", or they have a specific niche and shine there (but only there!).

So no. Software engineers, especially those that have been in the trade for years (even more so, decades), are in high demand and will remain so. Tools won't replace them. If anything, tools (which are built and maintained by those developers) only increase the demand.


If a low-code tool could overcome the inner-platform effect, then it would win. But if you want all the paradigms like version control and whatnot, then code is the best representation for interop with the rest of your non-functional requirements. Maybe we need to embrace coding a little more. Not the people in this forum, but the people trying to show us some sort of better "no code" way.


> The markets are doing their thing. Everyone sees tech workers making money and just generally having a good time and they think: “Hey that sounds like a good career choice”.

It seems like you could replace tech workers with any other profession that sounds like a good career choice.

Can literally everyone (or even the majority) learn to code? Do they want to learn to code? In my experience, the answer to both questions is no.


I think almost anyone could learn to code, but they get to the doing of it and find out that they don't want to. Most people don't want to plan, do logic puzzles and problem-solve all day. When they figure out that that's the majority of it, in typed form, with some meetings thrown in so you can do it in groups... the movie romance dies away pretty quickly.

WTF! There's no cool 3D graphics and hacking while we listen to techno music?!? This is a bunch of dudes typing all day after going to a couple of stand-ups and messaging on Slack and clearing out bugs on JIRA. This sucks. I'm out =P

EDIT: No offense to anyone... I meant the universal dude, of all orientations, genders and creeds.


I mean, I deal with cool 3D graphics and listen to dope tunes while hacking away every day. I'm a game developer; it's why we do it :0)


> I think almost anyone could learn to code, but they get to the doing of it and find out that they don't want to.

I agree with this 100%. I think most people could learn to do our job, they just don't want to.

People frequently tell me they'd like to change careers and they've been eying the software field. I explain that it's one of the best lateral jumps to make as we have all of the resources necessary to learn available freely online. I then suggest they spend one evening a week programming a video game or building a website -- some kind of self directed project. More than half of the time the person I'm telling completely zones out and loses interest. "Hm, maybe..."


It’s not hard to make a living writing code. That’s why people with a high school education can do it.

Obviously it’s not for everyone, but it’s also not reserved for a select few who have had a life-long passion for messing with computers.

Look at the medical profession where there are tiers of expertise / earnings. Not everyone can be a neurosurgeon, obviously. But many people can make good livings. And the profession attracts many people for that reason alone.

> It seems like you could replace tech workers with any other profession that sounds like a good career choice.

Yes, that is exactly what is happening.


There are other differences that make it easier to move into tech too.

In my country at least, the number of students that are permitted to study medicine each year is strictly limited.

Similarly, in law to get a decent job one has to complete a pupillage which can be extremely hard to get and is often based on family connections etc.

Finance can also be quite classist/nepotistic.

So compared to these other top professions, software engineering is more meritocratic and accessible.


> It’s not hard to make a living writing code. That’s why people with a high school education can do it.

It's not hard to finish college; that's why people with a high school education can do it.

Having a high school education doesn't tell you how smart a person is. Maybe they would easily finish college if they tried?


I agree that it is not hard to finish college.


A supply factor that this article doesn't consider: a big chunk of the new wave of graduates are software engineers in other parts of the world, and building teams in those countries will be a big trend. Rippling did this well, building a team in India. That should reduce comp in the Bay Area.


People have been predicting that outsourcing would lower US salaries and take US jobs for over 20 years and it hasn't happened. Nothing is different now.


This is what's very different now:

https://www.statista.com/statistics/792074/india-internet-pe...

2009: 5%

2020: 50%


That stat alone doesn't tell us much about the nature of remote work. Yes, more people in India have internet access now, but that is not a recent development and it predates the pandemic.

2019: 48.48%. The year before that, 38.02%. In fact it's been around 34-35% since 2016, which was 6 years ago. No change in the amount of outsourcing so far!

Having internet access is not going to magically turn everyone into software developers, and even if it did, all the other challenges of outsourcing wouldn't disappear.


> Nothing is different now.

1. Venture capital is making major investments in India, Europe, Latin America, and Asia. At a scale never before seen.

2. Offices will never happen again. Now employees could be in Antarctica for all you're concerned -- as long as they're on a relatively friendly time zone with the rest of their team, it'll work.

This time is different.


Offices are still "happening" and will not go away. The current optimism about remote work is overly exuberant, and many people will be disappointed about the pullback of remote work over the next few years. Also, people are dreaming if they think Silicon Valley companies will continue to pay Silicon Valley salaries to workers in other locations, even if remote work continues to be broadly accepted.

Furthermore, India and much of Europe (especially eastern Europe) are definitely not friendly time zones for collaboration with colleagues in North America.

It's really not different this time.


> The current optimism about remote work is overly exuberant, and many people will be disappointed about the pullback of remote work over the next few years.

Mostly anecdotal evidence follows.

At my company,

- Most teams have two or more remote ICs

- A large percentage of teams have the majority of their ICs in a different location than their manager.

- A large percentage of ICs are now no longer near an office (or in the same state as an office)

- Offices are still ghost towns

They're not going to wave a magic wand and make all of us go back. The cat's out of the bag. They couldn't afford to fire these folks, either.

> Also, people are dreaming if they think Silicon Valley companies will continue to pay Silicon Valley salaries to workers in other locations, even if remote work continues to be broadly accepted.

I've received near-SF wages in Atlanta for eight years. Before the market pullback, my total comp was $500k a year. It's still ridiculously good.

> Furthermore, India and much of Europe (especially eastern Europe) are definitely not friendly time zones for collaboration with colleagues in North America.

We're building out offices there.

I've had meetings with Europe, Australia, and Japan this year. A first.

If your company won't, new capital funding new startups will take advantage of those workers and bring them into the fold.


And it remains to be seen how well cross-organizational coordination in tech will fare across such disparate time zones.

And surely some of that funding would be going to local tech companies in those areas, and not simply branches of North American multinationals. All of these regions have their own businesses too, you know. Not everyone is raring to work remotely for Silicon Valley.


From the USA, there aren’t very many friendly time zones. Both Europe and India are absolutely terrible time zone wise. I absolutely hate having very early and very late meetings (i.e., anything before 9 and after 5 Pacific time). Central and South America on the other hand…


It's somewhat OK if you're in the Eastern timezone. I work from Serbia with a NYC team and it's bearable.


If you work in a particularly hot climate, maybe sleeping during the day and working in the evening and at night would not be that bad.


Legal hasn't suddenly drastically changed. Customers (the big customers) care about where their data is; which jurisdiction(s) it is stored in or passes through.

Employers also care about what laws they have to deal with regarding employment and what they have to do in terms of benefits and the like. Employees in multiple states are difficult enough; having to deal with employees in several countries is a significant burden that many employers would prefer not to deal with, or can't afford to. That hasn't changed.

Over the years, I've worked with developers from Germany, Denmark, Russia, Canada, India, Brazil, and even that most bizarre of alien cultures, DC-area government contractors. Even if we can all speak the same language (albeit with different accents), different cultures have markedly different assumptions and understandings and ways of communicating and working and different values. That can be quite difficult for management to deal with. That hasn't changed.

When you add up all the language, cultural, legal, timezone, foreign exchange, and other barriers, what really has suddenly changed? Not much. Most companies will still prefer employees that fit in as much as possible and don't cause lots of extra work for the company or cause them to lose sales with potential big customers (because of the cross-jurisdiction legal stuff).


Do (hired) developers code? There is no commitment or dedication in software today, and it shows, from the sloppiness and intrusiveness to the wholly undifferentiated design. There is just too much easy money, which has attracted too many random people. "Developers" overestimate their value too much, and they'd better enjoy it while it lasts.


In my opinion there is a huge market for teaching programming literacy: learning to code as a means to an end that isn't programming itself. Plenty of smart people would benefit from grounded computational thinking and communication without necessarily needing the nitty-gritty detailed skills. Think designers, managers, sales, etc.


> don’t necessarily need the nitty gritty detailed skills

I think that's the challenge.

Understanding computational thinking isn't very useful if you waste your afternoons fighting against crap like encoding, Windows paths, line endings, networking, setting up PostgreSQL correctly...


The system works!

In reality, we will likely see some aspects of software development become less well compensated, while other aspects will continue to be rare and highly valued.


That’s where I thought it was headed…in 2001.


The median salary for software engineers is not that high. Still ~double the median household salary, but it's not the only profession that yields that.

So you were not completely wrong.


A 5% speed improvement in my tooling would result in about a 0.0000001% improvement in my productivity. First of all, so little of what we do is about the tooling. Yes, it's very helpful, but no tool, no matter how good, can tell you how to migrate a monolithic service into microservices with no downtime and no data loss. Maybe if we hadn't fucked up the original implementation such things wouldn't be needed, but there are still trillions of lines of code out there and they aren't going to rewrite themselves. Google and Facebook aren't going to be overthrown by some hot new IDE with autocode built in.

Secondly, 5% is nothing. Our tooling is so effing slow it needs a 1000x improvement; not in how fast we can write code, just to compile and test the damn stuff.

Speaking of which, Amazon's reasoning tech sounds interesting.

At best this tech will cut out some L3, maybe L4, jobs, but L5+ isn't writing code any more. Which in itself will be interesting, because new grads are hired at L3: how will they ever get hired if those slots are tightened up?


If a 5% speed improvement represents 0.0000001% of your time, then isn't 1000x speed improvement still only 0.002%?

I don't understand how you can simultaneously argue that almost 0 of your time is spent waiting on tooling, and also that your tooling is so slow that it needs to be 1000x faster.


Alright, let's do some legit math here. Compile times are often about 15 minutes, so a 5% improvement would shave off 45 seconds. That's not too bad. But because that's still a very long wait, I'm not gonna twiddle my thumbs for 14 minutes and 15 seconds, I'm going to go do something else. Probably some social site to kill a few minutes, maybe check some emails, who knows what. Then I'll come back 20-30 minutes later to check if my tests passed or failed. So that 45 seconds isn't really saving me anything because I've already moved on to other things. We have to get that down to a few seconds to avoid distractions. 15 minutes / 1000 brings it down to less than a second.
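The arithmetic, spelled out (a trivial Python check of the numbers above):

    compile_s = 15 * 60          # a 15-minute compile, in seconds
    print(compile_s * 0.05)      # 45.0 - seconds saved by a 5% speedup
    print(compile_s / 1000)      # 0.9  - seconds left after a 1000x speedup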

And then my other point was that depending on my day, I might not be writing any code at all. For the past few weeks I've just been reading and writing documents. So no amount of speed improvements will help me.


I think of interns as L3 and new grads as L4.

I love having L4s around. They don't have distractions so they can get stuff done


Maybe it's different at my company. I don't know what interns are classed as... not sure if they're L2 or L3-intern. But people that are fresh out of uni are always L3. A few years of industry experience is L4.


> our tooling is so effing slow it needs a 1000x improvement. not in how fast we can write code, just to compile and test the damn stuff

Not just that. Why do we have 30 tools for 30 programming languages? JetBrains has one IDE per language (approximately). What happens when I have a project with two languages? It's a mess.

Don't get me started on build systems. Something like Bazel is greatly needed; it would help with performance and with the metadata we can extract, but we're decades away from demonstrating this to the average programmer.
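For anyone who hasn't seen it, a minimal sketch of what that looks like (Bazel BUILD files use a Python-like syntax called Starlark; the target names here are hypothetical):

    # BUILD - one build graph spanning two languages
    cc_binary(
        name = "fast_path",        # a C++ tool
        srcs = ["fast_path.cc"],
    )

    py_binary(
        name = "pipeline",         # a Python tool in the same graph
        srcs = ["pipeline.py"],
        data = [":fast_path"],     # runtime dependency on the C++ binary above
    )

One `bazel build //...` walks the whole graph, which is also where the performance wins and the extractable metadata come from.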


JetBrains has one IDE (IDEA Ultimate) and redistributes it with various built-in plugins for cheaper licensing and marketing. If you get the more expensive Ultimate, it has support for all languages via the same plugins.


As far as I know there is no way to install CLion's C/C++ support as a plugin in Ultimate.


True, I was wrong. Ultimate supports everything except C/C++ (CLion) and .NET (Rider), so there are three. I wonder why that is? Legacy tech debt, maybe?


Yeah... I wasn't so sure about Bazel at first, but it really is great having one build system for every language that mostly just works once you get used to it.


All it takes is two things:

1. The investor bubble bursts - and I think this will happen, due to the soon-to-be-rising interest rates making capital much more difficult to secure.

2. One of the big employers dumps devs into the market - say one of the FAANG companies has a severe downturn and lays off thousands of devs. Think it can't happen? Facebook and Google are just a few regulations away from mass layoffs and downsizing.


People forget how IBM shed programmers like crazy in the 80s and 90s. Also, the 80s in general were very harsh on programmers. Our industry has definitely seen downturns before, the dot-com bust 20 years ago being the most recent.

But honestly, I'm more worried about aging out of this career than encountering another downturn. While things are better than they used to be, the pressure to continuously increase one's value every year will necessarily wash some programmers out simply over time.


These off-the-cuff remarks are the typical 'armchair commentary' we get on HN, and somehow they get a lot of votes, but they are really low-grade 'feel good' efforts.

While FB/Meta is in a vulnerable position, they still make $30B+ a quarter in revenue, with a net income of $10B.

Google is in such a strong position that it seems nobody will (or can) disrupt it, even though their search results haven't been the greatest. Even if they got 'broken up' by the regulators, the Search, YouTube, and Google Apps entities by themselves would probably be bigger and employ more people combined (the Baby Bells breakup is a good precedent, where the sum of the pieces was larger than the original company).

The reality is tech is becoming more and more important for everything in society. E.g. cars: tech was about 10% of the cost of a typical car in the 90s; now it's over 35% and increasing. Cars can't be built because of a lack of chips, not a lack of manpower or aluminum/steel.

The pandemic accelerated all tech by at least 5 years and increased demand overnight. Even when things start normalizing, some of it will stick around, as for many people day-to-day life has changed, and some 'work from home' or hybrid work is here to stay.

Will we see soft hiring seasons? Absolutely. In the next recession hiring will probably be softer and people will have a harder time finding a job in the short term (think 2008-2009), but expect technology to take a higher percentage of the economy and tech hiring to keep rising over the long term.


I think the answer is closer to this, and it's also a lot scarier. There are vast swaths of the economy that have got to be over-leveraged and just weird from all these years of low-to-negative interest rates. We really don't know where things stand until we have to turn up those rates, and the Fed is walking a precarious tightrope. If interest rates go back to where they were in the 80s, it could be a bloodbath in the streets.

That is why rates may end up rising more slowly (in the grand scheme of things) than we might expect. There is a plausible scenario where the entire economy enters the VC heat-death that older tech folks had been expecting since the 80s, one that has been pushed out until now by the combination of easy returns and easy money.

Then of course there's how it will play out in tech. Historically, returns in tech have indeed been driven by VC capital, which is one of the more speculative areas of the economy. But speaking as someone working at a company that isn't a FANG, it seems that there's a chunk of tech that has become more like traditional finance in terms of its microeconomic structure. You have highly leveraged workers whose labor is being amplified by a combination of machines and capital, but the capital isn't so much speculative as it is boring and predictable.


#2 already happened in 2020 with Uber & Lyft significantly cutting their engineering workforce (but not fully like a bankruptcy might entail).

I remember feeling concern about the local SF job market at the time; it was totally unfounded.

The demand for software engineers that can ship product is very deep.

FAANG engineers might not like the non-FAANG offers they might get at smaller firms, but they are solid jobs & you can easily retire on the timeframe set by the previous generation's yardstick (62-70 y/o).


Isn't 70 y/o pretty old to retire by previous standards?

The concern I have is if anywhere is actually going to hire a 65 y/o programmer that can't afford to retire.

Obviously, age discrimination is illegal but that doesn't mean it doesn't happen.


I referenced 70 because I checked, and that's the age at which you can withdraw maximum Social Security benefits.

Looks like actual average in the US is ~64 y/o:

https://crr.bc.edu/wp-content/uploads/2018/05/IB_18-10.pdf

I would worry more about ageism, but I earnestly see very few "senior" candidates (i.e. 35+ y/o) at all.

It may speak to where I've worked, but I think it also speaks to how young the field still is.

Once engineers hit their 30s I think the job-hopping rapidly slows down.


Regulations? Meta itself seems to be losing marketshare all on its own just fine, with maybe some regulatory work by the iOS App Store.


> rising interest rates making capital much more difficult to secure.

Can someone please explain this to me or point me to some reading material? How are interest rates and investment in tech companies related? How do we know that interest rates are going to rise soon?


Regarding interest rates, they are already up. I got a house 4 months ago at 2.85%, and now rates are at 4.2%. I can't speak to whether investment in tech will drop. Lately there is so much spare money floating around that investors can't find homes for it. Inflation suggests to me _more_ desperation to have the money invested, not less. And the net cause seems to be that a steady stream of wealth consolidation allows a smaller subset of the economy to hold more and more wealth.


All investment opportunities, to a first approximation, are competing with one another for capital. When interest rates drop, newly issued bonds become worse investments - but you've got to invest in something, so money flows into everything else. When interest rates go up, the opposite happens.
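
To see the mechanics, here's a quick sketch with a hypothetical bond (invented numbers, plain Python):

    # Hypothetical bond: $50/year coupon for 10 years plus $1000 face value,
    # discounted at the prevailing market rate r.
    def bond_price(coupon, face, years, r):
        pv_coupons = sum(coupon / (1 + r) ** t for t in range(1, years + 1))
        return pv_coupons + face / (1 + r) ** years

    print(round(bond_price(50, 1000, 10, 0.05), 2))  # ~1000.00 at a 5% market rate
    print(round(bond_price(50, 1000, 10, 0.02), 2))  # ~1269.48 if rates fall to 2%
    print(round(bond_price(50, 1000, 10, 0.08), 2))  # ~798.70 if rates rise to 8%

At lower rates the same fixed payouts are worth more and new bonds yield less, so capital chases equities and venture bets instead; when rates rise, bonds pull that money back.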


I can’t answer about the relation to tech investment, but the Federal Reserve meets regularly and has been explicitly saying for the last few meetings that it intends to raise interest rates several times this year.


I’m coming to the uncomfortable conclusion that we’re ALL doing a dance. Programmers make a product, managers manage the people, HR protects the company, Legal protects HR, and the shareholders leverage the executive staff to make their profit margins.

But in all of this, very little ACTUAL STUFF happens.

(I say this as a security guy, whose line on the balance sheet is ALWAYS in the cost column.)

So much of what we do is bullshit-grinding that it doesn’t really pay to look at the whole thing too closely. So refactor for the 99th time, review the 12th latest-greatest web server or database connector or infrastructure-as-a-service solution, and try not to get too hung up on it…


On any given day, fewer than half of people with jobs really need to work. A lot of people attend their job and don't really do work -- not because they're lazy -- but because deep down they know it doesn't really matter.

We (collectively) invent work to slow ourselves down, or create jobs that don't really need to exist. We know that people who want jobs need to have things to do, but maybe that there just isn't that much that needs doing. George Jetson pushed a single red button for a living. There's a certain truth behind why the show's creators gave him that.

This had been the case long enough before the movie Office Space came out to be well understood by then.



Your final sentence nails the fact that our work (and by proxy, business investment) is short-lived. So many apps that I've contributed to in the business world end up getting tossed after a few years. It's often because of some change in the business - like a new manager wanting to deliver some fancy project so they can put it on their CV.

So long as this is going on, there will be a need for developers.


It's why it's called busyness ;)


costs don't go on the balance sheet :p


I have depression issues and have been coding for 10 years now. I don't know if I'm bad at programming, or if it's because I lack a degree, or because I have been chronically unemployed and lack experience, or because I'm too selective with the jobs I apply to (I can't bear PHP or JS stuff like Angular), or because I live in France, or in the wrong city. I may have an atypical profile.

Programming has not been very good to me.

When I hear "programmers are in high demand", I don't see it applying to me.


>When I hear "programmers are in high demand", I don't see it applying to me.

This is a very nuanced statement and depends highly on region.

When you hear "programmers are in high demand" it actually means "senior/experienced programmers in the company's stack who can quickly solve the issues piling up on their desk, are in high demand".

Juniors or devs with near-zero experience in that particular stack are getting rejection letters, as most companies never consider how transferable dev skills are and prefer to wait until the perfect dev shows up, instead of giving someone new to the stack a few weeks to ramp up.

At least that's how it is in my current country of Austria, but back in Eastern Europe the demand for devs is so high, due to US off-shoring and Western EU near-shoring, that companies will hire absolutely anyone who can write a for loop and even give them training to ramp up quicker. It's literally impossible to be unemployed there even as an inexperienced, slow-learning coder. Sure, you won't make six-figure salaries, but you'll live far more comfortably than 90% of the population.

IMO, just like in dating and real-estate, the dev jobs market is very much about "location, location, location".


Which countries do you have in mind when saying Eastern Europe?


Romania, Poland, Czech Republic, Hungary, Slovakia, Bulgaria, basically most former eastern bloc countries that are also EU members.


Programmers are in high demand, even in France.

That said, there are some major caveats:

- Salaries posted on HN are way over the top compared to anything you can get in France

- Most of the recruiting is done through SSIIs/ESNs (Capgemini, Sopra, Atos, etc.), which often means shit pay, shit management, shit missions (not really a boon if you are already depressive)

For stability and peace of mind, the most straightforward way is to either enter software dev at a big corp (banks, etc., which have in-house teams), or at some "éditeur logiciel" (software vendor) whose product is the software itself. Video game development is a big no unless you knowingly want to be exploited for a while until you burn out.

If you have experience, you can increasingly ask for full or close to full remote work if you insist a bit, but most of the time you will only be offered 2 to 3 days of remote work a week, which implies relocating.

Although I don’t like saying this, if you do not have experience you could prep a bit by working publicly on FOSS things relevant to your interests, so you have technical things you care about to talk about during an interview, even if they are totally unrelated to the domain you interview in.


Plumbers are in high demand too, but if you are a plumber with depression and a lack of experience, who doesn't want to touch waste pipes or shower units, and you live in the sticks, it won't apply to you either.


I know, but it's not like employers are begging people to work for them either.

The job market has always been a market that favors employers, not candidates, whatever the domain of work.


No - At some levels and in some locations the employers literally are begging people to come work for them.

If you're outside the US, that may be a big factor - but my guess is it's a "you problem" based on what you've said above.


>At some levels and in some locations the employers literally are begging people to come work for them.

Begging is useless without payment to match.

>If you're outside the US, that may be a big factor

The job market outside of the US is nearly an order of magnitude worse, especially in some EU countries. The attitude, leverage and power balance devs in the US enjoy over their employers is far greater than anywhere else.


I mean... we're interviewing and it's about as close to begging as a company can get.

- We allow any code submission in lieu of having to do something custom for us (adds quite a bit of time and cost to review)

- We pay you for the time interviewing

- We have all interviewers send thank you emails afterwards

- We offer to give you specific feedback about why we decline to make an offer if we decline

- We accept re-applications in 12 months - 6 if you can show meaningful self improvement (ex: bootcamps, school, specific projects, addressing feedback from above)

- We have above normal pay/benefits for the area we're in

Basically - we spend between 3 and 5 grand on each interviewee just for time spent, not accounting for the initial lead generation time and not accounting for the bonus we pay for recommendations from current staff when hired (12-20k in that case, depending on level).
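
To spell out where a figure like that comes from (rough assumed numbers, not our exact ones):

    # Back-of-the-envelope interview cost - assumed numbers, illustrative only.
    interviewers  = 5      # people on the loop
    hours_each    = 4      # prep + interview + debrief, per interviewer
    loaded_rate   = 150    # $/hour of salary plus overhead
    candidate_pay = 500    # paying the candidate for their time
    print(interviewers * hours_each * loaded_rate + candidate_pay)  # 3500 -> in the $3-5k range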


I wasn't referring to your company, but to the attitude of the companies in my (EU) area. What your company is doing far exceeds anything I've seen here, and it saddens me to a degree that I don't live in the US.


Consider applying for remote jobs at US based companies. It's a lot easier now than it was 3 years ago - lots of places in my area that would have previously hired only for local offices are now accepting full time remote from a wide swath of areas.

The biggest challenge is usually the tax implications for the company of making a foreign hire - it's hard to be the first remote hire from a new region/country (mainly because it's really expensive up front to get a handle on the legal side) but relatively easy to be the second.

I'd avoid the traditional US hotspots (ex: silicon valley/NYC) and go for places in the south/southeast (Atlanta/Austin/Raleigh/Tampa/Nashville).


It's easier to blame individuals instead of blaming the system.

We will just disagree from here, that's okay. I left a comment to give my point of view, not to get a lesson on life or some patronizing, coaching motivational advice from an internet stranger.


Ditto!


Depression will distort your vision of reality and make things look worse than they are. It will also make it much harder to take even small risks for greater gains.

In all likelihood, you're stuck in your position because you're limiting yourself in some ways. I'm not saying it's easy to get out of, I've been there, it's not.


> I don't see it applying to me.

It only applies to a small set of people from top unis or those who have the FAANG brand names on their resumes.

Otherwise, I know several people who tried for months to switch jobs and finally gave up due to exhaustion.


I have NO university degree and ZERO affiliation with anything remotely FAANG (I'm based in Europe, where relatively few people work for FAANG). There are no globally recognisable names on my CV.

I am still the highest paid person I personally know - by a rather high margin, because of my experience as a software engineer.

Since I dropped out of university almost 20 years ago, I have been consulting and contracting, focusing almost exclusively on Python, PostgreSQL and, more recently, AWS. This simple combination (I usually refuse to learn anything new - I don't think there have been many noteworthy new technologies in the past couple of decades) has made me millions over the years and I have never been left without work for any significant period of time.

Since the pandemic began, I have actually found myself contracting multiple roles concurrently, from my home, each paying significantly above the average developer rate. I've found I can do up to 3 - 4 at a time and still deliver results for each (declining to attend regular meetings helps).


Good for you. I have experience with all 3 of those technologies and it's nearly impossible to even get an interview.


This is honestly baffling to me, I can't imagine how this can be. I get multiple emails each week from recruiters looking for that exact skill-set, and this is AFTER I've tried extra hard to unsubscribe from all such emails.


My final straw was when I was asked to code this problem in 20 mins:

https://leetcode.com/problems/reverse-nodes-in-k-group/

Maybe I'm just too stupid.
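
For reference, the expected answer is roughly the classic pointer dance below - a sketch assuming LeetCode's usual singly linked ListNode, written here for illustration, not run against their judge:

    class ListNode:
        def __init__(self, val=0, next=None):
            self.val, self.next = val, next

    def reverse_k_group(head, k):
        # Count whether k nodes remain; if not, leave the tail as-is.
        node, count = head, 0
        while node and count < k:
            node, count = node.next, count + 1
        if count < k:
            return head
        # Recursively handle the rest, then reverse this group of k,
        # linking it onto the already-reversed remainder.
        prev, curr = reverse_k_group(node, k), head
        for _ in range(k):
            nxt = curr.next
            curr.next = prev
            prev, curr = curr, nxt
        return prev

Doable in 20 minutes, but realistically only if you've drilled this exact pattern before.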


There's something wrong with your resume or other personal information then. I was in the job market recently, I live in the boonies, and there is nothing impressive on my resume. There were literally not enough waking hours to schedule a call with every recruiter flooding me through LinkedIn. I had at least 5 interviews per week. I don't know what you're doing wrong, but it's something.


lol, I had the same awful experience as the above poster when I tried to look for something to bump my salary recently.


To be fair, when it comes to France, any job being in high demand generally wouldn't apply there. That country is the capital of unemployment, from my experience, and I don't see it getting any better.

But yes, lacking a degree is a big part of your problem. Fortunately, it's purely specific to France and you'll have better luck elsewhere in Europe. In my case I made my escape to the UK back when it was still possible.


That does not match my experience at all; devs are in very high demand in France, and once you have some dev experience on your resume you start getting a lot of LinkedIn spam.

As for the unemployment rate in general, France is not amazing, but it's far from being 'the capital of unemployment' (0.4% above the EU mean rate): https://tradingeconomics.com/country-list/unemployment-rate?...

Did you have anything specific in your experience that could explain why you did not succeed there?


> Did you have anything specific in your experience that could explain why you did not succeed there?

I didn't have much besides self-employed/freelance experience back then (which was enough to get me a job in the UK) however even low-skill, non-tech jobs were impossible to get.

Maybe it's improved since then and if so I'm glad. I might out of curiosity apply to a couple jobs with my current profile (now having ~7 years of work experience) and see what kind of offer they come back with.


Thanks, this comment really helps. Seems like it's a lot of wrong place, wrong time.

It's true that not having degrees is like being a leper for some reason.


Salaries in Europe are definitely subpar compared to other locations. OP should try moving to Australia. Lots of jobs, and the salaries are very good even for junior/mid.


I'm guessing if you don't live in a big city (Paris, Lille, ...) and stay on low-level programming (like C), there is much less demand.

However, I guess most of it comes down to your network, your confidence, and how you present yourself (having the same "cultural" codes as the experts).

My 2 cents is to treat yourself well first (a healthy mind in a healthy body), travel more, and work on public stuff / connect with others.


> When I hear "programmers are in high demand", I don't see it applying to me.

Because it probably doesn’t apply to you. It’s not that programmers are in high demand, but that hiring programmers is in high demand, which is massively different. It’s the difference between people who solve problems and people who reproduce the same problems to solve.


In what city do you live and what languages are you seeking?


South of France, neither Toulouse nor Marseille. I'd prefer lower-level languages, and as was said in another comment, they're in lower demand.


Lawyers and doctors have always enjoyed “good times” and there’s no reason to think they’ll stop. Why not programmers, too?


Lawyers and doctors in the US effectively have union-style control on the supply of lawyers and doctors, via the Bar Association and AMA.

Developers have no such labor association. We also have headwinds, like H-1Bs, that push down our wages.


I think that's true, but the interview process for a SWE is (I think) unusually rigorous. I'm often surprised when I talk to friends in other fields who get hired after 1-2 hours of get-to-know-you interviews. Meanwhile, almost every SWE job has 4-6 hours of technical evaluation proctored by the incumbents. It's true that if someone has done 12+ years of college, med school, and residency, you can put a little more trust in their resume, but I've also turned down SWE candidates with impressive-sounding pedigrees after seeing them bomb simple exercises.

Maybe we've figured out how to haze our way to exclusivity without centralized gatekeepers?


It's just a more inefficient way of gatekeeping: instead of running the Leetcode gauntlet once and getting a license or credential that actually means something, you have to run it at every single company you apply to.


Perhaps eventually developers will organize one.


I don't think big companies will ever allow it. They're floating on bliss from having been able to push through federal laws that exclude programmers from overtime pay. Places like Google would pay trillions to keep that going and push down wages more and more.


H-1B is a good point. Where are the H-1B lawyers and doctors?


There are lots of H-1B doctors in the small towns and rural areas that are typically underserved by larger hospitals and networks.

https://www.americanimmigrationcouncil.org/research/foreign-...

> Just over one-quarter of doctors in the United States were “foreign-trained” as of 2017, meaning that they received their medical degrees from schools outside of the United States.

This is one of the challenges of the H-1B visa: with tech specialties dominating the lottery, it's more difficult for other areas to get doctors willing to work at, well, rural doctor rates.


If it's anything like where I'm from, then you can't practice with a foreign license unless you convert it to a local one. The same organization that caps the number of doctors/lawyers is the one that approves or rejects the license conversions (permits), so foreign labor as a loophole does not work. When there were attempts to remove this, the doctors went on strike. The justification is that removing the cap on the number of doctors by allowing foreign labor would increase the variance in the quality of doctors and increase the risks to patients.


Programming skills are more universal and don't depend on the country, whereas lawyers and doctors operate under different standards and laws. It would be tough for an immigrant to justify spending 4-7 years studying the laws of another country just to take a chance at an H-1B. If you fail to obtain it, you're virtually unemployable back home and need to spend more time relearning the laws of your own country.


Developer burnout or attrition for various reasons is a real thing.

If salaries drop, even more programmers will leave the industry, so it's a natural balancing effect.


Programmers don’t have to go to expensive law school or medical school. Programmers don’t have to pass the bar. Programmers don’t have to work 80+ hours a week for years in residency. Programmers don’t have to work face to face with infected people and put themselves in harm’s way.

Times are, imo, much better for programmers than doctors or lawyers right now.


That makes me think: doctors and lawyers have intensive training to minimize risk to other people, and each day our field has more impact on others' lives. I wonder if we are heading toward something similar...


> have intensive training to minimize risk for other persons

> I wonder if we are heading to something similar...

We can't, unless there is, like, one standard CPU architecture, operating system, etc. for servers, desktops, and mobile phones, with minimal updates...

Think how much there would be to study and memorize - and what about the zero-days...


Lawyers have not had good times since the GFC. Biglaw stopped being as lucrative and no longer needs legions of associates to grep dead trees.


Lawyers weren't doing too hot about a decade ago, especially at entry level. Part of the problem is that just about every university wants a law school: revenue is similar to a med school's, but with far lower operating costs, so it's basically free money.


Lawyers and doctors are certified by their respective boards and bars. Individuals who aren't certified can't perform tasks that certified individuals can without non-trivial legal consequences.

I'd say they aren't really the same.


Actually, AI will catch up to doctors and lawyers much faster than to programmers.

In addition, programmers have many more domains they can switch into; once the AI lawyer or AI doctor rises, it will practically replace 90% of lawyers/doctors.


This is true for certain activities like structured information retrieval; legal opinions and binding decisions are not what is on the line.


Right. But most of lawyers' and doctors' work (let's say 95%) is routine. The problem with fields like law and medicine is that you invest heavily one time (i.e. at school) in a fixed body of knowledge in order to enjoy a lifetime of protected earnings.

Note that the earnings are protected mainly because you invested the time, and not because of some union.

The Achilles' heel of those professions is that the body of knowledge is constant, with a very long half-life. But this also makes it ripe for automation.

With AI, the destruction of those professions will be quick, as they will get destroyed from the bottom. I.e., even if AI only makes existing lawyers twice as productive, that would mean you need half the lawyers. If AI makes them 4x as productive, you need 25% of the lawyers, etc.
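
Stated as bare arithmetic (a sketch; the hidden assumption is that total demand for legal work stays fixed):

    # Assumes the total amount of legal work stays fixed.
    for multiplier in (1, 2, 4):
        print(f"{multiplier}x productivity -> {100 / multiplier:.0f}% of today's lawyers needed")

The Jevons-style counterargument mentioned elsewhere in the thread is precisely that the demand side rarely stays fixed.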


With law I believe it can happen, but with medicine, not really.

Most visits to the doctor are not only about the sickness (or information about what medicine to get), but about socializing and having somebody examine you and discuss things with you. This part a machine cannot do. It is what is called the doctor-patient relationship. If you think this is not the important part, I could not disagree more.

I go to doctors only when I really need to, but the pattern I see around me is that people have real, long-lasting trust relationships with the doctors they see.


That's not true for lawyers as far as I know (at least in the UK; I suspect the same applies elsewhere). There's an oversupply of law graduates; getting a pupillage/training contract is very competitive, and fees are being cut.


Compared to law and medicine, programming is still a young profession. The field is evolving rapidly, and a lot of people are joining it. If the incentives remain as high as now, supply might eventually outstrip demand.


Lawyers and doctors have:

1) Very stringent standards, particularly in Medicine

2) Guilds that control how many people get a medical education

3) Skills that are often 'nationally dependent' and not always transferable

'Software Developer' can mean anything which has advantages and drawbacks.

I, for one, wish there were some kind of 'basic general certification', frankly. There are so many 'not very good' devs that I would like to be able to just completely ignore people without some basic qualifications.

The 'exams' might even require us to 'fill in some blanks' in our education because all of us have blind-spots!

It might be hard to do something like that from an examination level though.


Don't forget both lawyers and doctors also pay huge liability insurance premiums to practice.

And the licensure bars and boards also ensure some level of customer safety by having an independent agency to turn to in malpractice cases.


Because they are smart and have built up huge barriers to entry. In contrast, anybody can get into programming without any training.


In fact, within our industry we’ve nearly normalized developers spending their own time and money to pull more entry-level developers into the workforce. I’ve participated in several weekend beginner-bootcamp-type things, donated money, and will probably continue to do so, but considering the incentives is interesting. Of course these bootcamp things get peanuts of sponsorship from whatever big players are in town, but the real sponsors are the volunteers who spend their own time on them - and for what? Brownie points for helping beginners, or making tech more diverse, or so that tech companies have a bigger, cheaper pool of labor? It’s cynical, but the tech biz really is masterful at propagandizing people against their own interests and distracting us from the fact that they have made the most absurd amounts of money imaginable.


Is this actually true? Are CEOs lamenting their engineering labor costs like this? I feel like by now the cost of engineering labor has demonstrated its value over and over again, to the point where most CEOs accept that the cost of engineering is more or less a shared expense across competitors, and not really something to be reduced.


Well, we have a couple of particularly vocal CEOs here (Quebec, Canada) who are always moaning (sometimes on national television) about how they cannot find talent or how gaming companies "steal their talent". All the while they pay comparatively low salaries compared to what you can get working remotely for a US business. The worst case is when they IPO: they get a fat exit and the workers get nothing, because companies don't generally give equity here.


> or that gaming companies "steal their talents"

As a former developer in the game industry, I find that difficult to believe. The average career of someone in the game industry is ~3-6 years[1], as devs get chewed up, burnt out, and then discover that by leaving they can (usually) have greater work/life balance for about 50-100%+ more money.

I lasted about four years myself, and as much as I'd love to work on games again, I couldn't accept the pay cut or the long hours. So I just make games in my free time now. I may take a 3-6 month sabbatical at some point to work on it full time again for a little while, but it'll be my own projects.

They must treat their devs pretty poorly and/or pay garbage salaries there if gaming companies are successfully stealing devs away.

[1]: https://www.gamedeveloper.com/business/the-great-video-game-...


Most likely they are just lying


Unfortunately yes, especially at companies that are not IT companies but have a fair percentage of IT employees and salary bands. They don't like having different salary bands between functions like finance, marketing, IT, logistics, HR, etc., so there is pressure to keep IT salaries in line with the other functions. Even if the IT guy with 2 years at the company brings 10 times more value than the 2-year bean counter, "equality, diversity and inclusion" gets interpreted as "pay all people on the same salary band the same" - especially when finance/accounting is mostly women* and IT mostly men, and you, as the company CEO, want to eliminate the "gender wage gap".

* At least in my country for most, if not all companies.


These are the kinds of companies that can't keep good talent.

They then go to consulting companies and pay above market to get someone like me in when they need new systems built.


Depends on where. I've heard the complaining openly at some companies. My former employer openly called American engineering salaries too high during an all-hands.


The value of an engineer depends on the business. When software engineers are scarce, their compensation increases until the market has priced out sufficiently many potential employers. Those employers will then decide that they are better off not hiring a (good) software engineer than hiring one at the market rate, because the engineer does not create enough value to justify their cost.
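
To make that concrete, a toy sketch with entirely invented numbers:

    # Each employer hires one engineer only if the value created exceeds
    # the market wage. Values are invented for illustration ($k/year).
    employer_values = [90, 120, 160, 200, 350, 500]
    for wage in (100, 150, 250):
        still_hiring = sum(1 for v in employer_values if v > wage)
        print(f"wage ${wage}k -> {still_hiring} of {len(employer_values)} employers still hire")

As the wage rises, the marginal employers drop out, which is exactly the "pricing out" described above.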


In Eastern Europe they are lamenting because any dev who can speak English is getting rates that they cannot compete with.


Unless you've raised a ton of money, I see no way that a new company can compete at these prices. With one decent engineer going for $160k all-in, there's no way to do that if you're bootstrapping.


>> The CEO’s job is to make as much profit as possible. Higher salaries mean less profit. It’s just how the game works.

I find this too simplified. The CEO's job involves far more than salaries alone.

>> Even if we enter some new dotcom bubble and a good chunk of our startup ecosystem dies off, demand for tech is spreading across all industries, businesses everywhere are digitizing their operations and these systems will need to be maintained somehow.

We're not in a bubble. We are in the information era, and cloud computing is helping drive it. The demand for programmers will go away when the human population saturates on robots and AI. We've barely scratched the surface.

The only other possible downturn I can think of is a social reaction that rejects technology and enters another type of 'dark age'.


> We're not in a bubble. We are in the information era, and cloud computing is helping drive this.

We're not necessarily in a bubble, but perhaps there's some foam?

There will be a plethora of programmer jobs for some time to come, but we've also gone quite a while without a meaningful market adjustment, if you will. A point may come where global politics force the industry to be far more regulated than it currently is, and the days of being a self-taught coder with no degree and a poor understanding of security may come to an end.

Maybe that doesn't happen, but if it does, I see lots of jobs disappearing and evolving into new ones, and a fraction of people in software deciding it's not worth it anymore. The world has only just started to truly realize how we effectively control information, and if the sentiment towards Silicon Valley gets worse then there may be more calls for regulation and new regulatory bodies; in which case jobs may open up within that new bureaucracy.

That hypothesis would be irrelevant if the industry were operating at a moderate pace. There's a lot of money still coming from somewhere, and backpressure can create a wave that pushes revenue back in the opposite direction, even if briefly.


> There will be a plethora of programmer jobs for some time to come, but we've also gone quite a while without a meaningful market adjustment, if you will

We just went through a massive market adjustment. COVID forced a lot of layoffs and hiring freezes in March 2020 (even in tech) and all the COVID-boom stocks like Zoom, Peloton, etc. have all crashed 50%+. Valuations in private funding are being cut as we speak due to interest rate changes. But salaries for senior engineers have only gone up.

> self-taught coder with no degree and a poor understanding of security may come to an end

This was never a common outcome in the first place, despite what blogs and bootcamps might have you believe.


Yeah, it's an idiotic, one-dimensional proposition in a game of n-dimensional chess. Tech employees are competition for a company's profits, just like all the other companies trying to commoditize its complements (https://www.joelonsoftware.com/2002/06/12/strategy-letter-v). Starting with AWS. Treating programmers as cost centers was precisely what hindered the previous generation of companies in competing with tech.

There may well come a time when a company can compete for gross margins without competing for tech employees. But it'll require a case more rigorous than this blog post.


Exactly - this is real progress. We are automating whole swaths of our business sector, and it’s making us way more productive and profitable.


What about when everyone becomes a programmer (now) and starts a race to the bottom?


I don't think everyone will want to become a developer, and it's not a job just anyone wants to jump into. Compare this profession to, say, the legal industry (the first example I could think of), where being a good lawyer/barrister is a high-paying profession. The law career may have a saturation point and be regionally limited, but it still exists and pays well in certain situations.

I could see entry-level developer job salaries drop, but rise rapidly with experience. That would be my first indicator of a market correction.


Not everyone has the cognitive ability to become a programmer.


You don't need to be a genius to be a programmer. Yes, some people aren't capable of it, but there's a lot of middle ground. Being capable of being a programmer is more about disposition, imo, than intellect. I've had at least 2 people I don't regularly talk to ask me about getting into it as a career switch, and I say that it's fine, but more miserable than you'd expect.


I don't see it happening because I mostly see business people who don't even want to compete with me.

They just don't care about the same things I do care about.

I see people who won't even spend 30 minutes on learning how to use their computer in a better way that would save them hours of clicking.


Just saw an article earlier about Google lowering salaries in some state.


Boom and bust has always been a part of this business. It was hard to find jobs as a programmer in the early '90s, the early '00s, and around the financial crisis in 2008. You can be sure that there will be a bust again, simply because hype and optimism eventually overshoot.

It has nothing to do with "no code", more productive development environments, or end users becoming programmers. That pressure has always been in the business, but the increase in productivity has always been gobbled up by increases in complexity, expansion of scope, or automation of new business.


> I can’t think of a single product that was able to get to any meaningful degree of success running on top of some low code platform.

I don’t know if this is just wildly ignorant or intentionally misleading, but thousands (maybe millions?) of businesses and organizations use low-code platforms to build their internal DBs, inventory, and POS systems, among other things. Access and FileMaker are well-known examples that lots of small businesses have used over the years. Bigger corps use all kinds of form generators, DB query systems, RAD tools, and low-code environments. These days devs are using low-code game engines to make games that have hit the top-10 lists on your favorite app store.

Demand for devs is still going up, and low-code platforms aren’t going to suddenly change that, but this point should probably have been left out, because low-code platforms are hugely successful and growing, and devs are still in demand despite the success and increasing market for low-code platforms. It’d be interesting to explore why, but it’s just wrong to claim low code isn’t successful.


The low-code examples listed are not code. They are forms that essentially pull data from a database and display it with little if any interaction, validation, or processing. These tools have been available for decades.


Right, and this is exactly what the author was referring to and claiming has never worked. The claim by the author included all “no code” platforms, so both sides of the line you’re splitting.

MS Access certainly is a coding environment, though - enough so that there’s some debate over whether it’s "low code" or not. It comes with a programming language and APIs. The same goes for something like GameMaker. Perhaps by definition, low-code and no-code platforms are typically quite domain-specific. And lots of no-code platforms grow into low-code over time, adding more control without getting so technical that they need people with CS degrees.


> Developers are expensive. If a new IDE shows up and they can deliver stuff 5% faster, that means you can have 5% less programmers! Of course the IDE has a price, but it is almost certainly cheaper than 5% of a developer.

Dev work is nebulous enough that very few companies seem willing to invest in dev tooling.

I've heard nothing but good things about JRebel for example and how it keeps people focused instead of sword fighting during long builds.

I am aware of one company that uses it, and they only allow it for senior devs.


If you just made all your programmers 5% faster, they are each more productive. Now you should hire more programmers, because there's a wider range of things they can profitably work on.


Yes, this.

There are a lot of engineers out there who spend most of their time in meetings, or writing design docs, or reviewing code, or doing a few dozen other things that are not actually writing code. Much of that can’t be automated away. If you could get a machine to do much of the programming, that just leaves you with more time to work on the harder problems the business has to deal with, which likely means you’re able to deliver more value, and thus deserve more money.

And just think about it: what were Cobol and Fortran developers paid back in the day? Not nearly as much as an engineer with comparable experience and education today. They were also a far rarer breed. Engineers are paid much more today because the better tools we have now make us more productive and more capable of unlocking business value. Paying an engineer $500K today is worth it if they’re able to unlock $5 million in profits.


In essence, the Jevons paradox - https://en.m.wikipedia.org/wiki/Jevons_paradox - where under certain conditions (which plausibly apply to programmers) increasing efficiency results in an increase in consumption, not a reduction of it.
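
A minimal numeric sketch of how that can play out for programmers - the 12% demand response below is an assumption, chosen to be elastic:

    # A 5% productivity gain cuts the cost of a unit of software work.
    # If cheaper output makes 12% more projects viable (assumed elasticity),
    # total demand for programmers rises rather than falls.
    devs, speedup, extra_demand = 100, 1.05, 1.12
    units_demanded = devs * extra_demand     # 112 units of work now pencil out
    print(units_demanded / speedup)          # ~106.7 devs worth employing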


I’m allowed to expense pretty much any tool at work as long as I can justify it. I’ve never actually been asked to justify one, but that’s the rule. I think I probably expense a hundred bucks a month on tools.


Care to share which tools these are? I've been in the field for several years and can't remember the last time I shelled out money for a tool. What cool stuff am I missing?


Mostly ngrok, Grammarly (for documentation), Postman, and IntelliJ.


Please make a list for the rest of us and post about it.


I shared it in a sibling comment.


> No code / Low code

> People have been trying this for decades and it never really worked out. I can’t think of a single product that was able to get to any meaningful degree of success running on top of some low code platform.

Every. Single. Squarespace. Website.

You probably interacted with several today without even knowing it. They’re everywhere.

No code solutions have almost entirely killed the need for developers of shopping carts for small businesses. That was my bread and butter in the mid aughts.


My impression is that low-code/no-code fails when it tries to be a generic, domain-agnostic solution for everything. People then realize that programming languages are complex for a reason.

However, when it is applied in a highly domain-specific way, like the shopping carts you mentioned, it's quite good. Basically, when that happens, actual programmers are solving the programming-related part of the problem and the product people have actually invested time to think about what parts must be customizable through low-code/no-code. The smaller problem space means better opportunities for simple tools.


Yeah, but you won't build a Squarespace competitor on top of a low-code platform.

That's the way of thinking I expect the author is following.

If you want to run a webshop, then yes - you probably don't have to write a single line of code.

But then you get "WordPress developers" - people who configure WordPress for others - and this is where the fun begins: WordPress is mostly installing plugins plus some configuration, and business people should be able to configure it. Yet you still get those "WordPress devs", and don't get me started on SAP consultants :D


It seems like the entire economy is one great big software project, running late and adding as many people as possible. What did Fred Brooks say about that?

I think it will end when the rest of us can't produce enough to keep the programmers fed.


A lot of programmers work to automate and manage huge shifting businesses, saving sometimes 100x more than they cost. Many others work in effectively the entertainment industry: social media, actual entertainment, apps, and games.

I don't think the entertainment industry will ever stop: there's always someone with money and an idea to capture the world's attention. I don't think businesses will ever stop growing and evolving, therefore needing programmers to migrate and rework their systems.


I think it ends with everyone being a programmer of some kind, and with the field splitting into more distinct fields.

Like comp sci split from math, and with all the specialisations we see already, it has to go that way.

You don't just study to be an engineer or a scientist anymore; you become a "microbiologist" or a "structural engineer", and a similar thing will happen to CS + IT.

Already, around the second or third year of university (if not earlier), someone interested in web dev will have a vastly different skillset from somebody interested in embedded. Someone interested in DevOps may have no clue about how to organize your stack when writing x64 assembly, and some software forensics person may not be able to tell the difference between React and Angular.js.

Already you don't hire a "programmer"; you hire a "frontend engineer" or an "embedded C programmer", etc.

I believe low- and no-code solutions will continue to be added to whatever software they can be, making more "regular" people from other jobs aware of what conditions, loops, and variables are. They may never become programmers as we know them now, but they will be (and already are) programming their machines.


As a web/fullstack dev with over 10 years of experience, based in the EU, I don't have this impression at all. Actively applying and interviewing, I still receive absurd requests to solve various quizzes, teasers, and assignments. Potential salaries top out at a miserable 70-80k EUR in countries where monthly salary deductions reach 40%. It's better for me to basically sit idle and work on some personal projects.


>Actively applying and interviewing, I still receive absurd requests to solve various quizzes, teasers, and assignments. Potential salaries top out at a miserable 70-80k EUR in countries where monthly salary deductions reach 40%.

Let me guess, France/Austria/Belgium?


Germany, Denmark, and even the Netherlands after the initial 5-year discount... list me one country where it doesn't apply...


The Netherlands at least has some big tech willing to pay six figures and up, while that's almost never the case in places like Austria and Belgium, which still have insane taxes.

>list me one country where it doesn't apply

If you can move to Romania and freelance, you can pay as little as 1% tax. And if you freelance for six figure US gigs you've basically hit the jackpot.


In post-Communist countries you have social services, healthcare, and infrastructure at a level worse than in the US. You're alone with your cash, and every doctor, agent, and clerk looks to screw you over. It's attractive only for natives with family locally, or maybe for someone fluent in the native language. No salary would convince me to relocate alone to Romania.


>In post-Communist countries you have social services, healthcare, and infrastructure at a level worse than in the US.

True to a degree, but not everyone can get a visa for the US, and unless you've got an outdated view of the eastern bloc, safety there is far better than in some shady parts of some American cities. You can live without a car and walk everywhere without the fear of being mugged by homeless people or opioid addicts.

>You're alone with your cash and every doctor, agent, and clerk looks to screw you over.

That's true nearly everywhere you move as an expat without knowing the local language, laws, and customs - even in Germany.

> It's attractive only for natives with family locally, or maybe for someone who is fluent in the native language.

Not really; plenty of entrepreneurial people in the tech space from Germany, Austria, the UK, etc., looking to make money and tired of their governments robbing them blind, relocate their online businesses to Romania for the massive tax savings.

>No salary would convince me to relocate alone to Romania.

Understandable, but you asked me to point you to a country with lower taxes than those, not to a country with a perfect social system AND lower taxes.


Boohoo, you only earn 70-80k EUR gross. And you get to live in Europe as well?? You realize the average gross salary in (western) Europe is around 20-30K EUR? Even lower in eastern Europe? You are one of the luckiest, most privileged people alive. Yes, there's always a bigger fish / higher salary. But christ, get some perspective.


I pay 40% income tax (56% marginal), 25% sales tax, 35% electricity tax.

It means 80k doesn't go as far as it'd seem, especially with increasing interest rates.

Considering Americans are on 200k+, it just doesn't compare.
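
Spelling that arithmetic out with round numbers (exact rates vary by country and bracket):

    gross = 80_000
    net   = gross * (1 - 0.40)   # 40% average income tax -> 48,000
    real  = net / 1.25           # 25% sales tax baked into prices -> 38,400
    print(net, real)             # what an "80k" package actually buys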


I doubt it, to be honest. I think that's just the bragging, survivorship-biased subset at the top enclave of salaries you can get in crazy-expensive places like SF, NY, CA. And you still have taxes there, and (health) insurance costs a lot more.


The tax burden in those areas (all-in, including federal, state, and city) reaches 50% at the highest tiers, but 50% of $200k+ is a lot of money. Health insurance at bigger companies is usually very good, with you paying around $400/month out of pocket for your family on a low/no-deductible plan. That comes out pre-tax, so the true cost is lower, depending on your tax bracket. The end result is that insurance ends up being only a couple of percent of your income at that level. That highlights the main problem with health care in the US, which is that your quality of care depends entirely on your employer.


I don't consider this entitled. The highest income tax rate kicks in at around 55-60k EUR, and any amount above that translates into a minuscule increase in net salary. Meanwhile, EU companies use the same recruiting tools and platforms as US companies who pay >>100k. No, Amazon and Google and silly Berlin startups: with the salaries available over here, I'll not spend 2 months studying for your cutthroat tests and assignments.


>Boohoo, you only earn 70-80k EUR gross. And you get to live in Europe as well?? You realize the average gross salary in (western) Europe is around 20-30K EUR?

Have you looked at property prices in western EU cities? Those salaries don't buy you anything decent. Meanwhile, in places like the bigger cities in Texas, any skilled employee can pay off a big house in no time.


Regardless of whatever trends exist now, every programmer must spare a thought for their future. Ageism is common in this industry. No matter how great a programmer you are, you will eventually be discriminated against. And this line of work is notorious for giving people age-related diseases.

If you have a gravy train going for you, ensure you plan your retirement well and take care of your diet, exercise, and relationships.

Nothing really lasts forever; even if it does for a community, individuals always run out of luck. Instead of thinking of yourself as a superman, imagining yourself as a mortal with limited resources, time, and luck will leave you better placed for the future.

YMMV.


When I was 10 years old I sat next to a guy on a plane who sold linoleum tile. He said he used to be an aeronautical engineer, but lost his job when the bottom dropped out of his industry.

I've always remembered that, and try not to take the good times for granted. But damn, it's been going on 25 years now, and except for little hiccups in 2001 and 2008, demand for programmers is still white hot.


Other people know more than me, but the times are good now because the VCs are playing a Ponzi scheme (or greater-fool theory). Each VC puts in money simply because the valuation will go up and there will be another VC wanting in, willing to pay more. VCs have a lot of cash for whatever reason - low interest rates or oil money. Hot take: this is not dissimilar to what is happening in crypto.

We developers are just part of this game, incidental to the story. We are paid more simply because a lot of money comes in from the top, not because of some nonsense like us being "highly skilled" or the supply being low.

Yes, I do believe that most of tech is a zero-sum game; hardly any value is created.

Where does it end? It ends when the VCs get bored of this game and move on to other things.


Very few comments in this thread address what he concludes is the most likely scenario: "decreased investment". Just what will happen when the Fed is forced to raise interest rates? What will happen to the unprofitable unicorns - or even the unprofitable public ones - much less all the no-name startups burning VC capital? Surely not every Uber is going to get the Amazon treatment and be allowed to keep burning until it finds a way to profitability.

Of course, if ZIRP is reversed, who knows what will become of the economy in general.

https://www.readmargins.com/p/zirp-explains-the-world


It's already over. You can get over 1% on CDs now.


The author seems to have a zero-sum world view and to think that everything that goes up must come down. I personally believe neither. Tech workers will continue to be in increasingly high demand, and automation, AI and "no-code" or "low-code" will only intensify that trend, not diminish it.


Agreed. Especially around three to ten years after the no-code solution is implemented, when it needs a full rewrite in order to do one of the following: decrease vendor lock-in, satisfy audit requirements (i.e. meaningful peer review), incorporate professional change management, or scale beyond what the relevant no-code framework was designed for.


> think that everything that goes up must come down

It doesn't take a pessimist to remember Newton's apple.


No, but it does take one to nonsensically apply the same rule in literally every context. People and programmers are not apples. The economy and society are not gravity. I don't know what else to tell you, and I guess it doesn't matter, since people with a zero-sum world view can't be convinced otherwise.


There's a difference between believing there is a zero sum world view, and observing that every economy that has ever existed has generally operated on boom and bust cycles.


I don't think anyone is disputing that the economy has boom and bust cycles. But despite the Great Depression, two world wars, the Cold War, the dot-com bubble, the '08 crash, insane environmental destruction, climate change, etc., the economy is growing, the global population is growing, and rather than more and more people fighting over less and less resources, pretty much everyone everywhere is healthier and more prosperous than ever, and extreme poverty is on track to be completely eradicated in our lifetimes. (This doesn't mean there aren't exceptions, or that there isn't still lots to do to address climate change and global economic and health inequality, but it does prove the peak-oil doomer zero-sum world view wrong.)


We have been living at the sufferance of zero interest rate policies for the last dozen years. It's hard to say what happens when the music starts slowing down.

It just sounds like an extremely hubristic statement, mocking the very fates themselves, to claim that unlimited economic growth will continue indefinitely, that this era of cheap capital will continue indefinitely, and there wouldn't even be one economic downturn or hiccup to disrupt the gravy train. This is the stuff that "things that aged poorly" is made of. Optimism is one thing; irrational exuberance is another. I would rather not laugh in the face of whatever the Fed is planning to do about inflation.

Hell, this was published just hours ago.

https://www.bloomberg.com/opinion/articles/2022-02-20/federa...


Literally no one is saying there can't or won't be downturns or hiccups - e.g., two world wars definitely qualify as both. But even they couldn't stop the overwhelming, centuries-long global trend toward simultaneously increasing population AND prosperity (which, btw, definitely did not have zero interest rates). But hey, don't take my word for it - see for yourself at e.g. https://ourworldindata.org/extreme-poverty


Then I simply disagree with your framing of "everything that goes up must come down", which seems to ignore the existence of downturns and hiccups, and "tech workers will continue to be in increasingly high demand", which would be a much more questionable statement during such a downturn or hiccup. Such an economic slump would not be a pleasant time for anyone, even tech workers. It is good to be on guard against such an eventuality.

It’s also quite the fallacy to compare overall economic growth with one sector, or even one region. Ask how the Rust Belt has been doing these days in comparison to when Detroit was at its acme.


For people that are entrenched in the industry, 10+ years of experience, specialized skill sets, etc. I really don’t think the genie will be put back in the bottle. The people that will be burned the worst are newcomers or people with narrow skillsets in areas that fall out of fashion.


It's worth considering that the number of programmers/technologists working in an organisation serves as a signal for the organisation as a financial instrument. An organisation wants to show growth in terms of revenue but also in terms of its capabilities, and frequently the "human capital" of the organisation is a metric that gets reviewed here.

If you've been in a growing tech company you've probably seen this as colleagues being hired when you're not clear why, and the organisation actually preferring solutions that require many more employees.

There's also just the factor that – today at least – technology management has a lot of social power within organisations. Most managers want to grow their little fiefdom, which usually means more headcount.

Anywho, I guess I'm just saying there are a lot of factors completely unrelated to automation that incentivize hiring more programmers. Many of these factors are not rational for the organisations themselves – managers growing their headcount to the limit their structure will allow. Some of them are, though! I.e. the company as a financial instrument rather than as a business.

The model of a business selling something and optimizing processes for maximum profit is in there for sure, but from my experience there are a lot of other factors that end up building a very incoherent whole.


Everyone in the US or another wealthy country pushing for 'remote work' is ultimately outsourcing themselves.

In the short term we get 'remote work' which some see as a benefit.

In the medium term, absent the apparent advantage/willingness of people to be 'on prem' (in the minds of employers), it's just as easy to hire someone at 3/4 the price from Europe, 1/2 the price from Eastern Europe, or 1/4 the price from India.

While 'language and custom' do present real challenges, let's not kid ourselves: the CFO can use very, very powerful language to push for 'offshoring' because they will talk in terms of 'raw dollars'.

The 'advantage' of devs. from wealthier nations is abstract. Their costs are not.

In much the same way that companies made a fairly narrow choice to jam everyone into 'open work spaces' - which drives a lot of people nuts and, I think, harms productivity, but is quantifiably cheaper - companies may opt to 'cheap out'.

Many will.

I'm wary that the material realities of on/off prem, language, culture, time-zones, communications etc. will be lost in the mix.


This framing of 'cheaping out' or 'the US and other wealthy countries' is a little bit behind the curve. Tech talent physically moves from HK to the mainland nowadays; the salaries in some Eastern European countries are higher than in Portugal or Spain.

It's not cheaping out as much as it is diversification at this point, and give it another generation and it'll simply be people having the choice to work anywhere in the world.

Companies don't just 'outsource' any more; they primarily want a presence in, and access to, incredibly large markets with domestic talent.


You're completely ignoring Canada and Latin America. Good luck dealing with timezones and language barriers.



Despite efforts to make the job of programming easier, the reality is that the scope has just expanded. The cloud meant you had to learn cloud computing in addition to Linux management. Containers add a whole other vertical of problems. It just goes on and on. The need for expertise doesn’t go away with more abstraction.


I actually think high quality programmers will always be in high demand. Poor programmers will get found out sooner, as the general population becomes more technical.


When will they become more technical, though? My kids are far less technical than I was, and their friends as well. Try as I might, they simply aren’t interested. Is some other part of the population experiencing some kind of technical advancement among their kids? I don’t see it.


I think there was a generational sweet spot.

That is, the first generation where computers were affordable enough to be 'personal computers' in the home.

Before then kids weren't able to be exposed to computing early enough, and after then games became shrink-wrapped so kids didn't have to become technically minded to get their games running.

Children growing up with a computer at home in that window were doing all of that tinkering themselves, just to get anything to run.


The general population isn't becoming more technical though; as an example, almost everybody drives cars but most people aren't mechanics. People see cars and tech as a means to an end, which is perfectly fine by me. They just want to get their job done.


In the same way that giving iPads to grade schoolers does not make them more technical.


Short of a dot-com bust, I don't see a glut of software engineers. However, as the career prospects become better known, more people will decide they want to be software developers. Will this increase the talent pool enough to soak up the demand?


I think the long-term trend will be good for software people, but I think something like the dot-com crash is due. There are way too many questionable startups made possible only by excessive investment money floating around. I was around in 2000, and once investment started drying up even a little, the whole house of cards came down, because a lot of companies "sold" unprofitable products to other unprofitable companies. I think the situation is very similar now.


I feel like this has already been happening for years... probably longer than that. I find a lot of the people I work with were pulled away from other industries. Some industries are over-represented, like science, engineering, education.


I think the number of people who are capable of and willing to do software development is limited. Most "normal" people I know would hate the job.


I can't remember the last time my job bottleneck was writing code to implement requirements.

The bottleneck is almost always migrating a legacy system, getting buy-in, interviewing users to design the system, or waiting on requirements to be finalized.


The bottleneck for me is always myself.

Getting started can be hard sometimes.

Getting into the flow of programming where code comes swiftly is tough.

Getting stuck on problems by trying too hard to optimize / design / abstract is common.

At the end of the day it's really these things that prevent me from being a better programmer. I have nobody to blame but myself, yet nothing I do really seems to move the needle. Presence of mind and clarity of thought are both fleeting resources. I have to strike while the iron is hot, and the iron is (unfortunately) not always hot.


I used to struggle greatly with ADHD, but recently got on Vyvanse and that was a HUGE help for me.

My older brother also struggled, but he got on Prozac for his anxiety and has suddenly become a freaking SUPER HUMAN. I've considered dropping the Vyvanse and switching to Prozac because I have multiple friends and family on it, and the difference between before and after is STARK. And POSITIVE.

It is amazing to see someone who previously struggled with anxiety, to the point that I thought this friend DESPISED ME, become a friendly pal of mine after he found a way to manage his anxiety (Prozac).

Our brains are not designed to sit in an office and coax functionality out of a computer. We used to be hunters and prey creatures, and that lizard brain still lives in our bodies, constantly reminding us that a sabre-tooth tiger could be lurking nearby. Or that the other human across from you may try to steal your chance at successfully breeding, etc.

Point being, humans have a lot of emotions in modern life that are not always helpful for the way we now have to live.


Interesting. I took Prozac a few years back and it didn't do much for me, but I wonder if I was on too low of a dose (20mg). Can I ask what strength your brother is on? Also, I'll consider asking my doctor about Vyvanse. I was diagnosed a long time ago with ADHD but haven't been taking meds because I don't like Adderall... something about taking an amphetamine for the rest of my life doesn't fit well with my generally sports-oriented / physically active lifestyle.


Your honesty is relatable and refreshing.


More code is being written every day than is being retired. With an ever-growing code base, comes the requirement for more resources to maintain it. Demand for developers is outpacing supply.


I think the current bubble is very short-term. It has been caused by 10 years of low interest rates pushing a lot of capital into startups. The recent "next level" hysteria in the market has been caused by stimulus packages in the US and elsewhere. A lot of what we're seeing is very reminiscent of the .com bubble.

That said, I do think underlying the froth is a genuine increased societal demand for tech and programmers, so I don't expect the collapse to be as bad as .com outside of crypto and other hyper-bubbles. Yes, there will be a lot of job losses, but most engineers will just experience a salary correction rather than years of unemployment.

In the long-term, a move away from traditional programming will be necessary to solve the supply problem. Tools like SquareSpace, which are really awful but do genuinely replace a lot of low-end web dev work, will eventually mature and begin to eat up a decent amount of engineering demand. We'll see lots of these Excel-like apps that can serve low-end business needs reasonably well and remove the need for engineers.

I think there are a lot of opportunities making great creative tools for programming-like tasks, just as we've had a few decades of innovation in graphic design or publishing tools - there are lots of very valuable startup opportunities waiting for those who have the motivation and patience to figure out how to help non-nerds do nerdy things.


What's so awful about SquareSpace? I've recommended it to a few people who were after a website.


It's a slow, clunky imitation of what could be. It's a useful tool only because it's the least bad such tool we have. I would also recommend it and similar tools to non-technical folks.


If you look at the data, software jobs are growing far faster than new graduates in computer science. A quick Google search just found well over 250,000 open positions in the US, with about 50,000 new graduates per year (roughly five openings per graduate). I'd bet we have decades more of high demand and high salaries for people with CS degrees. I wouldn't be surprised if low-code tools get good enough to power many websites, though, so I wouldn't be quite as bullish on code-camp graduates seeing the same growth in salaries.


Maybe if we didn't make "Teaching Computer Science" so much worse a job than "Being a Software Engineer" in so many ways. Of course, then we might accidentally start making "Teaching" a good job for folks, and that apparently is completely counter to the American ideal.


The question is: how many of these jobs pay enough to comfortably provide for a family, buy a house and retire, and have enough in the bank to not have to worry about money?


Not many jobs will offer those end results, CS-related or not.


>The CEO’s job is to make as much profit as possible

It doesn't have to be, especially if the company isn't public. Their job is to run the company successfully. Part of that is keeping their employees happy and retaining talent.

>If a new IDE shows up and they can deliver stuff 5% faster, that means you can have 5% less programmers!

That's never how it works though. As workers have become more efficient they have been required to output more and more.

Overall, a lot of the claims in this blog post are unsubstantiated.


>> I think this is the most likely scenario. Much of the demand for programmers right now stems more from expectations of future profits than from profits being currently made.

This is certainly true for companies that sell software. Revenue comes from sales of software that already exists, and developers are paid to create the next version. Software companies are in effect charging rent for something they have already developed that has essentially zero marginal production cost. SaaS takes this to the next level. Most of these companies are ripe for destruction by Free Software unless they provide something of exceptional value that is hard to replicate. To me the examples are things like Mathematica, or high-end CAD or FEA software. PCB design has recently been conquered by KiCad 6, though there are (apparently) some very high-end tools that still have exotic capabilities. Blender is really taking a shot at animation software, but there is still plenty of room at the high end.

Most software written today is not product software, but internal software to automate things, or otherwise make a business run. Until FLOSS takes over in this space there will always be a need for programmers simply because new companies are always being created that need automation.


Although supply grows at a fairly constant rate, demand is still growing exponentially.

Programming has long been divided into two classes of developers: blue-collar-type programmers who translate requirements into code, and white-collar-type engineers who gather requirements and design concepts.

I think the big demand here has always been for the latter, but companies still need the former, whom they can train and skill up into the latter.

I don't see this trend slowing down for the next 20-30 years. Hell, even by 2025 there will be a deficit of over 150 million. I don't think we can train or repopulate fast enough for that demand, which will only increase each year.

Times will be great for knowledge work jobs in the tech industry. Not just programming. Programming experience will only become more required for non-coding jobs. Especially any people management or leadership type role.

https://blogs.microsoft.com/blog/2020/06/30/microsoft-launch...


> No code / Low code...But hey, deep learning was in a similar spot in the 90s and now we have real products that use it and it mostly works.

The major difference is that deep learning became viable because parallel computing became 3+ orders of magnitude cheaper. Low-code isn't held back by a technical limitation like that, so I don't see anything changing anytime soon.


I don't really see low-code replacing developers. You'll just see low-code specialists, like you do right now with CRMs and things like MuleSoft.


As to what might happen regarding "Decreased Investment", this article [1] has an interesting analysis for those working in rapidly funded startups. In short it predicts a Minsky moment, but in relation to funding "speed" rather than "value".

> Does the compression of timelines in venture change the distribution of terminal outcomes for venture-backed companies? [1]

> A Minsky moment is a sudden, major collapse of asset values which marks the end of the growth phase of a cycle in credit markets or business activity.[2]

[1] https://pivotal.substack.com/p/minsky-moments-in-venture-cap...

[2] https://en.wikipedia.org/wiki/Minsky_moment


It doesn’t.

Software has been eating the world, and there’s no sign of it stopping.

Software gave people like me a way to escape a dire economic situation, and is the main reason for increased social mobility, as in having lots of options. That is attractive to every young person out there, and the demand goes up and to the right.


I prefer to see it in terms of power hierarchy. Programmers gained a massive advantage in the past decades because they got to dictate to everyone how to work - they moved higher in the power hierarchy. This has pushed the wages of everyone else lower and lower, while tech companies have been making absurd margins (yes, there is a lot of zero-sum in it). But now free money is over and inflation is showing its teeth very sharply -- here in Europe it is going to be a massive problem.

What you will see is more and more tech-savvy business owners in all other sectors who will rein in the overpriced snake-oil salesmen of big tech and put developers back in the lower ranks of the cog hierarchy.


All the inflation around appears to be driven by supply-side issues (cars, anything with semiconductors, transport issues, …) rather than the demand side. There's definitely more money sloshing around, but everyone appears to be using it to jack up equity prices.

Any other price rises I can spy right now appear to be opportunistic profiteering rather than excess demand. E.g. the used-car market is already back to many cars not moving since Christmas because they're overpriced, whereas last summer there was a legitimate shortage and no used-car advert stayed up for long.

That said, it doesn't really matter whether the inflation has a supply-side or demand-side cause; there will be effects simply because enough people believe they will come. A self-fulfilling prophecy, to some extent. I'm not sure the effects will be the predicted ones, though; I suspect a less rational outcome, because the basis for this is not rational ground.

As for programmers, I'd have to take the other side of that bet. No-code has yet again proven itself to be the same useless junk it was in every other incarnation of that idea. Code is infecting everything today. Good luck buying even just a TV without apps, never mind a new car without some junk entertainment system tacked on. Still no one has solved how to write good code: the hype of the moment still leads to building the wrong thing for the customer's needs and still produces software with very basic security vulns like XSS.

Until someone can solve making code writing reliable, the status quo will continue.
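To make the XSS point concrete, here is a minimal sketch of the classic reflected-XSS mistake and its fix. Python/Flask is my choice of illustration, not anything from the article or this thread:

    # Minimal reflected-XSS illustration (Flask; illustrative only).
    from flask import Flask, request
    from markupsafe import escape

    app = Flask(__name__)

    @app.route("/greet")
    def greet_vulnerable():
        name = request.args.get("name", "")
        # Vulnerable: user input is pasted straight into the HTML, so
        # /greet?name=<script>alert(1)</script> runs in the victim's browser.
        return "<h1>Hello, " + name + "</h1>"

    @app.route("/greet-safe")
    def greet_safe():
        name = request.args.get("name", "")
        # Fixed: escape() converts <, >, &, and quotes into HTML entities
        # before the input is interpolated into markup.
        return f"<h1>Hello, {escape(name)}</h1>"

The bug is a one-liner, which is exactly why it keeps shipping decades after it was first described.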


*flippant comment warning...

Three things coming together:

1. Things like SharePoint and PowerApps getting better
2. More exposure to programming/abstractions/scripting in the general populace
3. Widespread documentation of / agreement on standard business processes (maybe in the form of templates for number 1)

Underlying all of this is the moment people wake up to the fact that data and process matter; software doesn't.

There will still be developer jobs, but they will look more like how infra jobs look under cloud computing: higher skilled, harder, bigger implications when you mess up.

There is value in a few amazing coders out there, but IMHO most developers add the most value by getting to know how to abstract their particular customer's business.

EDIT:formatting


The article claims "Supply goes up" and "Every year we get more Computer science graduates."

Except that isn't true. The number of CS graduates as a proportion of all graduates is actually falling, while the need for programmers is steadily rising.


I don't know if it will happen, but a thing that needs to happen that will probably tamp this down: Liability, and/or other real consequences, for bad programming.

We have this for construction and other fields, and it's basically nonexistent here.


It ends the same way it did the last time. A giant tech market crash, more market consolidation, and massive cultural change.

At this point, things move so fast that operating systems are shipped with known bugs, security flaws, and half baked ideas. They are then patched via over-the-net updates. I think this will stop. People will go back to more traditional development paradigms out of necessity. They also won’t completely trash code bases that have been patched to the point of being reliable and somewhat secure just to make a new thing no one wanted.

My only other prediction is that people will eventually be more conservative about picking tech stacks. They will want proven track records.


> At this point, things move so fast that operating systems are shipped with known bugs, security flaws, and half baked ideas.

Wait, you think that wasn't the case 10, 20, 30+ years ago?

OS releases are far better now than they were pretty much ever.


What/when was the last boom?


The '90s dot-com boom, which ended in 2000/2001.


Talking specifically about the web: we're still not there with no-code tools, and I don't know if there's a future where the very specific needs of businesses could be met with building blocks. Sure, designs and basic features are covered, but there always seems to be the one feature outside the scope of what the tool can provide, and trying to shoehorn it in can send the project tumbling down. The tools are getting better, but I think we're a long way from where developers, at least strong backend or full-stack developers, need to worry about their job security.


Standups with the entire team of, say, 5 or more people are usually a waste of time for most of the attendees. Most attendees are simply listening to updates and questions between the lead and the other team members. Sure, sometimes an unrelated developer might have insight, but it is rare and can cause rabbit-hole conversations that waste other team members' time.

Would prefer a structure where an experienced developer (preferably in area of work) meets one on one for 10-15 mins max with each team member, provides advice or refers the team member to resources to aid if needed.


I disagree with this, but only under certain conditions.

The standup is meant to be a quick meeting, 10 minutes tops.

It's not there for deep advice between lead and each team member.

Instead it's there to get everyone on the same page.

The deep one-on-one meetings still happen separately - when needed.

And as you say, the added benefit of daily standups is that sometimes a team member has great insight into someone else's issue.

Of course, standups that take too long are terrible.

But if done right, they're helpful, and a lot of fun.


> Instead it's there to get everyone on the same page.

How do you get everyone on the same page if you cannot ask questions?

This whole standup theatre is complete bullshit. It's true it's a waste of time. You say it's only meant to be a quick "status update" 10-minute meeting. What is the point of that if asking questions is not allowed? And if it is allowed, then for the majority of listeners it becomes a time sink.

The best performing teams I've worked on had no standups. They talked to each other when there was something to talk about and they only talked to the folks who were genuinely interested in the conversation.


> How do you get everyone on the same page if you cannot ask questions?

Not sure why you're asking me this. I welcome questions in standups. And if a question warrants a deeper one-on-one chat, then at least everyone knows an issue has been raised (but often someone on the team knows the answer straight away). They hardly take up time.

I suspect we have radically different experiences of standup meetings, hence the polar opposite views.

In my case (16-year-old company, high-functioning team), daily standups are fun - hardly a chore - and you'd be surprised what surfaces in these team meetings (that wouldn't necessarily come to light if we were to simply rely on team members to speak up only "when there was something to talk about").

Just to be clear, I'm not criticising your lived experience. I don't doubt you for a second.

But let's be fair - it really depends on the team, and the environment, and the desire for daily standups to begin with.

It's a bit like project management tools. If abused, or used solely for the sake of ticking boxes, they're annoying. But if used correctly, and everyone is happy using them, then they certainly add value.

I suppose it's highly anecdotal.


It just seems highly unlikely that you are having 10 minute meetings while accomplishing anything beyond the usual small chit chat and banter that almost all social interactions require. It takes ten minutes alone to just gather a group of people, look at each other, and say hello.


I can understand why you’d think that.

If I may challenge your idea: what you describe only applies to teams that don’t meet often.

My team has been doing this daily standup for years. Those 10 minutes are succinct - and effective. We save the chit chat (of which there’s plenty) for our lunch breaks and after hours.

I wouldn’t blame someone for doubting it’s possible. But yeah, in our case, it just works.


Perhaps you work in a culture that does not have this limitation, but, in the US (a country often criticized for being work-centric), a 10 minute meeting that accomplishes anything meaningful at all is virtually impossible. In fact, the more often people meet, the more banter that crops up.

Anyway, you have something special at your company that I’ve never seen. Cherish it.


“get everyone on the same page”

Meetings are expensive, even if they are for short periods. A simple email should be typically sufficient for information that is common to all team members. I’ve almost never attended a 10 minute meeting with more than 2 people. They are always longer, especially if they are fun.


I'm amazed that web site development and web servers are still so complicated. I expected that to be all drag and drop by now. Most of it isn't very original.

As for programmer employment, we've seen that collapse before. Go read SFGirl from 2001.[1] Check out the Pink Slip Parties.

[1] https://web.archive.org/web/20010302012226/http://www.sfgirl...


If a company is doing something remotely cutting-edge, no amount of low-code tools and low-skilled labour is going to help. Information retrieval, PB-scale data processing, robotics, embedded, graphics and video at any significant scale are all still too cutting-edge and will be for next decade as far as I can see.

You hit performance and architectural limits pretty quickly when doing things naïvely. At this rate, growth in the tech sector is going to create more demand (than created supply) for able programmers for at least a few decades into the future.
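As a toy illustration of "naïve" vs. scalable (my own example, not the commenter's): the quadratic version below is fine for a prototype and hopeless at real scale, and even the linear version assumes everything fits on one machine, which is exactly where the hard engineering begins.

    # Two ways to find duplicate IDs in a dataset (illustrative sketch).
    from typing import Iterable, List, Set

    def dupes_naive(ids: List[str]) -> Set[str]:
        # O(n^2) pairwise comparison: fine for 10k rows in a demo,
        # unusable at billions of rows.
        found: Set[str] = set()
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                if ids[i] == ids[j]:
                    found.add(ids[i])
        return found

    def dupes_linear(ids: Iterable[str]) -> Set[str]:
        # O(n) with a hash set, but it still assumes all IDs fit in one
        # machine's memory. At PB scale you need partitioning and a
        # shuffle across machines, which is where real expertise starts.
        seen: Set[str] = set()
        found: Set[str] = set()
        for x in ids:
            if x in seen:
                found.add(x)
            else:
                seen.add(x)
        return found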


It won't end for anyone who can stay relevant and adapt to the new tech.

It will end for everyone who is stuck at current tech which will be "past" in the future, both for tooling/tech and most importantly mindset.

For example, React Native and Solidity pay well nowadays. I know they will probably be replaced by something else (in terms of demand, not necessarily an immediate technical successor). If I don't start learning whatever-comes-next-in-my-field, I'll be stuck in the past, no matter how good I get at React Native or Solidity.


React Native is dying AFAIK? It's mostly Flutter and ML that are soaring?


Definitely not dying; it's doing great. Flutter might be on the rise, though I don't believe in its long-term success because:

- Opinionated UI, even worse: designed by Google. Maybe it's personal, but I think Google sucks at design, and I'd dislike any design coming from them even in a blind test where I didn't know Google created it.

- It's from Google, which is known for abandoning many projects.

- React is already solid, and the learning curve from React to React Native is very gentle.

- The obvious one: JS vs Dart. I don't know Dart; it might be a better language than JS, but MANY people already know JS and wouldn't be super comfortable learning another, less popular language, whereas RN sits on top of JS(X).


This is such a childish view of the matter. Imagine if a scribe in medieval times wrote an article: "Times are great for people who know how to read and write. How does it end?"


I don't see your point. I'm sure it would have been very interesting back in the 15th century to investigate the impact of growing literacy among the populace and its effect on various power structures: religious, economic, and political. Today we know the answer, so in hindsight it seems obvious and maybe trivial, but I don't think it would have been at all trivial to investigate the matter back then.


>> Higher salaries mean less profit. It’s just how the game works.

This is a false premise. As a counter-argument, it would be easy for a rational CEO to pay a higher salary if the net incremental ROI on that salary is positive.

Then there are second order benefits. A players attract A players, and so on. So you have to also consider the non-financial ROI on the higher salary to attract the best talent.

There is a reason why the top companies are willing to pay higher salaries ...


“If the economic conditions change and investors decide they don’t like tech companies anymore, many of them will inevitably close. Less companies hiring also means less demand.”

This is a cyclical phenomenon, but it is most likely the listed factor with the greatest impact on developers.

As for high developer salaries, scarcity of knowledge of newer languages and frameworks, along with the demand for experienced candidates with those skills, certainly has an impact.


> Much of the demand for programmers right now stems more from expectations of future profits than from profits being currently made.

That's a big assumption to make.


Many very large software companies are making handsome profits. We will probably see changes in software employment if that fact changes.


Programmers enable a lot of bullshit jobs indirectly. And it's easy to write software that serves no purpose whatsoever without anyone noticing.

The most famous business failures were hardware or offline companies like Juicero, WeWork or Theranos. Software companies tend to survive, grow and "produce".

The end of great times for programmers will be the end of bullshit jobs for everyone else, and that crisis will be much bigger.


> Higher salaries mean less profit. It’s just how the game works.

Software is sold at a massive markup. No sane tech company is complaining about profits at the moment. Sure, they'd pay less if they could, but if they bump our salaries they're still making insane profits. At the moment our salaries are just COGS to any CEO, and they've already accepted that COGS is high for software.


I remember reading almost the exact same post around 15 years ago. Back then, the boogeyman was outsourcing to India rather than code bootcamps. I am not saying programmer salaries will never go down, but my feeling is that demand will keep outpacing supply for a while. Software is eating the world, after all.


Are they really, though? I'm nearly ready to call it once and for all a toxic and disruptive industry, after 12+ years in it. Expectations are unfairly high, salaries make absolutely no sense, and 60+ hours a week are expected even though contracts talk about 37-40 hrs/week on average. Meh.


Every company that hires three junior developers today will need one senior developer in a year or two. It's great to be a developer today, even better, if you are a senior dev.


This will probably settle with many new small companies succeeding at software products. The interests align beautifully when the SWEs are the owners of the business.


If it ends, it could end with companies asking for more experience. Everybody in the current group would be okay but new grads would find salaries lower.


>I can’t think of a single product that was able to get to any meaningful degree of success running on top of some low code platform

Microsoft Office/Excel


When the easy money stops. Web/mobile tech is subject to vast over investment in the USA relative to the real economic value it creates.


Programmers will end like most factory workers: there is no reason most of the current programming can't be automated at some point.


Programming is mostly R&D, and I don't hear people trying to automate researchers, but rather enhance what they can do.

I'd argue that if a programming position can be automated, that position wasn't needed in the first place: someone can build better infrastructure that doesn't require it. But to build that infrastructure you need competent programmers, and if you don't have them, a competitor will, and will eventually replace you.


That's why I'm saying "most of current programming", thinking of tasks like doing React or WordPress work.


I've been hearing the OP's topics, more or less the exact same things, in different shapes and forms for over 3 decades. And yet here we are.


With prices going down thanks to the commoditization of offshoring, everyone becomes another cog in the machine.


It ends with ecological and societal collapse. While the party lasts, tech will keep growing. But at some point in the next 20-50 years, tech will mean very little, given we don't have the resources to continue exponential growth.

In the interim, programming is a great way to make money, but at some point it's going to lose its prominence.


How do you figure that tech will mean very little?

What cannot continue is economic growth, for the simple reason that the planet's (and its biosphere's) resources are finite.

What will continue, however, is technological growth, as in technology will grow ever more complex and solve more difficult tasks. Technological development is not inherently tied to economic growth.


Collapse is a complex topic, but essentially if the thesis is true, then we are on a downward spiral in complexity and in population. At some point, high tech will be replaced with low-tech solutions that take less energy and less maintenance.

To me that means that there will be programmers, but far fewer, focused on only vital areas, like climate change adaptation.

I don't think it's an immediate concern, but if I was 22, I'd keep my eyes peeled and try to specialize in some field that would still exist if our main concern was trying to make do with less.

I know that's a bleaker outlook compared to the techno-utopianism that's common. That said, growth is a part of our system, so I think the outlook for programmers is great in the short term. Capital is going to try to maximize value right up until we fall off a cliff.


> At some point, high tech will be replaced with low-tech solutions that take less energy and less maintenance.

Low-Tech does not automatically lead to a reduction in maintenance and energy required.

Example: the electric train systems of today are safer, more energy efficient, and require less maintenance than their steam-driven ancestors.

Another example: today's circuits require orders of magnitude less energy and maintenance than those built with electromechanical relays or vacuum tubes.

Technology declines as a result of major catastrophes, be they socio-economic (like the fall of ancient empires) or natural.


So make your primary job programming, but maybe develop fallbacks in carpentry and subsistence farming.


Yep! This is my plan. I already work remote, so I plan to continue that as long as I can. Eventually, I want to buy land in a remote place that is as climate-change-resistant as possible, like the upper Midwest.

I want to build an off grid house and learn how to grow most of my own food and develop strong ties with the community, as well as learning lots of non-tech skills.


>> The CEO’s job is to make as much profit as possible. Higher salaries mean less profit.

Such a silly, reductive view. If this were the case, Facebook/Meta engineers wouldn't be some of the best paid in the world.


I am a programmer but don't get paid for it yet. Hopefully I can get out there before the crash (if any).


uh, with more greatness?


The super volcano under Yellowstone explodes, triggering end times.


An epic solar flare or some capitalist downfall is more likely than those lame pre-2000-book predictions that servers wouldn't need IT staff as they got easier to manage (that didn't age well!).


Times are great? Why?


As far as I'm concerned, the ride stops when the masses realize how easy the profession actually is. You can take any office worker and train them to entry-level competency in 3-5 months, after which they'll get a job paying at least $75k. They are then on a new career path and will soon make $100k. I find it impossible to convince non-devs that being a developer is easy. They can't fathom why we get paid so much if it's easy. They think I must be lying, or lucky, or too smart. No! It's fucking easy!

As soon as people catch on to this, and start doing bootcamps with the same frequency that they do weird "side hustles", it's over. The current scarcity is mainly because people don't realize they can become developers.


I used to think the same. I too find my job to be easy and sometimes feel like we have this secret in the software industry that our work really isn't all that difficult these days, with so many hard problems having already been solved. I say this as a senior SDE in big tech.

But after watching a close friend who is brilliant—she went to a top-30 college for undergrad and had top SAT scores—go to a coding bootcamp (the same one I went to) and fail to switch careers, I started to think differently. For some reason, I find the work easy and interesting, but many others do not.

I do agree that many more people are capable of becoming developers than choose to do so. But this fact may not be as interesting as it sounds. Many people just don't like coding. Many just don't want to spend their days writing code, even if it pays well.

I don't mean to downplay irrational reasons why people choose not to pursue software. The industry still has a lot of negative stereotypes about what the work is actually like that prevent people from giving it a try, but, more and more, I've noticed that, even among people who know such stereotypes are false, many just don't want to do the job, even if they could.


I just find it hard to believe that the typical office worker likes their job enough to do that, but would find software development at ~double their salary not worth it. People don't want to spend their days in Excel either, but millions do.


At a very high level a typical set of office worker tasks includes performing business processes, modifying processes in response to changing needs, optimizing processes, discussing and planning the future with co-workers, all in the context of the business domain and organizational structure + strategy. It frequently includes some kind of automation in the form of Excel sheets, e-mail / document templates, and the like.

This broad description basically sits on a spectrum with what many programmers do, even more so if RPA catches on in these type of jobs. Doing this stuff today isn't always a cakewalk, and there are plenty of full-time office jobs that already pay $75k sans writing code.

I'm pointing this out because there is a lot of cloudiness in what "the profession" is, what a "dev" is, what an "office worker" is. Someone with the title "software developer" could be working on firmware, cloud systems, or they could be scripting business logic & setting up processes for other workers. Someone with the title "market analyst" could be writing a couple emails a day and dragging files and folders around -- or they could be coding automated reports, constructing data feeds, and collaborating on VBA/JS macros in Excel.


I don't know if I buy that. The better bootcamps now have a screening process -- I don't know if they'll just take any office worker. And the crappy ones just churn out schmucks who can't get hired.

Statistics for boot camps are kept pretty tight, and a lot of the "we can get you a job!" claims only come true if/when the bootcamp hires you to teach after you graduate, thus allowing them to puff their numbers.

Anecdote: I know 3 bootcamp grads, and only 1 of them is working as a coder. She had a BA from a good school in Econ before doing the BC, and is now a senior dev. The other two weren't technical or math-y, though at least one was able to turn the BC into a STEM Recruiter job since he had an HR background. The 3rd, a Sales guy trying to change careers, is, as of today, still not a coder; I think he's doing some sort of IT support, and could have gotten an A+ and Net+ and been as competitive for that gig -- at a fraction of the price of a bootcamp.



