The quality of your average tech worker has completely nosedived in the last 10-15 years.
All these huge companies wanted more products, more marketshare, more money, etc. They needed more people to pull this off. They started lowering hiring standards across the board because there just weren't enough people in tech.
Simultaneously, a huge portion of the world saw tech salaries and wanted in on it so they started taking every quick certification, bootcamp, degree, etc to get into tech.
It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech and don't want to go deeper than the bare minimum required by their job.
So I think tech overhired by a LOT, then they realized all these new people are actually net negatives on the company, and we are slowly correcting.
I think a solid 50% of people in tech are still on the chopping block. You can do much more with tools + really smart people in the year 2024 than you could before.
> The quality of your average tech worker has completely nosedived in the last 10-15 years
I find this opinion hilarious: Almost 30 years ago the second most popular software product was Windows 95 (Doom was the #1), which couldn't run for a few hours without BSODing. Almost 20 years ago the average tech worker was building atrocities with Visual Basic, MS Access and PHP.
Meanwhile today it was announced that Google decided to lay off several compiler engineers working on LLVM, but sure, please tell me more about those low quality bootcamp kiddos that are ruining everything...
Yeah, Windows 95 BSOD'd a lot; however the people developing it were doing it for the very first time, largely with single-digits of megabytes of memory to work with and with a CPU that not only contained bugs[0] but was so slow that every single operation would take milliseconds of wall time[1].
It's not comparable, it's almost to the level that rendering a character on my screen right now consumes more CPU cycles than the entire operating system would have done in a day.
PHP was good, actually, but it's born of its time - it's easy to use 20/20 hindsight to call it a bad design when fundamentally:
A) it solved problems
B) it was working with the best of human knowledge in language design at the time
C) it remains one of the most well optimised web languages to this day, even a variant from the era would easily outperform any django webapp, I'd put money on this.
We stand on the shoulders of giants, good abstractions and lessons from these periods are what make our software so robust today. We made it a lot slower though.
[1]: a 60Hz CPU executes a clock tick once every 16ms. The Pentium 1 was the most popular CPU when Windows 95 released, and it had a clock speed of 60/66MHz; it was the first to be able to do two instructions per clock, meaning it had a best case scenario of 7.5ms assuming the 66MHz option was available.
On [1] aren't you off by several orders of magnitude? 1MHz is 1,000,000 Hz. You're correct for 60Hz, but then you use that value for MHz, which is quite a lot more clocks per second.
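The corrected arithmetic is easy to sketch in a few lines; the 60 Hz and 66 MHz figures are the ones quoted above:

    # Quick arithmetic for the figures discussed above.
    refresh_hz = 60            # a 60 Hz clock ticks once every ~16.7 ms
    pentium_hz = 66_000_000    # a 66 MHz Pentium 1 ticks once every ~15 ns

    print(f"60 Hz tick:  {1 / refresh_hz * 1e3:.1f} ms")   # ~16.7 ms
    print(f"66 MHz tick: {1 / pentium_hz * 1e9:.1f} ns")   # ~15.2 ns
    # Per-cycle time is measured in nanoseconds, not milliseconds,
    # so the 7.5 ms best case above is off by roughly six orders of magnitude.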
30 years ago you could do a lot with 8-16 MB of RAM and no internet connection.
Nowadays... unplug a modern computer from the internet and it's mostly useless.
Want to write a simple document using a basic word processor? You're going to need at least 1 GB of RAM, mate.
Visual Basic allowed writing GUI-enabled applications that would essentially be standalone, shipped as executable files (usually) under a megabyte that would mostly "just work". Nowadays most people write GUI apps in the form of frame-less web browsers displaying HTML and running JavaScript, requiring more than one CPU core and 2-4 GB of RAM to be barely usable.
Yeah I definitely can see how "The quality of your average tech worker has completely nosedived in the last 10-15 years".
As a system engineer... I have worked with some developers who constantly had memory issues ("I need more memory to run my containers, please increase the memory quota in the (Kubernetes cluster) namespace") and for whom reasoning about memory was just a non-existent skill. They had no idea what was using memory in their software. Literally, their ass was mostly saved by the fact that process management nowadays is "good" at restarting software and managing retries on failures. Knowledge of the underlying operating system was also very, very basic.
Such people, while surely filled with good intentions, were mostly useless without some random Medium article instructing how to apply some generic fixes, or without some stackoverflow question covering their exact use case.
But wait, it's way worse than this. There are also people who got promoted to managers, don't have a solid technological background (besides the little coding they did before being promoted), and just can't understand the willingness to optimize and "geek out" on performance issues (or the value of such activities).
Oh, and don't even get me started on how nowadays a "senior developer" is somebody who is good at "delivering value to customers" instead of somebody who knows the ins and outs of the problem domain and has mastery over the technological stack.
Working in tech nowadays is a lot less about tech, it seems.
GP left a fair number of qualifiers in their post which you have conveniently ignored so that you could cherry pick an example of people being laid off from roles perceived to be comparatively hard / prestigious.
This is a bad-faith argument. If you disagree with what they’re clearly trying to convey, and want to make a case against it, I’d be keen to hear one without cherry-picking some compiler engineering layoffs in a company where the vast majority of others are obviously doing something very different to that.
If you feel my argument is in bad faith it would be useful if you could provide evidence for OP's opinions. I personally find the comment quite uninsightful, unfounded and specially disrespectful towards those affected by this new wave of layoffs.
Your comment is basically akin to complaining how rough and slow it would take a person with a machete to trailblaze a path through a dense forest as opposed to how long it takes a person to travel the beautiful paved road that was built on top of that trailblazed path. Somebody had to thrash through all the shit for that superhighway you are traveling on my friend.
I was around back then. We didn't have a magical browser box that you could type a couple of key terms into to get a thousand articles, code samples, and philosophical discussions from hundreds of people smarter than you who had already solved your problem a dozen different ways, which you got to choose from and improve on. You didn't have hundreds of languages, libraries, and frameworks where you could selectively pick the right tool. Back then you had your problem, and you experimented and invented until you solved it, generally with the one or two tools that were available at the time.
And with the benefit of nearly 40 years in tech…I can attest to the OP’s opinion that the quality of the average tech worker has nosedived since then.
Haha, I was talking about this with a colleague the other day. If it were 1987 and we wanted to use a fast Fourier transform, we would pull a book off the shelf like Numerical Recipes in C, hoping it was in there, and implement what we found. Then, after a couple of weeks of making sure that worked, we would realize it's not fast enough, so we would keep tweaking until it did what we wanted. Today you just use any one of the libraries that offer it as a one-liner, like scipy, pytorch, etc. And that's amazing, because it offers us a sophisticated solution that we can use out of the box to design our own sophisticated solutions.
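To make the contrast concrete, this is roughly what the modern one-liner path looks like, as a minimal sketch using scipy (the 50 Hz test signal is just made-up sample data):

    import numpy as np
    from scipy.fft import fft, fftfreq

    # Made-up example signal: a 50 Hz sine sampled at 1 kHz for one second.
    fs = 1000
    t = np.arange(fs) / fs
    signal = np.sin(2 * np.pi * 50 * t)

    # The part that once meant weeks with Numerical Recipes is now one call.
    spectrum = fft(signal)
    freqs = fftfreq(len(signal), d=1 / fs)

    # The magnitude peak lands at 50 Hz, as expected.
    peak = freqs[np.argmax(np.abs(spectrum[: fs // 2]))]
    print(f"dominant frequency: {peak:.0f} Hz")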
Couldn't agree more. 30+ years of experience here. 10% of developers do 90% of the work. 10% of developers are so bad that productivity for the team would improve if you fired them.
The LLVM layoffs notwithstanding, OP is still largely correct in general. Your typical developer could solder, knew how the CPU works, knew roughly how many clocks would retire a MOV, ADD, MUL, etc, was cognizant of wait states, understood cache lines, protection rings, instruction decode, memory mapping, etc.
Try asking a typical Django or React developer today what a cache line even is.
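For anyone curious, the effect is easy to demo without dropping to C; a minimal Python/numpy sketch, summing the same number of doubles with contiguous versus strided access:

    import time
    import numpy as np

    n = 32_000_000
    a = np.random.rand(n)

    contig = a[: n // 16].copy()   # 2M doubles packed back to back
    strided = a[::16]              # 2M doubles, one per 128 bytes of memory

    def bench(x):
        start = time.perf_counter()
        x.sum()
        return time.perf_counter() - start

    bench(contig); bench(strided)  # warm-up pass

    # The contiguous sum uses every byte of each 64-byte cache line it pulls in;
    # the strided sum wastes most of each line (and defeats vectorisation),
    # so it's typically several times slower despite identical arithmetic.
    print(f"contiguous: {bench(contig)*1e3:.1f} ms  strided: {bench(strided)*1e3:.1f} ms")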
Not 10 or 15 years ago. Maybe that level of knowledge was common in the 70s or 80s.
It is not possible either now - there is just too much complexity for people to know all layers of a stack. Look at how much more complex just a CPU is. Transistor counts in CPUs have gone up by five or six orders of magnitude since the 8 bit era. We now have all kinds of complex optimisations like out of order execution.
Our younger IT workers still marvel when I mention that in the 1990s, a very large percentage of sysadmins could not only program, but knew C and regularly wrote ad hoc tools in C (or Perl. Perl? What's Perl?).
These days, if you're a sysadmin who can program, you're considered some sort of wizard.
Senior sysadmins in the late 2000's knew how to code for sure, usually perl and some C (though not to the level you'd want them writing code full time).
More fancy sysadmins who were "modern" would learn Ruby.
The issue with "sysadmin" was that it suffered an inflation treadmill, what used to be a very difficult job where you would need to understand programming, debugging, compilers, OS design, distributed systems and so forth became a title for senior helpdesk staff; who, naturally, did not know those things.
Steve Wozniak, the Steve who built the first Apple computers, knew what every component in them did, and he wrote every line of the ROM. Nowadays it's, "Well, a Stack Overflow comment said if I add this line to the config it will work, and it did, so... I closed the ticket." To be fair, the complexity of computers nowadays makes it unfeasible to learn everything inside out, at least not when you need a gazillion developers (I think only a low, low sub-percent of them know their software and hardware stacks inside out) because everything needs to be software-ized.
All abstractions leak like a sieve. Being a developer at ANY layer of the stack without knowing what's happening underneath you is default gross incompetence and a recipe for eventual disaster.
Are we cool with surgeons not knowing what mitochondria are, or how cells work? All you gotta know is where to put the scalpel, bro. I'm sure it will work out just fine.
I don’t expect my doctor to know the component parts of atoms. Nor do I need them to.
I don’t expect a designer to know how a pixel works.
At some point in the stack we have to draw a line and say this level of detailed knowledge is not generally useful to the majority.
In terms of generic web development today, I would be fine at drawing that line at low level system concerns (memory allocation, cpu cycles, hell even compilation)
I recently had a conversation with a senior React dev, with years of React experience, about Redux. I mentioned during a dry run that you can clear the store, if it's persisted to local storage (the default), with localStorage.clear() from the dev console. Response: "Cool, I didn't know that." This guy is good at creating UIs, but that's an example of 'how this works' knowledge I'd have expected him to have. It's a pervasive epidemic of learning just what you need to ship, instead of learning what you need to ship a quality product and hone the tools in your toolbox.
Lots of people here are completely missing the point - valuing knowledge of assembly in the age of infinite cloud computing resources is like saying a guy who can build a wood hut on his own is more able to get jobs than a modern-day construction worker. Even though they have some random bit of deeper technical knowledge, it doesn't make them any more employable or relevant.
I saw the same influx of salary-seekers during the dot com boom. Generally people that don't have a native curiosity and love of computing just don't pan out well. I'm not gatekeeping or against them trying, would love it if people were more honest about it but honesty isn't part of the hiring culture ("why do you want to work here?" "money" is frowned upon as an answer.)
It would be like if there was a housing boom and residential house framing ramped up base hourly pay to $100/hour with lunches provided and relaxation chambers onsite. You would see a huge influx of people going to it for the money rather than someone who is really into construction, knows load bearing tolerances, code for hurricanes or earthquakes, how to properly frame stairs, etc.
Right. New kids will deliver you 50k LOC React frontend app with additional 1GB of `node_modules`, where you may actually feel good with PHP website sprinkled with jQuery.
Yes, you are right. There are plenty of those in that group too, sadly. But my point still stands - most of the smart, up-to-date, experienced devs who have "seen it all" were probably writing god-awful VB6 and PHP code when they were teenagers.
The "new kids on the block" were the ones I saw that decided that statically typed languages suck and dynamic is the way forward, only to discover how static types help and now they are bolting on static types to their dynamic stack.
I'll also say that I am not trying to generalize the newer generation as bad programmers; I have worked with plenty of very smart younger devs. It's just much harder to come across them now in a sea of mediocre bootcamp devs.
But I agree that the layoffs probably aren't based on programmer quality. When big companies lay off, they generally select products, and if you are unlucky enough to be working on that product...off you go. Merit-based layoffs are more stealthy and continuous, those are definitely being ramped up now, but probably not in the numbers companies need to cut due to overhiring or hiring in the wrong areas.
I will fully discard your argument because Windows 95 was a tech marvel of its time, a huge leap in the consumer OS segment, and it stayed unchallenged until the introduction of Windows 2000 Professional, which was actually more on the workstation side.
BSODs were mostly avoided by using MS-authored drivers.
> It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech and don't want to go deeper than the bare minimum required by their job.
The average day-to-day work is menial, for example investigating a production issue. That isn't something that people get into as a hobby, that's work in a very unromantic sense. Maybe having good systems background can help in those situations, but it's a little more subtle
I do think though that it works the other way: people who actually care about computers don't care if tech is no longer a money field, they just want to work on computer stuff. And maybe earlier it wasn't as much of a money field. But that doesn't imply that the enthusiasts are better at fulfilling work responsibilities. A lot of work stuff is business specific and not very dependent on systems background
So yeah, I don't think the part about over-hiring is necessarily wrong, but this sort of "I'm better than you" attitude isn't very becoming. Maybe you should give this [0] a read, just to see a different perspective
I'd never heard of that novel or these laws, but as an American it would be unthinkable to consider them anything other than Kafkaesque dark satire. The idea that they've moved from that to some sort of positive model for egalitarian society is downright creepy. Specifically, because they all contain the hypocrisy of saying explicitly as a group the very thing they're telling the individual not to say. Taking the side of we is therefore willfully brutal and repugnant to individual dignity. Presented without irony, is this some sort of fascism-light?
Scandinavians know there is some truth to it. But it has more to do with what you say and do. You can DO amazing things but you are not allowed to brag about it. This is better than bragging but not doing which is very common in certain places.
But this mentality has basically vanished from the main cities in the last three decades because marketing is a competitive field, and you need all the tools in the toolbox to not get eaten alive.
Sure, if you like. Although a nit: communism historically doesn't mind celebrating individual heroics or achievements as such, if they're seen as selflessly done in service of the cause. The view of society as a single organism which knows better than you is more central to the definition and prototype of fascism, even if it's also a result of any kind of collective reordering of a society.
This doesn’t mesh with my experience. I work at big tech co that went through massive rounds of layoffs. The criteria was a mixed bag. A lot of it was just entire products and associated teams being dismantled.
Also, one can make the argument the other way as well. Tools and infra have gotten good enough that the work doesn’t require the “dedicated nerds”. So let’s get rid of the highest paid engineers as that’s the most impactful to our bottom line. They coincidentally happen to be the nerds. We can always hire them back for less because what else are these nerds going to do in this market.
Not saying that’s the case but demonstrating that it’s pretty easy to string together silly notions we might have to explain reality.
What the GP said was how the bloat accumulated and why the professional quality went down. What you are saying is how they are trimming the bloat. It's not possible to trim by skill alone, because it would leave all teams and products understaffed. But the bloat is there, and it is insane.
I don't think it is at all. OP put up a strawman about how the tech world is packed with incompetent new arrivals, but corporations are firing people indiscriminately by blindly cutting branches out of their org tree. The corporate criterion for firing people is deciding which products they can either slow down, freeze, or eliminate entirely. This has absolutely nothing to do with the skills of whatever employees are covered by the firing rounds. These are pure business decisions which bear no connection to the tech aspect of their services. This is not bloat. In fact, I know for a fact that one FANG was still hiring midway through a round of layoffs and their HR was adamant about not even considering internal transfers for those positions. Are we expected to believe that employees who passed last year's hiring bar and received positive performance reviews are bloat, but today's new recruits are competent?
This assumes "professional quality" was somehow higher in the past when it's really just looking at the past with rose-tinted glasses. There was plenty of crap and even bloated software written in the past.
> Also, one can make the argument the other way as well. Tools and infra have gotten good enough that the work doesn’t require the “dedicated nerds”
Has it though? In my career I have seen people die on that hill, but only because their entire careers were utterly reliant upon those tools. If you cannot execute without a given set of tools you are in no position of objectivity when it comes to those tools, which results in a lot of catastrophically bad business decisions.
How do you define "quality" of tech worker? What is the profile of people being let go? Are they recent hire or people with quick certification, bootcamp, etc? A quick review of people on LinkedIn with Open to Work badges seems to paint a different picture than any claim of people being let go are of lower quality.
> It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech
Now, having experienced almost three generations coming to adulthood, I hard disagree with any such notion of newer generation not being dedicated and smarter. I have a lot more respect for the newer generation: for what they have been served and how they are handling it. The changes in the last 10 or 20 years have far exceeded any changes I experienced in my first 0-40 years of life. And, imo, the newer generation has been adapting much better to such changes than previous generations, including my own. This adaptation might come across as not being "dedicated nerds" to the previous generation, but with the rapid change taking place, the ability to adapt quickly trumps being a "dedicated nerd."
> A quick review of people on LinkedIn with Open to Work badges seems to paint a different picture than any claim of people being let go are of lower quality.
Come on, LinkedIn (CVs) tells you nothing. Having been somewhere is often dressed up as great experience even if it did not yield much, or the person wasn't as involved as claimed, and sometimes even having been there at all is a lie.
It is usually the opposite if I check colleagues I know: null performers and phonies have outstanding profiles, while the good or super programmer sometimes has an awkward or does-not-care profile.
(But yeah, also admitting: Companies usually fire by the wrong metrics and even direct managers, which are bad managers, do not know the real differences of their underlings..)
> notion of newer generation not being dedicated and smarter
Fully agree, I don't think it is much about new generation vs old; there are the same kinds of great people.
But computer science really changed and grew. E.g. I strongly remember, when I joined university 20 or so years ago, how all the tutors remarked how big our year was, and that before us everybody knew each other (profs/tutors/students), but now that was impossible. Still, 80-90% of the students that survived the first semesters just belonged there and would become great coders or computer scientists, with the right skills and ambitions. If you look nowadays, it is different and maybe more similar to other professions: more like 50/50 between those people and the other half, which is ambitionless and/or just lacks the skills, mindset, whatever, and would have been better off becoming something else. If you ask them, some even freely admit they hate their job and are only drawn by the money.
> It is usually the opposite if I check colleagues I know: null performers and phonies have outstanding profiles, while the good or super programmer sometimes has an awkward or does-not-care profile.
I don't think you're interpreting these things right, and I'll tell you why.
I know superb software engineers who don't update their online presence and didn't even bother to update their LinkedIn profile to change the profile photo they uploaded over a decade ago. I also know newly graduated software engineers who have a well-polished CV complete with a profile photo taken by a pro photographer.
The key difference in both cases is that the newly graduated software engineer had to go through recruitment processes very recently, while the superb veteran software engineer climbed the corporate ladder and eventually landed a CTO role in the very same company he joined over a decade ago, and during this time he had zero need to even look at job boards.
Most people don't update their CV if they aren't applying for jobs, let alone update profiles in job boards if they are not applying to jobs ads.
Failing to update your profile in a job board does not correlate with expertise. It correlates with the need to seek a new job. Whether you're a lazy incompetent fool or a software engineering wizard, you won't update your LinkedIn profile unless you feel you need to switch jobs.
I didn't interpret, I just said it as it is for people who I definitely know how good they are; yours is exactly one explanation for why it is that way for fresh vs. settled people, and one I wouldn't (and didn't?) exclude - the list of possible details is endless, and I didn't bring them all up. I just wanted to say: evaluating the experience and skills of people by LinkedIn profiles is pointless - you cannot compare objectively there.
> I didn't interpret, I just said it as it is for people who I definitely know how good they are;
You explicitly claimed that from your observation "null performers and phonies have outstanding profiles, while the good or super programmer sometimes has an awkward or does-not-care profile."
The fact of the matter is that, contrary to your claim, the amount of care you invest in things like CVs and job board profiles is not correlated with competence. It's instead correlated with the need for a career change.
Those who are not looking for a job are far less likely to spend their time curating a profile or online presence they gain nothing from. If instead you find yourself looking for a job, whether you're unemployed, about to be fired, or just looking for a better role, you're going to work on marketing yourself to prospective employers. This means updating your CV, updating a job board or two, and perhaps even writing a cover letter.
If you're not thinking of switching jobs, you don't waste time on the likes of LinkedIn.
It's like a healthy diet: most people only start to become mindful of its importance when they experience any form of health scare.
Yeah, that's something I've noticed. I've worked with a few people who were brilliant and really good at something, but that something wasn't writing software. Even the people who are frustratingly bad at their jobs have rarely made me think they're idiots. They're just not very good at the specific thing they're being paid to do.
The set of people who pass the hiring bar at companies like Google does not follow the same distribution as the average person you come across in the grocery store.
Companies like Google are not known for hiring morons.
It is a self-aggrandizing dumb comparison, and should be dismissed as such. It reads as a desperate attempt to rationalize away these layoffs as something that only applies to others.
The quote from George Carlin, a comedian, is about the general population. The quote itself is from "Doin' It Again / Parental Advisory: Explicit Lyrics" a special in 1990 and predates Google.
I always point that out too. And someone always points out that for symmetric distributions the average and the median are the same. That still doesn't make it right. What's worse is that it's a quote about intelligence, so you would think it would use the correct word. Maybe that's the joke: it's dumbed down and smug in the knowledge that everyone who hears it thinks "thank god I'm above average!"
Explains why tools like Copilot seem to be so hyped, yet I as an old-school developer really can't find a place where they would help me.
I guess most people really benefit from a tool to autogenerate an Eratosthenes sieve in Python or a fizzbuzz function in Prolog or yet another React Router boilerplate.
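For reference, the sieve being poked at here is about this much boilerplate; a minimal sketch:

    def eratosthenes_sieve(limit: int) -> list[int]:
        """Return all primes up to and including limit."""
        if limit < 2:
            return []
        is_prime = [True] * (limit + 1)
        is_prime[0] = is_prime[1] = False
        for n in range(2, int(limit ** 0.5) + 1):
            if is_prime[n]:
                # Cross off every multiple of n, starting at n*n.
                for multiple in range(n * n, limit + 1, n):
                    is_prime[multiple] = False
        return [n for n, prime in enumerate(is_prime) if prime]

    print(eratosthenes_sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]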
I do really benefit from finally having a way to use shell scripting. Life is too short to remember hundreds of one letter options with obscure combinations
In my experience, this mostly is because the “hiring bar” at big companies is frequently disassociated from actual day-to-day work. It typically looks like
1. Memorizing some 50-200 domain-specific general patterns that problem solutions tend to follow, so you can solve an arbitrary leetcode medium in under 20 minutes - a skill that has almost nothing to do with the plumbing and product work most tech workers engage in on a day-to-day basis, and
2. Speaking at length and extemporaneously on arbitrary from-scratch system design problems - a skill which, while sometimes important, most engineers who aren't working at a startup won't exercise more than once or twice a year, as the ratio of work on "systems already designed" to "systems needing new designs" overwhelmingly skews towards pre-existence.
When your hiring process and day job are so thoroughly dissociated and unrelated to each other, it’s not a surprise current workers who aren’t actively studying for a job change couldn’t pass it.
I don't think this is a significant factor in the size of the layoffs. Many high-level and high-tenure employees are being laid off, and this is apparent in many of the various companies that have had significant layoffs in the last couple years. The lay-offs have also heavily affected departments outside of engineering: product, marketing, finance, etc. have all been affected.
I don't disagree that companies greatly over-hired, but I don't think the largest portion there has been bootcamp/cert/junior SWEs.
I was just today reminiscing about how smart and capable everyone seemed at the startup I interned at in 2013, and comparing them to the quality of engineers I’ve worked with in the startup scene more recently. The difference is stark. I can’t pin down one reason but I agree with you. Too many bootcamps. Too much resume-driven development. Too many engineering managers whose only programming language is JQL.
Another factor here is that for many reasons startup equity (and thus startup comp in general) is much less valuable, statistically, than it was 10-15 years ago. Everyone knows it's a shit deal now, so many of the best engineers, those who can, are in big tech now, leaving startups with a lower-quality talent pool, or those who just need a first job.
I say that with sadness and fond memories of ~2013 startup culture.
I started my career in startups; they ask for 150% and want to pay 75%. It's only fresh graduates and bootcamp people who fall for the equity scam nowadays. Everybody who has been around the block knows they are going to dilute the hell out of it before any liquidity event. It's really a shame, because I am far more productive in that environment; I just wish I was able to eat more of what I kill there.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
Not my experience. I've conducted 300+ interviews at FAANG in the last 15 years and haven't seen a meaningful drop in the hiring bar.
Based on anecdotes, I have another theory. The people that build the systems got old and are unwilling to work the crazy hours. The new folks are hired to work on maintaining these systems, and it's not fun. So overall both quality and speed declined.
Not arguing over original point of quality of tech workers but...
> I've conducted 300+ interviews at FAANG in the last 15 years and haven't seen a meaningful drop in the hiring bar.
Are FAANG interviews these days an indicator of anything besides being good at FAANG interviews? Some of the best people I worked with that brought real value would never pass big tech leetcode interview...
> Some of the best people I worked with that brought real value would never pass big tech leetcode interview...
FAANGs don't do leetcode. They may ask about data structures knowledge, but even their coding interviews are aimed at evaluating things like system design, writing readable code, being coachable, and even soft skills.
I interviewed at Microsoft and I got multiple questions lifted straight from leetcode word for word. Some easy and some hard DP questions.
Google and Netflix had hard questions as well, but they were all novel questions not on leetcode/hackerrank. However, CS algorithms 101 were fundamental to both.
It's much easier and faster to build greenfield systems that have no historical baggage than it is to retrofit older systems that already support large customer bases and revenues for newer features.
Not just pure tech systems, but businesses and non-tech systems as well. And tech companies have gotten large with a ton of baggage and customers
>The quality of your average tech worker has completely nosedived in the last 10-15 years.
Initially this sounded right, but then I remembered all the trash that showed up during the dot-com boom and subsequent booms. Hiring booms attract a lot of people who want to make easy money, and those people think programming is easy money (it's not). Bootcamps and even CS degrees are just step one of many, and many people think step one is the only step and don't work to further their craft. Once it becomes evident that an org has way too many of these people, it starts to cull. Unfortunately, large orgs are very bad at "rightsizing," and just "mow the lawn."
The problem with tech is to be good at it, you have to be passionate about it to the point of near obsession. That's a requirement above and beyond just being smart or good at math, or even good at programming. You can fill a room with brilliant but dispassionate people and you won't get much done.
So you're right that it's a 15-year cycle, but this cycle has happened before, and we're now in the culling part.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
It's definitely got worse over the last ~10 years, but the pandemic-era hiring of 2020-2021 took it to a new level of bad at many companies. Anyone who had done some basic coding was getting hired at a great salary in 2021, and these individuals are still around today - some now calling themselves "senior developers".
But I completely agree with what you're saying here about it no longer being nerds working in tech. A nerd who isn't the best at coding will still engage with me about technical details, be keen to learn new things, and care about their work. A lot of the new guys I work with today are just doing the bare minimum, and they think you're an idiot for caring.
I have evidence to support this - I lost people from my team who were offered "senior" or "lead" developer roles when they were nowhere near the level required for that kind of job. The level they were at was already a stretch that they were growing into.
...and with a salary boost to match.
Now in the UK salaries have flatlined and the number of jobs is way down, and this no longer seems to be a problem.
"I see no hope for the future of our people if they are dependent on the frivolous youth of today, for certainly all youth are reckless beyond words. When I was a boy, we were taught to be discrete and respectful of elders, but the present youth are exceedingly wise and impatient of restraint." (c) Hesiod
I agree with the opinion, but not the timeframe. I would say the quality has nosedived over the last 8 years, based on my experience as the tech interviewer for potential new hire interviews across the last 12 years on different projects.
The last 4 years I've gotten comments from co-interviewers of 'you ask the hard questions', 'you are tough'. These are in response to questions such as 'What are your thoughts on composition vs inheritance in object oriented design', 'What are the arguments for Java as a better language than Javascript? How about the other way around?', 'What's a progressive web app, what aspects of React and what react-related libs help with creating one?'. These were all directly relevant to the positions being interviewed, and some of those were for senior engineer positions.
These are questions that I think should be answerable by any senior frontend engineer, and most senior software engineers.
More charitably: Many people who 10-15 years ago would have excelled in a non-tech field have since then been drawn into tech due to shrinkage in those other fields and to the attractively high compensation in tech.
Having survived many layoffs, I can say this is 100% not the case. Modern layoffs don't even seem to consider whether or not the person was a top contributor or SME. They are done to reduce cost and look good to the board/investors/whoever.
I'm pretty sure I survive because my TCO is probably more on the humble side, but I'm far from cheap.
Even if we assume that what you think is true, it is still solely the companies' fault. Because if the prospective employees are so shit, why would you hire them in the first place?
I feel like assigning blame is neither here nor there. Leaders at these companies often look at what other companies are doing and follow suit. There's an argument that Elon laying off 80% of Twitter was another catalyst - so why would other companies not follow?
Ultimately, if hiring managers offered these roles because work was needed but no longer is, would you blame them for adjusting their approach?
I'm not looking to assign blame or anything, I've just seen this pattern play out in the last couple of decades. I think it's a valid theory of the current "market correction."
Because time-to-market matters? Because even minimally productive folks were ROI-positive in ZIRP world, but a bad idea in 8% interest world? Because Wall Street rewards growth?
The proliferation of LeetCode for interviews means that we have an army of employed software engineers who are great at solving toy puzzles, but when it comes to actual business and / or customer problems, they stumble and falter. Management only have themselves to blame for spreading this kind of culture of interviewing around.
And yet, the engineering interviews still typically revolve around Leetcode algorithms instead of the messy but hardly CS-intensive day to day work that engineering typically involves.
The more experience you rack up in your career, the more likely you are to walk away from jobs with laborious hiring processes too.
I will politely decline any opportunity that requires me to do multiple code and architecture tests and several other lengthy meetings, usually spread out over the course of months, because it’s not worth my time to jump through so many hoops. It signals to me that there is already a tendency to implement process for the sake of process.
What will really sell me on a company is a more in depth conversation with the leadership and the team where we learn more about what we value and how we like to operate. And that’s how I would interview anyone who was referred by someone I trust.
Very true, because an experienced engineer has internalised that the hard part of this field is not writing code. A company that focuses entirely on that in their hiring process means they're either misguided, or looking for inexperienced "coding monkeys."
In a productive day as a solo dev, hitting keys on my keyboard is the easiest part, the hardest part is thinking about a problem in front of my notepad, usually staring out my living room window or while taking a shower.
That's all well and good if you're willing to leave a lot of money on the table. All the companies that hand out half-a-million to a million-dollars/yr total compensation packages operate using similar hiring processes.
That's a lot of money over the span of even a few years to miss out on.
I'm in the UK so that kind of compensation package is completely unheard of. In London it's still feasible to pull in 100k+ without running the gauntlet.
That's (very charitably) because those are weed-out questions for early-stage engineers.
Lots of companies feel they need weed-out questions because they're swamped with applicants. How does applicant approach a problem? Does applicant describe their thought process figuring it out? Can applicant figure it out?
Did that tell anyone how good the applicant is at the job? No. It just provided some gates for the interviewer to say "no hire."
It's hard to differentiate the applicants' skills so early in their careers - not enough work history to go on. I have no idea why these would be used for senior positions except that "senior" is this year's "junior."
I was recently looking around at jobs in a different country and at least one place had a FAANG style interview process. I currently have 10 years experience in software engineering so I wasn't too keen to do it but I am considering doing it because they offer higher salaries than other companies.
I'm scared by the skill level of some of my coworkers. They are 30+ years old, and I hear them debating about whether or not Docker needs VT-d activated, whether .msi packages can be customized, whether Terraform is useful for deploying on a single ESXi host... All things that can be found out with a second of googling, yet they will talk about them for 5 to 15 minutes in meetings (formal or not), and the conclusion is always "we will have to test this".
There’s a decent chance they want to chit-chat a bit. I’ll occasionally strike up a conversation I know I could avoid by googling, just to have a conversation.
The industry hired far more people than the small percentage of the population that was previously drawn to tech.
The jobs needed to change to accommodate all the new, less-tech-interested people. They became more standardized and less thought-involving. More and more jobs are now about using frameworks and writing glue code instead of solving novel problems.
Even the new techier-than-average people are likely in less techy jobs. And jobs less oriented to making them grow into their best geek selves.
Because the jobs became oriented to hiring neophytes to do the bulk of the work, we also see title compression and a low cap to expected salaries. If someone seems too senior and starts to cost too much, this model allows replacing them with new tabula rasa (bootcamp) workers.
This exact complaint was common in 2001. Which hey maybe it’s true! Maybe the nerds from 2 generations ago could have built all this technology 10x over… except they didn’t.
I, for one, had the same complaint then. The dotcom boom pulled in lots of people not interested in tech, just in paychecks.
> Maybe the nerds from 2 generations ago could have built all this technology 10x over… except they didn’t
Fantasy math aside, the nerds don't really decide what tech to build. There was no mass of nerds 25 years ago arguing over NNTP about who could build the best ad platform for the newly-emerging World Wide Web. That's a suit thing.
There’s also just no process anymore. Product managers can’t even be bothered to fill out more than a title line in a ticket. Burndown charts? Ha who cares about points on the tickets.
I don't think the average tech worker is dumb. They may even be better than the average from the 2000s/2010s tech boom, maybe not, but I'm at least sure that those at Google, especially, could have gotten and stayed hired in earlier times.
What I suspect is that tech companies have many more workers than needed, because most software simply doesn’t need many people to build and maintain. I remember massive tech companies running on teams of <100 people, and even today, I hear of many critical departments with only around a dozen or less.
Google search, Github, AWS, Outlook, Facebook, Uber, IntelliJ. These are software behemoths, but at their core they’re rather simple, and a single developer could make an MVP of any in a few weeks, and with adequate testers and resources, something scalable and stable that could be used by professionals in a few months. And that’s creating it from scratch (well, existing libraries), though maintaining everything maybe doesn’t require much less, it surely doesn’t require more. I understand the big products have more, at least name recognition, and I understand that a big product needs more than programmers (asset creation, marketing, QA, tech support, accounting, etc.) But the software development side isn’t very big, especially if you already have the software.
Hey, ChatGPT, write me a script that mimics Dropbox. No no, this library does not have this function. No, this is also useless. What are you suggesting? Aaaah, get me out of here.
Around 2015, I read an article saying that computer science was the most popular major for Harvard undergraduates. I found it quite surprising. I would think that a Harvard student could basically do anything with their life that they want: medicine, law, business, politics, anything they set their minds to.
Why, I wondered, would they want to enter this strange industry with all our odd characters and personalities? To me it's like, this is a place where you're going to compete to fix or improve something in the Linux kernel in order for Linus to swear at you and publicly humiliate you in an immutable ledger called the LKML. (Welcome to tech!) Even now, I don't really understand how this could have become the most popular major at Harvard.
I think things will sort themselves out in time. A person might start a career with little inherent interest but stumble into something that excites them and excel at it. Or a person might work in technology for a while, decide it's not right for them, and move on to something else. We'll have to see where it goes.
And this is on a backdrop of a long term 'hollowing out of the middle' as technology automates rote work (AI being the latest incarnation), leaving only high skilled jobs (including in tech) on one end, and manual jobs (driving, picking, construction) that we've been unable to automate on the other. Thus both ends (high and low skill) are seeing 'refugees' from this economic process, with overskilled people doing menial work, and underskilled people trying to tackle high end work.
This doesn’t really make sense - if it were true a single company could pay many times what faang does to secure talent 10x+ better than a normal tech worker and have unfettered success. It also means a 10x dev years ago would be.. 100x now?
It also implies some weird logic that if you were into tech before, you’re better than someone who was born 10 years later and is into tech now. And implications that if you’re not devoted to your employers industry your work output is meaningfully worse, which maybe is true but maybe not.
Apologies, I'm not sure I understand what you're saying so would love to hear more of your thoughts.
> It also implies some weird logic that if you were into tech before, you’re better than someone who was born 10 years later and is into tech now.
This was not what I was trying to say. The amount of people who work in tech is far, far higher now than it was 20 years ago. It just brought more "average" people in who don't really push the boundaries of what is possible. Lots of apps that move strings in and out of databases, very little real innovation going on from smart people.
Your first point assumes that it is easy to identify top tier talent during the hiring process and that tech employment is not a market for lemons [1].
My go to example of this: jQuery. Now for JS it’s primarily React, but the nosedive, or rather nosebleed, started with jQuery. You no longer had to understand the technology or data structures because there were declarative APIs that made things easy. The products were crippled and slow but a new wave of people could suddenly participate.
Back in my day you needed a PhD in CSS to center a div vertically. We used to pore over the freshest W3C drafts with excitement. You don't see that anymore since jQuery.
It's not that. I used to work on web apps about 20 years ago, and we created some involved interfaces using hand-coded cross-browser AJAX and DOM (this was before jQuery even existed; I remember when prototype.js came out and it was very exciting). They were really fast. It took a fair amount of skill and perseverance to do that, and that filtered out a lot of the developers.
These days, the people who could do the above can instead do it better (and in much less time) with jQuery. However, the people who previously couldn't do it at all in any reasonable amount of time, can now also do it with jQuery... poorly.
There's a middle (the people who would struggle with the low level but are proficient enough to do a good job with jQuery) but they are, in my impression... not a majority.
I believe that's why most sites (especially non-tech e.g. Chase bank and Amex) and many Electron apps (Teams, Evernote, ...) are such bloated, slow, glitchy monstrosities. Well that, combined with what the top comment said about the money
Yes, that is silly until the employer becomes over leveraged or the resulting product becomes too expensive. Then hard choices must be made. At that point it’s not silly anymore.
Having written JS full time for 15 years, I can say most JS developers are deeply uncomfortable navigating complex data structures of any kind. There is a deep emotional reliance on declarative APIs to do that for them. This is a skill super foundational to programming in any language, learned from experience, but in modern JS the reliance on tools fails to foster those very necessary skills.
This is a challenge to describe to software developers who have no experience writing original software, such as without a framework, because they have no frame of reference to consider alternative points of view. People might find it challenging to explain modern astronomy to tribal people living in a jungle, for example, because the necessary intermediate education is absent.
"I think a solid 50% of people in tech are still on the chopping block. You can do much more with tools + really smart people in the year 2024 than you could before."
Absolutely not. So much more red tape now requires a ton more headcount. Just think how much time we spend these days fixing dependency hells and patching security issues.
Maybe it's just beneficial to move web developers into a different category? So you have the web developers, who mostly chase new or cool frameworks/tools and prefer dealing with human problems over machine problems, and then the other developers.
I mostly agree with your point and I have a corollary around this part:
> Simultaneously, a huge portion of the world saw tech salaries and wanted in on it so they started taking every quick certification, bootcamp, degree, etc to get into tech.
In 2007-2017 the concept of "startup company" was also heavily romanticized, to the point it was almost toxic. I was in university in 2013 and pretty much anyone wanted to build a startup. More often than not it was non-tech people exploiting tech people into building MVPs. It was painful to see (and in a few cases, to experience).
Don't even get me started on "apps". Every moron on the block had some kind of "idea for an app" and was completely clueless to how the app markets worked (let alone how to actually do that).
Of course, being pretty much all amateurish at best, no single business plan was in sight.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
The average tech job requires much less skill, though. It's not like we're all working on rockets or the building blocks of the internet. A good 70% of tech jobs today are CRUD APIs and basic web/mobile apps.
I've heard the corporate travel platform I work on described as a "simple CRUD app", so I'm not sure I believe this is true about almost any software company.
Interesting idea. But supposing that the quality never nosedived, I suspect companies would still be doing these mass layoffs. Quality is always relative to the overall pool of job applicants, and companies would just layoff people with a higher bar in mind.
This is my bet as well. They're not firing because they have to. They're firing because the bean counters showed them how much more money they could have if they did.
Coast for a year or two, save millions, then rehire back when they need to innovate or update.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
I don't think your take holds water. It reads as a mix between self-aggrandizing and ladder-pulling, typically pinned on the so-called boomer mindset.
For the past 15 years you have seen universities producing graduates that are far better prepared for a cloud world in every single aspect of the business, and these graduates start off working in cloud-related projects. All frameworks that dominate front-end and back-end components were created in the past decade, and leverage the same cloud-based competencies that new graduates learn.
If anything, new graduates are head and shoulders above veterans, and the only thing they are missing is two decades of work experience. That can be an asset or a liability.
I would also add that the bulk of >15-year veterans graduated in the 90s and 80s, which was a time when the academic world was still trying to figure out what it was supposed to teach in terms of software engineering, and most universities were basically scrambling to cover relevant topics. The graduates that went through those programs, unlike today's graduates, were woefully unprepared for the reality of software engineering. Two decades ago you could land a job by passing yourself off as a "programmer", which meant you knew the syntax of a programming language such as Pascal or Cobol. That's the technical background of your average veteran with >15 years of experience.
Therefore, I think you are entirely wrong. If anything, the quality of your average tech worker has improved greatly in the last 15 years. Those who arrived in the field hit the ground running in today's tech world and often push past and replace much more senior team members. Those who enter the field from non-tech fields have the technical chops to replace both veterans and new graduates. In today's world you need to do more than claim to know C or Java to actually land a job.
Disclaimer: I'm a veteran with >15years of experience who worked at and was involved in the hiring process of a FANG.
I think this is only half the answer (or at least part of it). This not only ends up attracting the make-a-quick-buck-with-a-certification types but also the political sociopaths who know how to optimize for time on the ladder over quality. This has not only lowered the bar but also made these places very toxic to work in! Passive aggressiveness, anybody?
You are a company. The system we have demands growth. Even very stable and reliable profits are seen as failure. There must be growth.
The people who run a company can not press a magic button to increase revenues. They can't just pull a successful new project out of nowhere. Anything like that is going to be a risk, and will probably fail. It will also take time.
The one thing they can always do is cut costs. Projects can be cancelled. Divisions can be sold off. The biggest cost at most companies is labor, and labor can be let go.
When someone controls a company, they own a lot of shares in that company. Their bosses are all shareholders who only care that the stock price goes up. Nothing the company is doing is generating huge new revenue streams. Time for layoffs.
And when some people do layoffs, everyone does them. They're all subject to the same market pressures in the same industry. One company doing them gives all the other companies in the sector permission to do likewise. If a company doesn't follow suit the market might even start to question why.
You may have seen some news that Microsoft passed Apple briefly in terms of most valuable company on Earth. You may have also noticed that Apple is much more restrained in its layoffs than the others. Not doing as many layoffs, not doing as well in the market. These things are not unrelated.
This doesn't seem to me to explain why the layoffs are happening now. AFAIK Google has never done a true layoff until last January in the entire history of the company. Now it seems to be already routine.
As someone who grew up in the 80s-90s, my historical sense of layoffs is that they were a response to economic hard times, whether industry-wide or company-specific. Companies laid off when profits fell or disappeared, which then drove a need to search for ways to cut costs.
But the major tech companies are all seeing continuously growing revenues and profits within the context of very healthy general economic numbers, yet they are all laying people off.
The simple answer would be AI. That's been the major change in the past year. I find it odd that everyone thinks Uber's end game is automated cars, but no one seems to be talking about how tech companies' end game is automated coders. LLMs are not even close, but who's to say they don't have something internally that works? Or they have enough data to show they need fewer workers since they can leverage AI now.
I work at a FAANG. I can't speak for all companies or divisions, but I can pretty comfortably say that I have seen next to no evidence that AI tooling has become so productive that we "need" fewer engineers.
Actually it's somehow kind of the inverse, any AI related code has been subpar which has been putting a lot of stress on the core systems in terms of reviews and performance.
From my personal perspective, things overall seemed more productive pre ChatGPT for FAANG level companies.
What's interesting is I think there might be a disconnect between AI capabilities and tech executives' beliefs. I'm guessing executives believe AI will catch up enough to take engineers' productivity to, say, 120% within the next few years - hence the 20% layoffs everywhere.
Maybe this will be the case. But for me on the ground its producing a lot more work. I now have to code review everything starting at a very high level working my way down and there's just so much MORE code now.
> Actually it's somehow kind of the inverse, any AI related code has been subpar which has been putting a lot of stress on the core systems in terms of reviews and performance.
My experience is that AI tooling just means engineers can do more, which means product managers want more… resulting in more work with the same engineering team…
We already had a revolution of automating coders, when Fortran first appeared. And a revolution of automating accountants, when a spreadsheet first appeared.
If anything, these revolutions increased the demand for programmers and specialists in finance. A person wielding the right automation tool produces much more revenue while demanding nearly the same salary; it makes sense to hire more such people, as long as there is a market for the product or service your company provides.
The layoffs are a signal of this not happening. The AI is not helping enough to increase production and revenue per worker. Companies are out of new and efficacious business ideas. The companies don't know where to apply the intellectual / productive capability they have, so they are cutting it down, to save on its (substantial) upkeep.
A counterpoint would be: if you kept the same number of engineers and increased their productivity with AI, rather than keeping output the same and reducing the number of engineers, wouldn't you be producing more value overall?
"You are a company. The system we have demands growth. Even very stable and reliable profits are seen as failure. There must be growth."
But take a software company like McNeel. They make a popular but extremely niche market product called https://www.rhino3d.com. They've been around for decades. They have extremely low turnover. Many developers have worked there since the beginning.
They are used around the globe by architects, product designers, industrial designers, etc.
What are they doing right? Why/how do so many others get it wrong? Are humble software companies like this unicorns? Why?
> TLM, Inc. dba Robert McNeel & Associates is a closely held employee-owned Washington corporation funded solely from retained earnings.
So they're a private, employee owned company with a strong niche product that doesn't depend on outside money. They took the slow road, something that is antithetical to VCs but perfectly acceptable for bootstrapped founders that want to live comfortably but don't care about striking it rich.
McNeel is not a public company. Public companies are driven towards prioritizing revenue and growth much more than private companies because growth is a shared goal of all shareholders. I suppose it could be said that companies with outside PE or VC funding would also prioritize growth, but that would be in the same vein of getting to an exit or IPO rather than just making the stock go up.
idk what their profit structure or founding story is, but the poison pill for every company is investment capital. The second you take VC money you are beholden to someone other than your coworkers and clients. If you can get off the ground and become sustainable on sales alone, you are golden. Stability is possible but the problem is that most VCs demand 10x return, not 1.25x return.
Rhino is an amazing product that I use every day. The developers are active and responsive on their forums and I've had feature requests and bug fixes go from forum posts to shipping in a couple weeks. You can also pick up a telephone and immediately talk to a knowledgeable tech support person. It's amazing and I wish more companies operated like McNeel.
Edit: They also do that thing that everyone says is impossible and sell perpetual licenses. Every 18 months or so they offer paid upgrades to the next version and I always buy it because the price is reasonable and they're jam packed with useful new features.
> Rhinoceros (typically abbreviated Rhino or Rhino3D) is a commercial
> 3D computer graphics and computer-aided design (CAD) application
> software that was developed by TLM, Inc, dba Robert McNeel &
> Associates, an American, privately held, and employee-owned company
> that was founded in 1978.
So not a public company subject to the fickle stock market's expectations of constant growth...
> You may have also noticed that Apple is much more restrained in its layoffs than the others. Not doing as many layoffs, not doing as well in the market. These things are not unrelated.
There isn’t anything in life that’s “stable”. Not one single thing. Even rocks on the ground erode.
Because of inflation, stable is actually shrinking. If you raise prices perfectly in lock step with inflation, trends and tastes of your customers still change, necessitating innovation if only to maintain the exact same level.
In reality, you have to grow to ensure when the ground shifts from under you, there is still some buffer. Growth is insurance against an ever changing, unpredictable world.
The people who think things never change are people without imagination. They want safety and security, but that is an illusion.
> And when some people do layoffs, everyone does them. They're all subject to the same market pressures in the same industry. One company doing them gives all the other companies in the sector permission to do likewise. If a company doesn't follow suit the market might even start to question why.
I'd like to try and simplify this - it's simply the new "normal" and companies can get away with it (meaning the labor side does not punish it yet - no unionising etc.).
In the EU layoffs like this are not as easy to pull off by companies.
One reason I think not given enough weight is that it is now acceptable to do layoffs. These huge companies accrue a ton of dead weight and dead-end projects that they would love to flush routinely, but 5 years ago their stock prices would halve if they suddenly laid off 10% of their work force because of bad optics. Now it has become acceptable for all the various reasons so any company that wants to clean house is going to clean house.
Yep, executives just copy each other all the time. They start thinking about layoffs when other companies in the industry are doing it.
My company is discussing having people show up three days a week in the office. The only justification mentioned was that other companies are doing it. I am quite doubtful they did any sort of analysis to ensure it's a good idea.
Musk's bet, that every executive watched like a hawk, was to fire a whole bunch of "spoiled expensive developers" who would flap their wings about the sky falling (their perspective).
If Musk simply wanted to cut costs and get Twitter to the core of its profitability because the company had pretty much peaked, then it was a valid business plan in the Gordon Gekko realm.
But Elon talked out of all parts of his mouth nonstop, said he wanted vision and features but fired and hired for maintenance. It's weird that all the other companies took that as direction for their layoffs.
Executives were appalled at their lack of power in COVID too. I think the layoffs are an attempt to gain authority, I don't even think it is about the dollars and cents. Elites don't actually care about how rich they are after a certain point, what they care about is the gap between them and the "plebes", and the developer plebes were far too uppity in their view.
A lawyer commenting on the game industry (which is largely copying the Silicon Valley business model) has an excellent explanation of these cycles, and why it is that the industry is simultaneously reporting record profits and mass layoffs: https://www.youtube.com/watch?v=-653Z1val8s
tl;dw (in my own words): investors like it when you have extreme hiring during the good times, and they like it when you have extreme cutbacks during the bad times. Investors don’t like slow and steady growth.
Interestingly, Nintendo quite firmly rejects the Silicon Valley approach, sticking firmly with the slow & steady - and while the rest of the games industry was doing layoffs, Nintendo gave all their employees 10% raises.
I don't follow. Aren't employee salaries a deductible expense? The above makes me think more along the lines of no longer being able to deduct GPU costs incurred from deep learning research.
IIUC, whereas previously companies could deduct salaries in year Y from revenue in that same year Y, now section 174 allows deduction of only 1/5 of those salaries in each of the 5 years [Y,Y+1,Y+2,Y+3,Y+4].
After 5 years of this law being in effect, will the numbers balance back out to pre section 174? That is, does that deduction from year Y carry over into year Y+1, Y+2, etc.?
I mean, this probably still matters a lot for startups given their shorter lifetimes, but it seems any large company (I'm thinking of, e.g., Apple, which has plenty of cash on hand) that's been around for a while could just wait it out? I am not familiar with corporate tax law and how deals are structured, but could you also defer revenue in the same way to offset (e.g. customers with a 5-year contract paying progressively more but keeping the same total $ amount to sync with your deductions)?
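To make the arithmetic concrete, here's a minimal sketch in Python assuming the simple 1/5-per-year schedule described above (the actual rules use a mid-year convention, and the $100 figures are purely illustrative):

    # Illustrative only: taxable profit under immediate expensing vs. a
    # simplified 5-year straight-line amortization of developer salaries.
    SALARY = 100   # paid to developers each year
    REVENUE = 100  # subscription revenue each year

    schedules = []  # remaining deductions from each year's capitalized salary
    for year in range(8):
        # Old rules: deduct the full salary in the year it was paid.
        old_taxable = REVENUE - SALARY
        # New rules: capitalize the salary, deduct 1/5 per year over 5 years.
        schedules.append([SALARY / 5] * 5)
        deduction = sum(s.pop(0) for s in schedules if s)
        new_taxable = REVENUE - deduction
        print(f"year {year}: old taxable = {old_taxable:3.0f}, new taxable = {new_taxable:3.0f}")

Under this simplified schedule the annual deduction climbs back to the full salary by year 4, so the steady state does roughly balance out; the squeeze is in the first few years, where you book taxable "profit" (80, 60, 40, 20) with no matching cash.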
This is for specific R&D costs, like doing something patentable. Most developers are not doing R&D.
Most developers are fixing bugs, helping sales, keeping systems online, keeping up to date on patches, performing security scans, complying to some internal policies.
> (3)Software development
For purposes of this section, any amount paid or incurred in connection with the development of any software shall be treated as a research or experimental expenditure.
What does the average company spend on R&D percentage-wise?
My old company was a contract engineering company, so they always tried to staff everyone possible on projects being paid for by customers. R&D was used to develop internal tech and build expertise on new technologies so it was only maybe 10-25% of most engineer's timesheets.
Do other tech companies spend that much more on R&D?
I think it's a combination of things and will vary by company.
For the largest companies, it's likely to some extent to put downward pressure on salaries in the market.
For others it can be like the Twitter case, where they hired way too many devs for what was needed, and that simply makes a bigger communications mess, not more productivity.
For others still it's outside pressure to become financially stable or income positive with a reduction in investment capital over time.
The larger economy, the potential for a more widespread military conflict and a number of other factors taking their toll in different ways.
When belts need to tighten, late and expensive projects get cut.
There are still jobs out there, but with remote roles in particular (which is what I'm personally pursuing), pay scales are all over the map and there are hundreds of applicants for many postings.
It sucks to be looking and I imagine it sucks to be hiring as well, the latter because of the sheer volume of applicants to sift through.
I will say, I'd rather receive no contact at this point than the boilerplate rejection emails. If there's no substance or advice, it's just a net negative IMO. I used to hate getting dropped without a word, but having to fill out a few dozen applications a day, competing against hundreds, and getting form rejections for the ones where you're less than an ideal fit is worse.
I took 4 months off because I needed to get past the burnout. Now it's just hard getting back in.
IMO, what has happened is that big tech has realized they are monopolies or cartels, and the ethos of these companies as innovators has fundamentally died. Facebook's Metaverse, Google's multiple ventures in self-driving, and the other Alphabet bets have all failed miserably, particularly in terms of investment payoff.
Facebook and Google in the last 10 years went from usable worthwhile sites to ad-dominated monstrosities, and their revenue exploded. Likely the beancounters have taken total control, and recognized that worker salaries for all the worker bee tasks in the big tech companies are vastly overpriced.
This executive management decision is the underpinning of Musk's takeover and destruction of Twitter. It probably would have succeeded if he wasn't insane.
- excess hiring during the pandemic;
- interest rates raised;
- salaries too high;
- cut in remote work (to ease return to office later);
- some weird thing coming;
- AI;
(I could replace all those semi-colons with question marks.)
I don’t know about in general, but I can describe why I was laid off last year.
I was hired by a small consultancy on a software team supporting a dedicated software product for a small but powerful industry. Software was/is an emerging side business for that employer. This team completely lacked discipline and vision from a software execution perspective. The software was horribly organized, 80% of the logic was in SQL stored procedures, there was no test automation, and everything was copy/paste between environments. So, this was extremely high risk. The product side of the team, on the other hand, had extremely good discipline with a solid vision and extremely good documentation.
I was the only senior developer on the team with any advanced experience outside of SQL. It became super clear this employer was a mistake when they didn’t want me to fix anything and my junior peers back-stabbed me during 360s as salvation for their inability to communicate in writing. I just rode out the last several months until they eventually fired me when billable hours evaporated.
After a few months of looking for a new job I made a promise to myself to never EVER return to employment that feels immature. I would never take a job too reliant on frameworks and tools, such that the job/industry are compensating for talent with gimmicks. I abandoned my career writing JavaScript and eventually gained work in data science.
I can totally relate. This is painfully close to my experience (including the junior backstabbing to save their jobs and management not wanting to fix anything).
I can't say if this is common in the industry but it's certainly the most depressing situation to be in. I didn't see a way out of this that wasn't quitting. Changing the culture when people below and above you can't even realize they are drowning in their own mistakes seems like an impossible situation.
This is the bust side of the hiring frenzy of the past few years. Borrowing money isn't as cheap as it was, and investors get a better bang for their buck by letting it sit and earn interest.
Many companies over-hired and are now trying to lean up to balance their books, with some betting that if they over-fire they can hire new talent at a lower rate.
Outside of the obvious mentions, including section 174 changes, I think that layoffs have caught on like a contagion and a bunch of mediocre managers and executives are borrowing from an outdated 80s layoff playbook.
The reality is that layoffs disconnected from a larger strategy (like discontinuation of a product line) are typically disastrous and highly predictive of underperformance relative to companies that didn’t perform layoffs. What typically happens is companies hire back the folks they let go very quickly with a huge dent in morale long term.
The concerning thing at this point is that general layoffs are so in vogue that they're depressing tech wages. Few executives actually take any real responsibility for their overhiring either; the best you usually get is some generic "I'm sorry" followed by fat bonuses.
A reason that I can think of is it's a result of the hiring that happened during COVID. First you had Amazon saying they were going on a massive hiring spree. The other tech companies (except for Apple it seems) said to themselves, "Oh no! They're going to snap up all the talent!" so they ramped up their hiring too.
Interest rates were at record lows so there was a lot of available money and nobody (except for the scientists) knew how long this COVID thing would last. These companies are used to short-term thinking because they're public companies and the public markets won't think further than six months from now.
Now that interest rates are back up to where they were 20 years ago, and Amazon has gone through a few rounds of layoffs, the rest of the industry is doing some belt tightening as well.
I predict several of these companies will be a shell of themselves in 10-15 years for this and many other reasons.
Obviously a Steve-Jobs-like backroom deal like the one from 2005, with most companies agreeing to do it at the same time, "no quick-hire backsies".
I'm convinced that it's because the recession that has been predicted for the past 3 years has not happened, and now we're in the election year and it must happen. The layoffs will continue until we get the macro outcome we're looking for
It's the only explanation for why companies at all-time highs in revenue, profit, and stock value are laying off double-digit percentages of their employees.
>Rather, who precisely wants a recession now, in your opinion?
I don't think you need more than one guess to figure out the average political leaning of most CEOs and Chairs of corporate boards and then combine that with who is the incumbent
Programmer productivity is really hard to measure, so managers will act in their own interest and push for larger teams. The shift to digital due to COVID provided the excuse. Low interest rates provided the means. Eventually external factors force a reevaluation/correction.
SVB bankruptcy is part of it. A lot of free money dried up.
But also, it's an "everyone else is doing it" thing. Saturate the market with unemployed devs and you can reasonably tell your employees "Sorry, no budget for raises".
I suspect there is something political going on: concentrating power, and disdaining the value of individuals, compassion, and responsibility to build a better world (obviously not any company's only considerations, but a balance - and now those considerations seem at an all-time low IMHO). In other contexts, the same movement disdains the rights, freedom, and lives of other people generally.
Or that could be a bunch of BS; it's hard to establish objectively. That cultural change seems so important that I think it's worth talking about (and maybe someone else has a more objective, factual way to talk about it).
I think it all comes down to the fears of investors. At larger companies they have to answer to shareholders who are moody, pessimistic, and very bipolar. They will look at a company's revenue, see that it's doing well, but still lose sleep over every negative piece of financial news they hear. Then they'll demand that costs be cut. 'D-don't you see the economy...' Next thing you know, executives have to do the same thing as every other company to be perceived as competent. Investors love their groupthink, so if you're doing something different during a time like this they're going to lose their shit (even more).
For smaller companies the explanation is probably much simpler: there's less spending on new companies, and therefore maximizing 'runway' is critical. So if it's not absolutely essential, it needs to be cut. People who don't do that might not survive until things turn around, unfortunately. It's not a great situation to be in.
I think it happens every year around this time, more or less severely, to show "savings" and good financials by the end of the financial year in April. Then they start hiring again to show it is booming. Also maybe to rotate out the lower-rated workforce.
Venture capitalism, late stage capitalism, and just gross incompetence and lack of empathy.
The U.S. is all about me, myself, and I, and these layoffs are just one result of those at the top keeping it the way they like it.
Another driver is that workers' rights suck in the U.S. So now employees are just back to the first industrial revolution: they are just bodies and line items on spreadsheets. They are not people or assets.
Overhiring, aka poor planning, in the last few years.
Rising interest rates, causing investors to demand higher stock returns or else seek returns in treasuries.
In the free-money era, companies created too many processes and incentive structures that are wasteful. These processes are too expensive when money is expensive. E.g. too many managers whose time is filled with calibration, stack ranking, and agile processes instead of true engineering and product leadership.
Ultimately, growth cannot be endless, and the current crop of companies is not set up for difficult times.
One thing to keep in mind, though, is that even if AI can't do your job, if it can translate conversations with humans who have a different primary language, and companies have moved to work from home, that's a recipe for jobs going to qualified people elsewhere.
So on top of a pendulum swing backwards from overhiring we're seeing accessibility and virtual commuting open the door for many more candidates for a smaller pool of jobs which compounds the effects.
[Pulling from another thread, but this is my best guesstimate. With additional context]
The rules of the game have changed.
> We were allowed to expense all employee compensation tied to software or R&D in the year in which we paid it. Now we can only expense a small fraction of that because we have to capitalize the expense. That means if you paid a developer $100 to develop a piece of software, then sold subscriptions totaling $100, you now have a profit. Under the old model you have zero dollars in profit; the new model says you have something like $80 in profit that you now have to pay taxes on… with what cash?
Additional "finger in the wind" context: this change was brought about through the Trump tax cuts. My best guess is that Trump wanted the 174 change as a negotiation token to prod tech to make a deal. Mind you, this was all pre-pandemic. The financial world was pretty stable and this was going to be a 'great way' to make people work together, if desired. After the pandemic financial response, all bets were off.
Solopreneurs are the only ones somewhat immune. I'm thankful my companies needed to downsize before this.
Final context, there will be a whole new industry to define the useful life for a piece of software given the advancements in AI. This is going to be great fun.
[edit] One more thing: this change was thought to be "repeal-able" with new legislation. Since that no longer looks to be the case, to avoid "everything is securities fraud" (Matt Levine's term), everyone has to adjust their public statements and accounting for this new change. Sure, many of the big players will still make a profit, but analysts only care about "beats and misses." And since the legislation hasn't passed to put the old rules back in place, accounting has to make these forecasts more permanent with less wiggle room - aka misses and forecasts down. Now, there is a huge opportunity to trade on the rules changing back, which would be a huge tailwind (you've already made the headcount cuts and then get favorable tax treatment).
The section 174 changes were never intended to go into effect and are a byproduct of the bill review process by deficit hawks. A group in the House has called for changes but a bill hasn't made it out of committee yet.
Counterpoint: This doesn't apply to Google, because Google has _always_ amortized their developer comp, but Google still laid off a thousand people. So maybe this is just an excuse?
Little impact on the revenue side, while costs can be cut significantly.
Many companies have proved it to be useful and their stock prices soared. Why don't we try it?
If interest rates are 5%, entrepreneurs must shoot for 20% annual returns to compensate for risk and get funding. If interest rates are 0%, the bar is far lower.
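A rough back-of-the-envelope sketch of that claim, treating the funding bar as the risk-free rate plus a fixed risk premium (the 15% premium is an assumed illustrative number, not a market figure):

    # Illustrative only: required return = risk-free rate + assumed risk premium.
    RISK_PREMIUM = 0.15  # assumed extra return demanded for startup risk

    def hurdle_rate(risk_free: float) -> float:
        """Annual return a venture must roughly target to attract funding."""
        return risk_free + RISK_PREMIUM

    for rf in (0.00, 0.05):
        print(f"risk-free {rf:.0%} -> required return ~{hurdle_rate(rf):.0%}")

At 0% rates the same premium implies a much lower bar, which is the mechanism the comment is pointing at.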
I think that tech companies know they can get away with this now and are not scared to do it again (see Google, Amazon, etc).
Previously employees at tech companies were put on a pedestal with unbelievable perks and treatment. Now companies like Google can start to become more like conventional companies when all the other companies around them are becoming like that too.
The "free money" is going away, now that the Boomers are all retiring and taking their money out of the stock market. Interest rates are about to rise to 8-10% and stay there for a few decades, if all goes well.
This will mark the beginning of an immense era of innovation as all of that talent floods the market and is put to far more productive use.
Innovation, at least in our sector, requires capital investment. Hardly any startup can bootstrap. At high interest rates, and with people retiring and moving their money into safe assets, there will be little appetite to fund startups in the coming years. At least that's the theory that Peter Zeihan is presenting on his youtube channel.
The Federal Reserve raised interest rates to put people out of work, and it was successful. Rising wages were a threat to corporate profitability in the form of "inflation." The quality of workers has not dropped, but the quality of work products has, mainly due to cost cutting and short-sightedness.
I think a lot of the revenue generating products for these companies are at a very stable, low-growth point.
There's just not a ton of innovation happening outside of AI and LLMs at the moment, IMO.
If a company foresees all of the future revenue being aligned in one direction -- whether it will work out that way or not -- then why not re-align and focus on that direction (AI, LLMs)?
The US Fed monetary policy has everything to do with it. Money is especially valuable right now. As well as the proliferation of new "AI" tools that increase worker productivity by 10x-1000x.
Between the Fed tightening demand and the AI tools simultaneously heating up supply (productive output), it's no surprise these layoffs are happening.
I don't think AI has much to do with it. The fear/expectation of AI, perhaps.
But the rest is definitely Fed policy; they specifically said they wanted to cool wages, and I think maybe in cahoots with shareholders - remember that guy bleating about google employees earning $200k+, and that it 'should' be more like $100k if they're lucky and BTW, give the resulting increased profits to us shareholders.
I'm not counting cumulative hours of productive work, I'm counting duration from inception to working implementation.
I consider one person knocking something out in one day that would previously have taken them 10 days of wrestling with to be a 10x increase in productivity.
And I'd consider one person building a proof of concept in a couple hours that would have taken 5 people 10 weeks of back and forth meetings, planning, pushback and reluctance to be a 1000x improvement in productivity.
These tools are seriously reducing the bar for knocking stuff out that would have previously required much more significant time investment and risk to even consider let alone tackle.
I've personally used copilot and chatgpt to implement things that would have taken me months of study and wrestling with in just a few hours.
Lack of VC investment at the bottom means a lack of seed-through-Series-C companies suddenly flush with money and willing to buy B2B software so they can grow and sell their software to B2B businesses, who sell software to B2B businesses, who sell their software to end users.
Massive overhiring in the sector, especially for non-tech positions or soft tech-adjacent roles.
Unfortunately when it comes to layoffs, many of the factors that lead to this overexpansion will also ensure that the parts that ballooned are not axed more than the actual tech positions.
My speculative, purely non-analytic layman take: these behemoths reached such scale and domination by hoarding talent long enough to build perpetual revenue machines in the era of free money. All the low-hanging fruit has been picked, and regulation slowly shut the doors as the majority of market dominators were built on the fringes of the law, so any new player has to fight against impossible odds of both regulation and lack of funding, and the chance of real competition is very low. Anything that is even barely threatening gets acquired and extinguished.
So the money-printing machines will keep printing, the competitive landscape is muddied by the regulation hammer, and there are layoffs left and right. Besides, any bad decision now will bear consequences so far in the future that by then most of the people making the decisions will have retired or moved on to something else. The saying goes "death by a thousand cuts," but for behemoths, bleeding takes a long time before severe disability sets in.
Even though the annual government deficit is almost 7% of GDP, essentially injecting stimulus into the economy, Wall St analysts are convinced high interest rates will cause a recession. CEOs have responded by slashing forecasts and employees to appease Wall St.
My 2c: companies lay people off to please shareholders in the short run, and the laid off employees continue to work in the industry, developing the ecosystem and provide opportunities for more profit in the long run
Unproductive workforce and employees being too much in the comfort zone.
People just got lazier as more stuff was thrown at them - from chia puddings to remote work - and most of us humans have lazy self-control features built in.
All big companies are rusty and covered with cruft. Twitter showed that you can lay off 75% of the workforce and still run your business at scale relatively smoothly. It is a great temptation to follow suit.
Interest rates went up and now profitability matters. Tech companies' biggest expense is employees, so it's the lowest hanging fruit to generate higher profits. It's really that simple.
The financial beast demanding growth feeds on increasing gains in the value of stocks and other financial instruments. One of the easier ways to fake growth is layoffs.
Your 401(k) is part of that beast, and your investments in stocks, etc., contribute to the system that drives layoffs.
It wasn’t free money, it was COVID. We had low rates and low inflation for years before 2020.
Remember when the government used emergency powers and forced the economy to shut down over health mandates? Remember the thousands of dollars in checks the government mailed to every person or business with a pulse?
That whole operation completely distorted and screwed up the economy. It boosted tech companies with unnatural growth. It boosted real estate prices. It increased the cost of grocery staples. The market reacted in the rebound, the Fed reacted to the market, and we are playing out the effects of the bad decisions made in 2020.
AFAICT from recent quarterly reports, the big tech companies are still seeing double-digit YoY growth in profits and revenues. So from whence comes the pressure to tighten belts?