that would be a generous invented statistic if it were only addressing the inherent stochastic nature of LLM output, but you also have to factor in the training data being poisoned and out of date
in my experience the error rate on llm output is MUCH higher than 5%
exactly. Programming languages are all just levels of abstraction above the analog/digital interface.
While it is important to understand the fundamentals of coding, expecting every software engineer to be well-versed in assembly wouldn't necessarily result in increased productivity.
LLMs are just the next rung up on the abstraction ladder.
There will always be people interested in the gritty details of low-level languages like Assembly and C, which give you much more granular control over memory. While large enterprises, huge codebases, and niche use cases can absolutely benefit from these low-level specialists, the average org doesn't need an engineer with those skills. That's especially true of startups, where getting the 80% done ASAP is critical to growth.
i think there will still be a need for "programmers" with those skills. We'll always need specialists and people building new languages/frameworks/etc.
But I think anyone who isn't at least a standard deviation above the average programmer (like me) shouldn't be focused on reading, writing, and debugging code. For this cohort, the important skill is seeing the bigger picture, understanding the end goal of the code, and then matching those needs to appropriate technology stacks. Essentially, that means moving more and more toward being a product manager managing AI programming agents.