Same... but have you considered the broader implications of relying on LLMs to generate code? It's not just about being a 'force accelerator' for individual programmers, but also about the potential impact on the industry as a whole.

If LLMs can generate high-quality code with minimal human input, what does that mean for the wages and job security of programmers? Will companies start to rely more heavily on AI-generated code, and less on human developers? It's not hard to imagine a future where LLMs are used to drive down programming costs, and human developers are relegated to maintenance and debugging work.

I'm not saying that's necessarily a bad thing, but it's definitely something that needs to be considered. As someone who's enthusiastic about the potential of code gen, I think this O1 reasoning capability is going to bring big changes.

Do you think you'll be willing to take a pay cut when your employer realizes they can get similar results from a machine in a few seconds?




My boss is holding a figurative gun to my head to use this stuff. His performance targets necessitate the use of it. It is what it is.


Yeah, but this, in itself, is triggered by a hype wave. These come and go. So we can't really judge the long-term impact from inside the wave.


Your job won't be taken by AI, it will be taken by someone wielding AI.


As a society we're not solving for programmer salaries but for general welfare, which is basically code for "cheaper goods and services".



