
There's also the issue that if you aren't doing the work much anymore, will you remain able to competently check its output, or to intervene when it makes a mistake?

I think it's tempting but oversimplified to focus on "output" and "time saved generating it"; that framing misses all the other things that happen while doing the work, especially for "softer" tasks (as opposed to, say, mechanical calculation). It also seems like a mindset geared toward selling an application rather than doing a better job.




Sure, but that's part of a much broader issue that predates AI by at least two millennia, probably much longer — the principal-agent problem.


> Sure, but that's part of a much broader issue that predates AI by at least two millennia, probably much longer — the principal-agent problem.

I don't think that captures what I'm thinking about, which is more skills atrophy ("Children of the Magenta") problem than a conflict of interest problem.

https://www.computer.org/csdl/magazine/sp/2015/05/msp2015050...

> William Langewiesche's article analyzing the June 2009 crash of Air France flight 447 comes to this conclusion: “We are locked into a spiral in which poor human performance begets automation, which worsens human performance, which begets increasing automation” (www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash).

> ...

> Langewiesche's rewording of these laws is that “the effect of automation is to reduce the cockpit workload when the workload is low and to increase it when the workload is high” and that “once you put pilots on automation, their manual abilities degrade and their flight-path awareness is dulled: flying becomes a monitoring task, an abstraction on a screen, a mind-numbing wait for the next hotel.”


Ah, yes, I think I get you this time. (Is it just me, or does that phrase now feel hideously clichéd from the LLMs doing it every time you say "no" to them? Even deliberately phrasing it unlike the LLMs, it suddenly feels dirty to write the same meaning, and I'm not used to that feeling when writing.)

I still think it's a concern at least as old as writing, given what Socrates is reported to have said about writing — that it meant we never learned to properly remember, and it was an illusion of understanding rather than actual understanding.

(That isn't a "no", by the way; merely that the concern isn't new).




