
the people you're supposed to be eliciting requirements from are just regurgitating what ChatGPT told them are the requirements hehehe



This is becoming so true. I have read so many documents in the last year that are obviously from a GPT, especially when the topic is new to a group.

But in the end, I would rather get a half-baked GPT doc than a quarter-baked junior analyst doc. I just worry that GPTs are going to kick the rungs out of the bottom of the ladder in any knowledge work. Being junior and bad at the job used to be a learning environment without too many repercussions.

But how do you compete with peers using AI? You use it also. But now you have robbed yourself of a learning opportunity. Yeah, you can still learn something that way, but it's like doing homework by looking at the answers. Sure, it can help you double-check, but if you don't put the effort into constructing your own answer, you have only cheated yourself.

I think the AI alignment issues are probably overblown in the short term, but what about the long term, when the average person has regressed so far as to be unable to live without AI? They will just do whatever they are told.



