
What happens when nobody knows how to create their own artwork, write their own novels, write their own code, etc. because they are so reliant on LLMs? Yet LLMs themselves rely on output from real people. Is there an event horizon where LLMs have nothing truly new to pull from and can only reuse what was created by previous generations of humans? An LLM stagnation.



Given that LLM content is generated with some randomness, and assuming that published LLM content is filtered by humans for usefulness, then even in the case where no content is being written by humans, I would expect the published content to contain new information, contributed by humans through that act of selection.
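
To make the intuition concrete, here is a toy sketch in Python (my own illustration; model_sample, human_filter, and the 0.8 threshold are made-up stand-ins, not anything real):

    # Toy illustration: even if humans write nothing themselves, choosing which
    # random model outputs get published injects human-originated information.
    import random

    random.seed(0)

    def model_sample():
        # Stand-in for a stochastic LLM: emits a random "idea" score in [0, 1).
        return random.random()

    def human_filter(idea, threshold=0.8):
        # Stand-in for human curation: only outputs judged useful get published.
        return idea >= threshold

    raw = [model_sample() for _ in range(10_000)]
    published = [idea for idea in raw if human_filter(idea)]

    # The published set follows a different distribution than the raw samples;
    # that difference is information contributed by the human selection step.
    print(f"mean of raw outputs:       {sum(raw) / len(raw):.3f}")
    print(f"mean of published outputs: {sum(published) / len(published):.3f}")

The gap between the two means is a crude measure of the signal the curation step adds, even though no human wrote any of the outputs.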



