Hacker News

I'm actually expecting AI training data to turn into a snake eating its tail. I've spent the last month not sleeping so I could try to get up to speed on deep learning, and my understanding is that AI trained on AI-generated output becomes crap surprisingly quickly, and AI-generated content is already starting to proliferate. I have no idea to what extent this will hinder the training of new models, but I could see it becoming quite the problem.
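For anyone curious, you can see the flavor of the effect with a toy sketch (my own, not from any paper; the distribution and sample sizes are arbitrary): repeatedly fit a Gaussian "model" to samples drawn from the previous generation's fit, so each generation trains only on the last one's output. With a finite training set each round, the fitted spread tends to shrink and the tails get lost.

```python
import random
import statistics

def collapse(generations=50, n=20, seed=0):
    """Refit a Gaussian 'model' each generation on samples from the previous fit.

    Returns the list of fitted sigmas, one per generation (plus the start).
    """
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # generation 0: the "real data" distribution
    history = [sigma]
    for _ in range(generations):
        # Train the next generation only on the previous generation's output.
        samples = [rng.gauss(mu, sigma) for _ in range(n)]
        mu = statistics.fmean(samples)
        sigma = statistics.stdev(samples)
        history.append(sigma)
    return history

if __name__ == "__main__":
    hist = collapse()
    print(f"start sigma={hist[0]:.3f}, final sigma={hist[-1]:.3f}")
```

Individual runs are noisy, but averaged over seeds the final sigma comes out well below the starting 1.0, which is the "eating its tail" dynamic in miniature: each generation discards a bit of the diversity the one before it had.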



the tinfoil take is that everyone in charge is fully aware of this eventual problem, such that when it occurs, a solution will be ready: ubiquitous verifiable digital ID technology that's required to use popular social media services, so that everything a Real Person posts is signed with said digital ID. That would bring about the end of online anonymity, and the death of the Internet as we presently know it.

idk, sounds plausible to me, the way things've been going.


Entirely possible, but the number of people who moved to Mastodon after Twitter got lit on fire gives me hope that people might explore options if things go a step too far.


My Mastodon is a ghost town while my Twitter feed keeps having hundreds upon hundreds of posts per day.


Ah, I mostly go based on what I hear from friends. This, Reddit, and a regional moped Discord are the closest I get to social media


Also think of how many people will sell their digital persona to push AI-generated spam for a little cash in return. Ain't nuthin safe from spam.


> AI trained on AI generated output becomes crap surprisingly quickly

Yet GANs work quite well


Sorry, I figured it was clear from context that I was talking about transformer-based language models.



