
Meet the new moral panic, same as the old moral panic.

Funnily enough, all of that can be found on Google. Or who knows, maybe even in a library, which kids can access freely! (though it seems the article's authors have never heard of such a thing)




We are the tech class and are therefore de facto gatekeepers to the actual technology.

It's therefore on us not to be reactionary to these moral panics, but instead to try to rise above them and lead by example.

Either way, it's up to us to retain that power and use it as we see fit, or to give it up.

We have a small and short-lived window of opportunity before regulatory capture sets in.

Which future will we choose?


Way back when, an associate was freaking out because he found Nazi! Content! On the Internet!

Then I pointed out that not only the local public library, but the local school libraries, had copies of Mein Kampf, right there on the shelves.


Not anymore! We've (effectively) burned those books, so surely now we're not doomed to repeat that history!


I saw a person selling a copy, with a photo on top... Didn't check the price, as I felt it was going to sell for too much anyway...

I think there might have been a lot of other questionable books too... But I didn't try to find any.


I spotted one in a Barnes & Noble in Venice Beach four years ago.


Chalk that up as a win for the “trust and safety” people!


Nope, LLMs hallucinate, that's what you don't find online.

That adds an extra level of danger.


> LLMs hallucinate

So do fiction writers. I hear there's even an entire genre of fiction that is expressly devoted to describing the commission of, and the solution to, violent crimes. And another genre that's all about war. And several genres about sex (with various focuses and levels of clinical detail). And books that discuss societal taboos ranging from incest to cannibalism. And...


If you had a question that needed a correct answer, you wouldn't ask a fiction writer.


And if I had a question that needed exactness I wouldn't ask an LLM.


> Nope, LLMs hallucinate, that’s what you don’t find online.

People spinning elaborate narratives out of their imagination that aren’t true is, in fact, a thing I find (including resulting in dangerously false information on topics likely to be important to people) online all the time.


The Internet has enough basic instructions that will cause things like the release of chlorine gas...

Or various ways to electrocute oneself or start fires...


It won't be long before "Quora says I can melt eggs" turns into "Google's top result says millions of fish drown in oceans each year" or somesuch.


But the article is not addressing that, right?

Hallucination actually makes the problem pointed out "less bad", because maybe it will tell you how to make fire with orange juice.

Though again, people are attributing too much agency to a stochastic parrot. A parrot with a very large memory, and sometimes off its rocker, but still a parrot.


Or they ask how to treat a disease and get a poisonous answer


You already get that from weird "health" bloggers. What's new?


You can sue a health blogger


You can sue anyone. You'll almost certainly not get anywhere against someone in the US, and any other country would tell you to pound sand. It's the same situation in practice.

Don't listen to random people for your personal medical decisions, robot or human, or you're going to have a bad time.


"LLMs hallucinate, that's what you don't find online"

Yes, because you can always trust the reliability of some rando's blog. Have you been on the Internet lately? /s



