[flagged] Google AI: People should eat at least one small rock a day (mastodon.social)
43 points by cdme 4 months ago | 16 comments



I have been seeing these AI summaries in my SERPs for many months now. I am surprised that with so much testing they are still pretty unreliable. They used to include code snippets that almost universally didn't work properly.

Google, it only takes one or two blunders for people to stop trusting your AI. Why risk it by releasing a half-baked product?


Because a) there's pressure on them to do something very much in this vein, but b) there's no fully baked product on the horizon.

Like...this is what it is. This is the output of LLMs without, at the very least, massive amounts of extra man-hours spent fine-tuning the results in various ways. There's no way for it to judge the truth value of the statements it spits out besides determining how likely they are to be produced after what came before them.

So their choices were either "release nothing, at least not for years more" or "release something half-baked".
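To make that "likelihood" point concrete, here's a minimal sketch (assuming the Hugging Face transformers library and the small GPT-2 checkpoint, purely for illustration) of what the model actually computes: it can score how probable a continuation is after a prompt, but that score says nothing about whether the continuation is true.

    # Minimal sketch: an LLM assigns probabilities to continuations, not truth values.
    # Assumes `torch` and Hugging Face `transformers` with the small GPT-2 checkpoint.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    def continuation_logprob(prompt: str, continuation: str) -> float:
        """Sum of log-probabilities the model assigns to `continuation` after `prompt`."""
        prompt_ids = tokenizer.encode(prompt, return_tensors="pt")
        cont_ids = tokenizer.encode(continuation, return_tensors="pt")
        full_ids = torch.cat([prompt_ids, cont_ids], dim=1)
        with torch.no_grad():
            logits = model(full_ids).logits
        log_probs = torch.log_softmax(logits, dim=-1)
        total = 0.0
        # Score each continuation token, conditioned on everything that came before it.
        for pos in range(prompt_ids.shape[1], full_ids.shape[1]):
            total += log_probs[0, pos - 1, full_ids[0, pos]].item()
        return total

    # Both continuations get a score; neither score tells you which one is true.
    print(continuation_logprob("Geologists recommend eating", " a balanced diet."))
    print(continuation_logprob("Geologists recommend eating", " at least one small rock per day."))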


“If we don’t shoot ourselves in the face, then someone else will shoot themselves in the face, and then we’ll be behind in the crucial metric of number of bullets in our heads!”


We've never seen a disruptive technology with this much media buzz behind it so quickly, or with such rapid technological advances. Disruptive technologies are never "ready for prime time" ... until they are. And large organizations behave more like psychopaths (they treat harm to others as part of the cost of doing business).


Wasn't this expected? Now there is broader proof of the issues with LLMs as trusted sources of information.


You get what you train on. This is a Google issue.


There is also RLHF to consider...



Funnily enough, ground birds like chickens and quail do need to eat small rocks or sand. That helps grind up food in the gizzard. I had quails, and the hens ate pieces of oyster shell. That also gave them enough calcium to lay an egg almost every single day.


Perhaps Google's AI included The Onion in its training data:

https://www.theonion.com/geologists-recommend-eating-at-leas...


I think it's not actually in the training data. The LLM is just using RAG, meaning it gets the top search results for that query and generates text based on them. Kind of like Perplexity, but apparently worse than it.
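For reference, that RAG flow is roughly the following (a minimal sketch with hypothetical search/generate callables, not Google's actual pipeline): fetch the top results for the query, paste their snippets into the prompt, and have the model answer from them. If a satirical article ranks highly, the model will summarize it just as confidently as a real source.

    # Minimal RAG sketch: hypothetical names, not any real search or LLM API.
    from dataclasses import dataclass

    @dataclass
    class SearchResult:
        url: str
        snippet: str

    def build_rag_prompt(query: str, results: list[SearchResult]) -> str:
        """Combine the retrieved snippets and the user query into one prompt."""
        context = "\n\n".join(f"[{r.url}]\n{r.snippet}" for r in results)
        return (
            "Answer the question using only the sources below.\n\n"
            f"Sources:\n{context}\n\n"
            f"Question: {query}\nAnswer:"
        )

    def answer_with_rag(query: str, search, generate) -> str:
        """`search` returns the top results for a query; `generate` is any LLM call.

        Retrieval adds grounding, not judgment: if the top result is satire,
        the model will happily restate it as fact.
        """
        results = search(query, top_k=3)
        return generate(build_rag_prompt(query, results))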



Technically, salt is a rock, so the first part is correct.


AI ate the Onion


Well, yeah. How else do you keep your gizzard topped up?


Google should eat a bag of them for this nonsense



