These jobs are overwhelmingly paid by task, which puts a lot of pressure on workers to go fast.
I assert the entire "hallucination" phenomenon is a side effect of these practices. When ChatGPT makes up a fake fact with fake sources to back it up, it's largely because such lies are rated very highly by the underpaid humans who aren't incentivized to follow up on sources.
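To make the mechanism concrete, here's a toy sketch of a standard preference-based reward model objective (Bradley-Terry style). Everything here is illustrative: the example pair, the surface features, and the weights are made up, not any lab's actual pipeline. The point is that the objective only sees which answer the rater picked, so if raters reward confident-sounding citations without checking them, fabrication scores well:

    # Toy illustration (hypothetical, not any lab's actual pipeline): an RLHF
    # reward model is trained only on which answer raters *preferred*.
    # If raters never verify citations, a confident fabrication wins the pair.
    import math

    # Hypothetical preference pair collected from a task-paid rater.
    # The "chosen" answer cites a source that may not exist; the rater
    # had no time or incentive to check it.
    pairs = [
        ("The study (Smith et al., 2019) found X.",   # confident, cites a source
         "I'm not sure; I couldn't find a source."),  # honest but unsatisfying
    ]

    def reward(answer, w_confident=1.0, w_citation=1.0):
        """Stand-in reward model: scores surface features a rater sees quickly."""
        score = 0.0
        if "not sure" not in answer:
            score += w_confident         # confidence reads as quality
        if "et al." in answer:
            score += w_citation          # a citation *looks* trustworthy
        return score

    # Bradley-Terry style loss: push reward(chosen) above reward(rejected).
    for chosen, rejected in pairs:
        margin = reward(chosen) - reward(rejected)
        loss = -math.log(1 / (1 + math.exp(-margin)))  # -log sigmoid(margin)
        print(f"margin={margin:.1f}  loss={loss:.3f}")

    # Nothing in this objective distinguishes a real citation from a fake one,
    # so a policy optimized against it is rewarded for plausible fabrication.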
"I assert the entire "hallucination" phenomenon is a side effect of these practices. When ChatGPT makes up a fake fact with fake sources to back it up, it's largely because such lies are rated very highly by the underpaid humans who aren't incentivized to follow up on sources."
It seems like with billions in investment, they could figure that out. It's commonly discussed as one of the hardest and most important problems in the most talked-about industry on the planet. I'm having a hard time believing it's something so easy to solve.
Are you suggesting that even with that much money, they have to do things the way things are "overwhelmingly" done, as opposed to being able to say "hey, we need it done this way instead, because it's important and we can pay for it"?
It just seems pretty bizarre to think that the highest of high tech, massively funded as it is, doesn't have the clout to fix that in a heartbeat, if that's really where the problem lies.
https://www.washingtonpost.com/world/2023/08/28/scale-ai-rem...