Plus, why do we care about that degree? If we could make it so humans don't hallucinate either, that would be great, but it isn't happening. Human memory gets polluted the moment you feed people new information, as evidenced by how much care we have to take when extracting information in situations where it matters, like law enforcement interviews.
People rag on LLMs constantly, and I get it, but then they give humans way too much credit, imo. The primary difference I feel we see between LLMs and humans is complexity. No, I don't personally believe LLMs can scale to human "intelligence". But at the moment it feels like comparing a worm brain to human intelligence and calling that evidence that neurons can't reach human-level intelligence... despite the worm having a fraction of the underlying complexity.