> Do you trust the content of the sites Google returns you?
You don’t. But I’d trust a top-rated Stack Overflow answer over whatever an LLM spits out.
There is no “confidence score” from an LLM output. You cannot tell whether it is making things up (and you could make very bad decisions based on its output).
There's no confidence score for ChatGPT specifically, but other GPT models hosted by OpenAI (let alone the broader research community) do expose that capability.
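For instance, the OpenAI Chat Completions API can return per-token log-probabilities, which people use as a rough uncertainty proxy (not a calibrated confidence score). A minimal sketch, assuming the official `openai` Python SDK and a placeholder model name:

```python
import math
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model; any chat model that supports logprobs
    messages=[{"role": "user", "content": "What year was Python 3 first released?"}],
    logprobs=True,        # ask for per-token log-probabilities
    top_logprobs=3,       # also return the top alternatives per token
)

token_logprobs = resp.choices[0].logprobs.content

# Print each generated token with its log-probability.
for tok in token_logprobs:
    print(f"{tok.token!r}: {tok.logprob:.3f}")

# Average log-probability over the answer, as a crude "how sure was it" signal.
avg_logprob = sum(t.logprob for t in token_logprobs) / len(token_logprobs)
print("mean token probability:", math.exp(avg_logprob))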
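```

Low average token probability doesn't prove the answer is wrong, and a high one doesn't prove it's right; it only tells you how "surprised" the model was by its own output, which is a long way from the kind of trust signal a voted Stack Overflow answer gives you.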