You seem to be talking about ChatGPT, not Bing Chat. Bing Chat actually runs search-engine queries and links to those sources. I have seen its summaries include mistakes, but I have never seen it invent sources (I've tried maybe 500 chat queries).
It's ironic that you're very confidently presenting erroneous information here. I'd really recommend trying the actual product, or at least looking at the demos. It has some problems, but it does not have the same problems ChatGPT does, because it does not rely solely on the LLM's baked-in training data.