Bing Chat searches and then summarizes for you. It pulls in the latest information, reads the top results, and gives you a summary of what you're looking for. It's here today, and it makes manual searching irrelevant for many tasks.
"You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete." ― Buckminster Fuller
I've been blown away by how much better this feels as a search interface. No more guessing the best search terms or trying to narrow down queries. Just ask a question in English and get a summarized answer, with citations that let you evaluate the information. It feels like an actual personal assistant, and it's very transparent, showing things like the search terms being used.
But how can you trust it to provide accurate information?
When I've played around with Bing, I've been seeing hallucinations and outright false data pop up quite regularly.
My initial assessment of LLMs is that they can be great writing aids, but I fail to see how I can trust them for search when I can't use them for simpler tasks without being served outright falsehoods.
You have to follow the citations. They hold the information; the headline result doesn't tell you anything except "here's where we think you should look." That's a search problem.
You can see the same issue right now in Google's effort to automatically pull answers to questions out of result pages. Frequently it gets those answers wrong.
But that’s not how humans function. They won’t follow citations, because that’s added work. Nine times out of ten, they will take what the AI spits out at face value and move on. And those citations themselves now have a higher probability of having been generated by AI.
Humans differ. If the information isn't controversial, most people accept it and move on. If it is controversial, most hand-wave, but a sizable group checks further, and if they arrive at different conclusions they get active about replacing the incorrect information.
Yes, and humans will just skim search results rather than actually read the article. Or trust the Wikipedia page or the book. Or believe the talking head. That's when they are not invested in the answer. But on the occasions where it matters, we do read the article and/or check multiple sources and decide whether it's bullshit. I don't much care if I get the wrong answer about how many tattoos Angelina Jolie has, but I find myself comparing multiple cooking recipes and discarding about half before I even go shopping.
Google needs to move fast.