
No. It solved it by (most of the time) giving the OP and me the answer to our queries, without us needing to wade through spammy SERP links.



If LLMs can replace 90% of your queries, then you have very different search patterns from me. When I search on Kagi, much of the time I’m looking for the website of a business, a public figure’s social media page, a restaurant’s hours of operation, a software library’s official documentation, etc.

LLMs have been very useful, but regular search is still a big part of everyday life for me.


Sure, we now have options, but before LLMs most queries went solely through search engines, which often meant sifting through paragraphs on websites to find an answer, a barrier for many users.

Today, LLMs excel at giving concise responses to simple, curious questions like "Do all bees live in colonies?"


How do you tell a plausible wrong answer from a real one?


By testing the code it returns (I mostly use it as a coding assistant) to see if it works. 95% of the time it does.

For technical questions, ChatGPT has almost completely replaced Google & Stack Overflow for me.
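
For what it's worth, the kind of spot-checking described above can be as lightweight as a few assertions wrapped around the suggested snippet. This is just a sketch with a made-up slugify helper, not anything from the thread:

    # Hypothetical example: suppose the assistant suggested this helper
    # for slugifying titles. The function name and behaviour are invented
    # for illustration.
    import re

    def slugify(title: str) -> str:
        """Lowercase a title and replace runs of non-alphanumerics with '-'."""
        slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
        return slug.strip("-")

    # A few quick assertions are usually enough to catch the obvious
    # "plausible but wrong" failures before the code goes anywhere.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  Already--clean  ") == "already-clean"
    assert slugify("") == ""

    print("spot checks passed")

Of course, a handful of assertions only catches the obvious failures, which is where the next reply's point about the real cost of testing comes in.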


In my experience, testing code in a way that ensures that it works is often harder and takes more time than writing it.



