I googled for a knowledge question and wasn't able to find an answer.
Then I used ChatGPT and it produced a surprisingly satisfying (opinionated and correct) response.
I was able to extract value by treating the LLM as a knowledge base, so what exactly is absurd about that?
(to be fair, I asked ChatGPT the same question today and this time it returned a bland "it depends" non-answer).