Who will they be making money from? OpenAI is looking for companies willing to:
- tolerate the current state of the chatbots
- tolerate the high per-query latency
- tolerate having all queries sent to OpenAI
- tolerate OpenAI [presumably] having 0 liability for ChatGPT just randomly hallucinating inappropriate nonsense
- be willing to pay a lot of money for the above
I'm kind of making an assumption on that last point, but I suspect this is going to end up being more small-market business-to-business than mass-market business-to-consumer. A lot of these constraints make it not really usable for many things. It's even somewhat suspect for the most obvious use case, search, not only because of latency but also because the provider needs to make more money per search with the bot than without it. There's also the caching issue: many potential users are probably going to be more inclined to get the answers once and cache them, to reduce latency/costs/'failures', than to endlessly pay per-use.
Anyhow, probably a lack of vision on my part. But I'd certainly like to know what I'm not seeing.
A lot of companies use third parties to provide customer support, and the results are often very low quality, full of misunderstandings and what we now call hallucinations. I think a good LLM could do a better job, and I bet it'd be cheaper, too. And as a bonus, training the bots to handle new products is practically instant compared to training humans.