I think OpenAI is building a moat right in front of us through their plug-ins idea.
Executed well, plug-ins could become ChatGPT's App Store equivalent. Once that happens, OpenAI will undoubtedly convert a portion of search traffic into agent-delegated work, i.e., people will use ChatGPT to start and finish their searches, essentially delegating the work of collating data from multiple sources to ChatGPT.
FD - I've applied to release a ChatGPT plug-in and this space looks very interesting to me.
PS - Earlier today, I submitted to HN a link to my blog where I analyzed plugins. The HN post sank without a trace, but I wrote my blog post using the Wolfram plugin in ChatGPT and it was a breeze. I genuinely feel I've seen the future.
This is only a factor if the underlying AI/LLM model becomes as good as OpenAI's AND serves millions of people. While open source LLMs may soon gain parity with GPT-4/Bard, it is unlikely that they will hit the kind of usage numbers that an OpenAI/Google/Meta can deliver.
Even there, I'm not sure Google/Bing will cannibalize their own product to allow third parties to inject data into a search interaction.
ChatGPT is a fundamentally different product - it's a humanlike intelligence which is always ready to talk to you and assist you. We haven't ever had anything of comparable quality and reach. The nearest was Alexa but even she was limited to Amazon's catalog and shopping. It's like ChatGPT can be everyone's personal gofer and THAT is huge, imo.
No, that's incorrect. It is auto-fill on steroids, trained on years of internet posts from actual humans (i.e., Reddit, Twitter, etc.).
That input is going to dry up: it's no longer free (Reddit has said all future posts are going to cost $ to access). Good for OpenAI, which has such a head start, but many companies can source/tap into legitimate alternative user-data streams: FAAMG for sure, but also any company with an existing foothold that can convince its user base to provide training data.
Google has its finger on the pulse of everything that goes through google search, gmail, etc. There's a reason everyone 15 years ago thought that Google would produce the first game-changer AI.
OpenAI is getting stuff from Bing, but Microsoft controls that data.
> I think OpenAI is building a moat right in front of us through their plug-ins idea.
I think OpenAI is doing the opposite by massively degrading GPT-4 to support the load from all these integrations (as well as their new app). Their moat was the head-and-shoulders-above-their-competitors quality of GPT-4, which has now taken a huge nosedive. I'm not sure why anyone would pay for it instead of GPT-3.5 Turbo. It went from being a competent, if somewhat error-prone, coder to doing things like randomly inserting C# code blocks in text.
Unless I'm missing something, the spec you expose to ChatGPT only tells it which API endpoint to hit. The code powering that endpoint is not visible to ChatGPT.
The only argument you could make is that you are giving it machine-understandable text describing an API endpoint. In the future, it might not even need the text description if the API endpoint is named well. Quite a stretch, though.
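To make the point concrete, here is a minimal sketch of the separation being described: the model only ever sees a machine-readable interface description, never the server code behind it. The plugin name, endpoint path, and summaries below are invented for illustration, not a real plugin's spec.

```python
# Hypothetical OpenAPI-style fragment, as a Python dict. Only this
# description is exposed to the model; the endpoint's implementation
# stays on the plugin author's server.
plugin_spec = {
    "openapi": "3.0.1",
    "info": {"title": "Example Todo Plugin", "version": "1.0"},
    "paths": {
        "/todos": {
            "get": {
                "operationId": "listTodos",
                "summary": "Return the user's todo items",
            }
        }
    },
}

def describe_for_model(spec: dict) -> list[str]:
    """Flatten the spec into the endpoint summaries the model reasons over."""
    lines = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            lines.append(f"{method.upper()} {path}: {op['summary']}")
    return lines

print(describe_for_model(plugin_spec))
```

Everything the model can act on is in those flattened lines, which is why a well-named endpoint plus a one-line summary carries most of the signal.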
Do you find they are better than the opposite - ie an app using the GPT-4 API? To me the plugins seem back to front as far as optimal architecture is concerned.
1. single point of entry: you don't need ten different apps to achieve ten different outcomes. You can do it all right inside ChatGPT
2. Since I can have up to three plugins active inside ChatGPT (as of today), I can express more complex workflows than if I had one app each with no trivial way to stream data from one app to the next.
It's like Zapier for your ideas. You talk to ChatGPT > it extracts data from a plugin > you prod it along a bit more > it talks to a second plugin to do X > more chat > talk to third plugin etc etc.
Zapier itself only lets you flow data from one app to the next, not act on that data.
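The chaining described above can be sketched as a simple orchestration loop: one plugin's output becomes the next plugin's input, with the model deciding what to pass along (stubbed out here). Both plugin functions are hypothetical stand-ins, not real plugin APIs.

```python
# Hypothetical sketch of chaining plugins inside one chat session.

def weather_plugin(city: str) -> str:
    # Stand-in for plugin A: in reality an HTTP call to a weather API.
    return f"Sunny in {city}"

def email_plugin(body: str) -> str:
    # Stand-in for plugin B: acts on data produced by plugin A.
    return f"Draft email: {body}"

def chat_orchestrator(city: str) -> str:
    # Step 1: the model extracts data from the first plugin.
    forecast = weather_plugin(city)
    # Step 2: the model feeds that data into a second plugin,
    # acting on the data rather than just forwarding it.
    return email_plugin(f"Picnic is on. Forecast: {forecast}")

print(chat_orchestrator("Pune"))
```

The key difference from a Zapier-style pipeline is that the orchestrator can reason about the intermediate data before deciding what the next step should be.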
I expect that as the LLM evolves and matures, they will allow more plugins and start eating up different industries. E.g., why do you need a Google Drive when your ChatGPT can also store files for you?
>why do you need a Google Drive when your ChatGPT can also store files for you?
If the idea is to keep the actual files rather than summaries, then there's an entire world of requirements (data storage reliability, access control, auditing, integration) where OpenAI etc. have no competitive advantage, plus their own issues (e.g. prompt injections). LLMs are a bad fit whenever you need them to act like a computer; they replace human-style processing. For math, get a calculator.
In this case, I'd expect some way to interface to OneDrive or maybe even BackBlaze.