Bound to happen, so establish yourself as deeply as possible as quickly as possible. Once folks are hooked up to these APIs, there's a cost and friction to switching. This just feels like a land grab that OpenAI is trying to take advantage of by moving quickly.
Most of the clients I'm working with aren't interested in the base level of service. They are looking to further train the models for their specific use cases. That's a much higher barrier to switch than replacing an API. You've got to understand how the underlying models are handling and building context. This sort of customer is paying far more than the advertised token rates and is locked in more tightly.
Not really. Fine-tuning generally just involves running tailored training data through the model - the actual training algorithm is fairly generalized.
For example, the DreamBooth fine-tuning algorithm was originally designed for Google's Imagen, but was quickly applied to Stable Diffusion.
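To make that concrete, here's a minimal sketch of what "running tailored training data through the model" looks like with Hugging Face's Trainer; the base model, dataset path, and hyperparameters are placeholders rather than anything tied to a particular vendor's stack:

  from datasets import load_dataset
  from transformers import (AutoModelForCausalLM, AutoTokenizer,
                            DataCollatorForLanguageModeling, Trainer,
                            TrainingArguments)

  model_name = "gpt2"                          # stand-in base model
  tokenizer = AutoTokenizer.from_pretrained(model_name)
  tokenizer.pad_token = tokenizer.eos_token    # gpt2 has no pad token
  model = AutoModelForCausalLM.from_pretrained(model_name)

  # "Tailored training data": your own domain-specific text (placeholder path).
  dataset = load_dataset("text", data_files={"train": "my_domain_data.txt"})
  train_ds = dataset["train"].map(
      lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
      batched=True, remove_columns=["text"])

  trainer = Trainer(
      model=model,
      args=TrainingArguments(output_dir="ft-out", num_train_epochs=1,
                             per_device_train_batch_size=2),
      train_dataset=train_ds,
      data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
  )
  trainer.train()   # the loop itself is generic; only the data is tailored

The point is that nothing in the loop is specific to one provider; the value (and the lock-in) lives in the curated data and the resulting weights.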
Switching to another LLM isn't always about quality. Being able to host something yourself at equal or even lower quality might be preferable for cost or other reasons; in that case, there's no expectation that the "new" model will respond to another LLM's specific prompt style in a comparable way.
In a lot of cases you can swap models easily enough, but all the prompt tweaking you did originally will probably need to be redone against the new model's black box.
There would be less friction to switch if the implementations (which are still early enough) accounted for sending requests to multiple service providers, including ones that don't exist yet; something along the lines of the sketch below.
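A rough sketch of such a provider-agnostic layer; the class and method names here are hypothetical, and only the OpenAI call mirrors a real client library:

  from abc import ABC, abstractmethod
  import requests
  import openai

  class CompletionProvider(ABC):
      @abstractmethod
      def complete(self, prompt: str) -> str: ...

  class OpenAIProvider(CompletionProvider):
      def complete(self, prompt: str) -> str:
          resp = openai.ChatCompletion.create(
              model="gpt-3.5-turbo",
              messages=[{"role": "user", "content": prompt}],
          )
          return resp["choices"][0]["message"]["content"]

  class SelfHostedProvider(CompletionProvider):
      # Placeholder for a self-hosted or not-yet-existing provider
      # behind the same interface; the response shape is hypothetical.
      def __init__(self, endpoint: str):
          self.endpoint = endpoint
      def complete(self, prompt: str) -> str:
          r = requests.post(self.endpoint, json={"prompt": prompt})
          return r.json()["text"]

  # Application code depends only on the interface, so swapping providers
  # (or fanning out to several) is a configuration change, not a rewrite.
  def answer(question: str, provider: CompletionProvider) -> str:
      return provider.complete(question)

It doesn't remove the prompt-retuning problem, but it keeps the switching cost at the integration layer instead of scattered through the application.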
OpenAI has a view few others do: how broadly this type of product is actually being used. That is possibly the real lead, not just getting ahead and staying ahead, but seeing ahead.
And also, what people are actually asking it. Are they generating cover letters and getting resume help, analyzing last quarter's numbers, or getting programming help? That'll help OpenAI figure out which areas to focus on for later models, or which areas deserve specialized models.
ChatGPT is a massive success, but that means competitors will jump in at all costs, and that includes open source efforts.