> ChatGPT API is gpt-3.5-turbo, GPT-4 API is GPT-4.

The OpenAI API has a "chat" endpoint, and through that same endpoint you can pick between gpt-3.5-turbo and gpt-4.

The ChatGPT web frontend app also lets you pick if you're a Plus subscriber.

I've seen this confusion in a few HN threads now, and it's not a good idea to use "ChatGPT API" as a stand-in for 3.5-turbo just because 3.5-turbo was what was available on the endpoint when OpenAI released a blog post using the term "ChatGPT API". That blog post is frozen in time, but the model is versioned, and the chat API is orthogonal to the model version.

"ChatGPT API" is a colloquial term for the chat stuff on the OpenAI API (vs. the models available under the text completions API), which offers both models. The only precise way to talk is to specify the version at this point.

Funnily enough, OpenAI's own pricing page splits GPT-3.5 and GPT-4 under the headings "Chat" and "GPT-4":

https://openai.com/pricing

Though I think the bulk of the confusion just comes from the fact that http://chat.openai.com/chat presents two very different views on the free vs. paid tiers.

The paid tier makes it obvious that ChatGPT has swappable models. The free tier hides it by dropping you right into conversation with the one model.


That is why I clarified "ChatGPT/gpt-3.5-turbo" at the beginning of my discussion.

Nowadays the confusion is driven more by AI thought leaders optimizing clickthroughs by intentionally conflating the terms than by OpenAI's initial ambiguous terminology.
