
> LLM generated snippets and suggestions everywhere but it was there at least since 2019

Absolutely not. Note that, e.g., Google's AI answers are not from an LLM, and they're very proud of that.

> So they have had internal APIs for this for quite some time.

We did not have internal or external APIs for "chat completions" (chat messages, roles, JSON schemas) until after OpenAI shipped theirs.
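For anyone unfamiliar with the shape being discussed: a "chat completions" request is a list of role-tagged messages, optionally constrained by a JSON schema for structured output. A minimal sketch following OpenAI's public API field names (the model name and schema are illustrative, not from any internal Google system):

```python
# Sketch of an OpenAI-style chat-completions request body.
# Field names follow OpenAI's public API; values are purely illustrative.
request = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "system", "content": "You are a terse assistant."},
        {"role": "user", "content": "Summarize transformers in one line."},
    ],
    # Structured output: constrain the reply to match a JSON schema.
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "summary",
            "schema": {
                "type": "object",
                "properties": {"summary": {"type": "string"}},
                "required": ["summary"],
            },
        },
    },
}
```

The point is that this whole shape (roles, message lists, schema-constrained output) is what didn't exist as an internal or external Google API before OpenAI popularized it.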

> Did you work there?

Yes

> What do you base this on?

The fact that it was under lock and key. You had to jump through several layers of approvals just to get access to a standard text-completion GUI, never mind an API.

> has been experimenting with LLMs internally ever since the original paper,

What's "the original paper"? Are you calling BERT an LLM? Do you think transformers implied "chat completions"?

> that would make them want different things in their public API as well.

It's a nice theoretical argument.

If you're still convinced Google had a conversational LLM API before OpenAI, or if we need to quibble over every point because I might be implying Google didn't invent transformers, there's a much more damning fact:

The API is Gemini-specific and was released with Gemini, ~December 2023. There's no reason for it to be so different other than NIH and proto-based thinking. It's not great. That's why, e.g., we see the other comment pointing out that Google Cloud built out a whole other API and framework that can be used with OpenAI's Python library.



