
FTFY: ONLY 3 million times.

At the current pace of development, AI will catch up in a decade or less.




How does that math work out? The developments over the last year have been... abysmal? The hype and marketing bull is increasing exponentially, though.


Groq, which appeared 4 months ago, was an abysmal development for efficiency?


Look at the difference in token prices on their API between the first release of ChatGPT and the current one.

• Current 3.5-family price is $1.5/million tokens

• Was originally $20/million tokens, based on this quote: "Developers will pay $0.002 for 1,000 tokens — which amounts to about 750 words — making it 10 times cheaper" (i.e. ten times $0.002 per 1,000 tokens is $0.02 per 1,000, or $20/million) - https://web.archive.org/web/20230307060648/https://digiday.c...

(I can't find the original 3.5 API prices even on archive.org, only the Davinci etc. prices; the Davinci model prices were also $20/million.)
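
A quick back-of-the-envelope check of those figures (just a sketch; the $1.5 and $20 per-million numbers are the ones cited above, not pulled from OpenAI's current price list):

  # Pricing claims from the comment above
  quoted_per_1k = 0.002                                  # "$0.002 for 1,000 tokens" (Digiday quote)
  chatgpt_launch_per_million = quoted_per_1k * 1000      # $2 per million tokens at the API launch
  davinci_per_million = chatgpt_launch_per_million * 10  # "10 times cheaper" => older models ~ $20/million
  current_35_per_million = 1.5                           # current 3.5-family price cited above

  # Price drop from the $20/million baseline to today's $1.5/million
  print(davinci_per_million / current_35_per_million)    # ~13x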

There's also the observation that computers continue to get more power efficient. It's not as fast as Moore's Law was, but efficiency doubles roughly every 2.6 years, which works out to about a thousand-fold every 26 years, or roughly 30% per year.
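
For the doubling arithmetic in that last sentence, a quick sanity check (nothing measured, just the stated 2.6-year doubling assumption):

  # Efficiency doubling every 2.6 years
  years_per_doubling = 2.6
  annual_growth = 2 ** (1 / years_per_doubling) - 1        # ~0.30, i.e. about 30% per year
  factor_over_26_years = 2 ** (26 / years_per_doubling)    # 2**10 = 1024, about a thousand-fold
  print(annual_growth, factor_over_26_years)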


> How does that math work out?

He asked ChatGPT to do the math.


And they pretty much made up a number. It's a clickbaity headline for an article that is mostly about vector databases.



