
Loved the lone footnote defining their view of AGI:

> A highly autonomous system that outperforms humans at most economically valuable work

Holy goalpost shift, Batman! This is both broader and much weaker than what I'd been led to believe from statements by this company, including by Altman himself.




I think that's been their working definition of AGI for a while, actually.


Every tweet, press release, etc. I've seen has defined their AGI as "a system that can think like humans, or better than humans, in all areas of intelligence." That's the public's view of it as well - surely you can see how burying their "working" definition in a footnote like this, apart from the hype they drum up publicly, is a bit misleading, no?

A cursory search yields stuff like this:

https://www.theverge.com/2024/12/4/24313130/sam-altman-opena...


The footnote is aligned with what Sam Altman had been saying in most interviews until recently. I was actually surprised to see it, since they have shifted how they talk about AGI.



