
Yes. And it could potentially diminish OpenAI/MS.

Once everyone can do it, OpenAI's value would evaporate.




Once every human has access to cutting-edge AI, that ceases to be a differentiating factor, so human talent will again be the determining factor.


And the content industry will grow ever more addictive and profitable, with content curated and customized specifically for your psyche. Among all the tech giants, Meta happens to be the one positioned to benefit most from that growth.


> Once everyone can do it, OpenAI's value would evaporate.

If you take OpenAI's charter statement seriously, the tech will make most humans' (economic) value evaporate for the same reason.

https://openai.com/charter


> will make most humans' (economic) value evaporate for the same reason

With one hand it takes, with the other it gives: AI will be in everyone's pocket, superhumanly capable of serving our needs. The thing is, you can't copy a billion dollars, but you can copy a LLaMA.


> OpenAI’s mission is to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity. We will attempt to directly build safe and beneficial AGI, but will also consider our mission fulfilled if our work aids others to achieve this outcome.

No current LLM meets that bar, and Transformers may always be too sample-inefficient to get there.

But if anyone does make such a thing, OpenAI won't mind… so long as the AI is "safe" (whatever that means).

OpenAI has been consistent in saying that safety includes assuming weights are harmful until proven safe, because you cannot un-release a harmful model; other researchers say the opposite, on the grounds that white-box access makes safety research easier and more reproducible.

I lean towards the former, not because I fear LLMs specifically, but because the irreversibility, combined with the fact that we don't know how close or far we are, means this is a habit we should turn into a norm before it becomes urgent.


Very similar to Tesla and EVs


...like an open balloon.



