
That sounds like exactly the kind of thing the board of a non-profit should be preventing.



As an employee of a company, I trade my time and effort for some amount of rewards. I enter deals with the expectation of stability from the other party.

My unfounded Internet opinion: OpenAI just removed or reduced a large source of reward and has shown fundamental instability. OpenAI's success is very much tied to employees and compute.


If your goal is to work for a profit-sharing company, then don't work for a non-profit.


Plenty of non-profits give a lot of money to employees. There is nothing stopping non-profits from paying exorbitant sums to their employees, and executives often do get paid exorbitantly. Non-profit status just means they don't pay out profits to investors; in practice it is often used as a grift to get people to work for less, so the people at the top make more money and fundraise for their pet projects.


The employees work for the for-profit part of OpenAI.


That is owned by a non-profit organization. It seems like a lot of the employees are chasing money and forgetting that it's fundamentally not trying to maximize profit. Of course, Sam seems to have perverted its mission into exactly that (serving as the latest high-priest of mammon, like Elias served Lillith).


Yeah I mean, who cares if ASI kills us all as long as a couple hundred of the most well-paid people on the planet get even more rich.

It's insane to see all these takes when we don't even know what caused the loss of trust in the first place.


No one sincerely believes they have, or will soon achieve, AGI. Nor can they believe both that the CEO could push them to achieve it and force its release, and that without him around they would develop it responsibly (whatever that may mean).


Great summary.

We are very complicated creatures and things get out of control, both internally and externally. My armchair opinion is that they started to believe that all of it is so advanced and important that they lost a bit of their grip on reality. Sutskever imagining a planet covered with data centers and solar panels shows me that [0]. Every single person's view is limited, and I get a strange feeling listening to him in this video. Also, they are not the only people left on this planet. Fortunately, the task of creating AI/AGI is not a job for a pack of ten trying to save us from harm. Still, it may, and probably will, get rough. /rant

[0] https://www.youtube.com/watch?v=9iqn1HhFJ6c


Your second paragraph is pretty ironic given your first.


> Yeah I mean, who cares if ASI kills us all as long as a couple hundred of the most well-paid people on the planet get even more rich.

Creating ASI for money seems particularly asinine, as the machine overlords won't care terribly much about dollars.


How do you know what ASI will value?


As an employee of a Bay Area tech company, presumably, where a mid-level IC can make as much money as a C-suite executive in some less prominent industry*


Well, they're almost certainly 'not profiting' right now.



