
This should be higher voted. Seems like an internal power struggle between the more academic types and the commercially minded side of OpenAI.

I bet Sam goes and founds a company to take on OpenAI…and wins.




Yes, and wins with an inferior product. Hooray /s

If the company's 'Chief Scientist' is this unhappy about the direction the CEO is taking the company, maybe there's something to it.


Because the Chief Scientist let ideology overrule pragmatism. There is always a tension between technical and commercial. That’s a battle that should be fought daily, but never completely won.

This looks like a terrible decision, but I suppose we must wait and see.


OpenAI is a non-profit research organisation.

Its for-profit (capped-profit) subsidiary exists solely to enable competitive compensation for its researchers, ensuring they don't have to worry about the opportunity cost of working at a non-profit.

They have a mutually beneficial relationship with a deep-pocketed partner who can perpetually fund their research in exchange for exclusive rights to commercialize any ground-breaking technology they develop and choose to allow to be commercialized.

Aggressive commercialization is at odds with their raison d'être and they have no need for it to fund their research. For as long as they continue to push forward the state of the art in AI and build ground-breaking technology they can let Microsoft worry about commercialization and product development.

If a CEO is not just distracting but actively hampering an organisation's ability to fulfill its mission then their dismissal is entirely warranted.


It seems Microsoft was totally blindsided by this event. If true, then trillion-dollar-plus Microsoft will now be scrutinizing the unpredictability and organizational risk of being dependent on the unknown, random, powerful, and passionate Ilya and a board vehemently opposed to the trajectory led by Altman. One solution would be to fork OpenAI and its efforts: one side with the vision led by Ilya, the other with Sam's.


I don't think you know what intellectual property is.


It seems you have jumped to many conclusions in your thinking process without any prompting in your inference. I would suggest lowering your temperature ;)


One doesn't simply 'fork' a business unless it has no/trivial IP, which OpenAI does not.


Forked:

https://twitter.com/satyanadella/status/1726509045803336122

"to lead a new advanced AI research team"

I would assume that Microsoft negotiated significant rights with regards to R&D and any IP.


I wouldn't call starting from zero forking


What is starting from zero exactly?


Even a non-profit needs to focus on profitability, otherwise it won't exist for very long. All 'non-profit' means is that it's prohibited from distributing its profits to shareholders; owning a non-profit doesn't pay you. The non-profit itself still wants, and is trying, to generate more than it spends.


I addressed that concern in my third paragraph.


>They have a mutually beneficial relationship with a deep-pocketed partner who can perpetually fund their research in exchange for exclusive rights to commercialize any ground-breaking technology they develop and choose to allow to be commercialized.

Isn't this already a conflict of interest, or a clash, with this:

>OpenAI is a non-profit research organisation.

?


> ?

"OpenAI is a non-profit artificial intelligence research company"

https://openai.com/blog/introducing-openai


Yeah! People forget who we're talking about here. They put TONS of research in at an early stage to ensure that illegal thoughts and images cannot be generated by their product. This prevented an entire wave of mental harms against billions of humans that would have been unleashed otherwise if an irresponsible company like Snap were the ones to introduce AI to the world.


As long as truly "open" AI wins, as in fully open-source AI, then I'm fine with such a "leadership transition."


this absolutely will not happen, Ilya is against it


Yeah if you think a misused AGI is like a misused nuclear weapon, you might think it’s a bad idea to share the recipe for either.


> This looks like a terrible decision

What did Sam Altman personally do that made firing him such a terrible decision?

More to the point, what can't OpenAI do without Altman that they could do with him?


> What did Sam Altman personally do that made firing him such a terrible decision?

Possibly the board instructed "Do A" or "Don't do B" and he went ahead and did do B.


This is what it feels like -- board is filled with academics concerned about AI security.


You're putting a lot of trust in the power of one man, who easily could have the power to influence the three other board members. It's hard to know if this amounts more than a personal feud that escalated and then got wrapped in a pretty bow of "AI safety" and "non-profit vs profits".


You can’t win with an inferior product here. Not yet, anyway. The utility is in the usefulness of the AI, and we’ve only just gotten to the point of being genuinely useful for daily workflows. This isn’t an ERP-type thing where you outsell your rivals on sales prowess alone. This is more like the iPhone 3 just got released.


Inferior product is better than an unreleased product.


Does ChatGPT look unreleased to you?


Maybe.

But Altman has a great track record as CEO.

Hard to imagine he suddenly became a bad CEO. Possible. But unlikely.


Where is this coming from? Sam does not have a "great" record as a CEO; in fact, he barely has any record at all. His fame came from running YC and then the skyrocketing of OpenAI. He is great at fundraising, though.


wat

the guy founded and was CEO of a company at 19 that sold for $43M


> As CEO, Altman raised more than $30 million in venture capital for the company; however, Loopt failed to gain traction with enough users.

It is easy to sell a company for $43M if you raised at least $43M. Granted, we don't know the total amount raised, but it certainly isn't the big success you are describing. And I already mentioned that he is good at corporate sales.


According to Crunchbase, Loopt raised $39.1M.


How many years did it take to go from 39 million to 43 million in value? Would've been better off in bonds, perhaps.

This isn't a success story, it's a redistribution of wealth from investors to the founders.


Ah, the much-sought-after 1.1X return that VCs really salivate over.


> he is good in corporate sales

Which is a big part of being a great CEO


It is a big part of start-up culture and getting seed liquidity. It doesn't make you a great long-term CEO, however.


A CEO should lead a company not sell it.


> It is easy to sell a company for $43 if you raised at least $43

I'm curious - how is this easy?


Ah yes the legendary social networking giant loopt


Loopt was not a successful company, it sold for more or less the same capital it raised.


or alternatively: altman has the ability to leverage his network to fail upwards

let's see if he can pull it off again or goes all-in on his data-privacy-nightmare / shitcoin double-whammy


Train a LLM exclusively on HN and make it into a serial killer app generator.


This. I would like my serial killer to say some profound shit before he kills me.


"should have rewritten it in rust" bang


Worldcoin is a great success for sure…!

The dude is quite good at selling dystopian ideas as a path to utopia.


I don't see it. Altman does not seem hacker-minded and likely will end up with an inferior product. This might be what led to this struggle. Sam is more about fundraising and getting the word out there but he should keep out of product decisions.


Brockman is with Sam, which makes them a formidable duo. Should they choose to, they will offer stiff competition to OpenAI but they may not even want to compete.


For a company to be as successful as OpenAI, two people won't cut it. OpenAI arguably has the best ML talent at the moment, and talent attracts talent. People come for Sutskever, Karpathy, and the like -- not for Altman or Brockman.


Pachocki, Director of Research, just quit: https://news.ycombinator.com/item?id=38316378

Real chance of an exodus, which will be an utter shame.


Money attracts talent as well. Altman knows how to raise money.

2018 NYT article: https://www.nytimes.com/2018/04/19/technology/artificial-int...


According to one of the researchers who left (Simon), the engineering piece is more important, and many of their best engineers leading GPT-5 and ChatGPT left (Brockman, Pachocki, and Simon).


Who is "Simon"? Link to a source re: the departure?



Money also attracts talent. An OpenAI competitor led by the people who led OpenAI to its leading position should be able to raise a lot of money.


Money also attracts various "snout in the trough" types who need to get rid of anyone who might challenge them on their abilities or merits.


Well, good thing we are in an open economy where anyone can start their own AI thing and no one wants to prevent them from doing that… I hope you see the /s.


Literally ask around for a billion dollars, how hard can it be?


Maybe now he'll focus on worldcoin instead?


I bet not (we could bet with play money on manifold.markets; I would put it at 10% probability), because you need the talent, the chips, the IP development, the billions. He could get the money, but the talent is going to be hard to attract unless he has a great narrative.


I'll sell my soul for about $600K/yr. Can't say I'm at the top of the AI game but I did graduate with a "concentration in AI" if that counts for anything.


> I'll sell my soul for about $600K/yr.

If you're willing to sell your soul, you should at least put a better price on it.


Many sell their souls for $60k/yr; souls aren't that expensive.


Your soul is worth whatever you value it at.


That is "normal"/low-end IC6 pay at a tech company; the ML researchers involved here are pulling well into the millions.


Your comment is close to dead, even though you're stating publicly known facts.

It shows that the demographic here is out of touch with its own market compensation value.


People here love to pretend $100k is an outstanding overpay.


It's definitely alien to me. How do these people get paid so much?

* Uber-geniuses that are better than the rest of us pleb software engineers

* Harder workers than the rest of us

* Rich parents -> expensive school -> elite network -> amazing pay

* Just lucky


Most companies don't pay that, step 1 is identifying the companies that do and focusing your efforts on them exclusively. This will depend on where you live, or on your remote opportunities.

Step 2 is gaining the skills they are looking for. Appropriate language/framework/skill/experience they optimize for.

Step 3 is to prepare for their interview process, which is often quite involved. But they pay well, so when they say jump, you jump.

I'm not saying you'll find $600k as a normal pay, that's quite out of touch unless you're in Silicon Valley (and even then). But you'll find (much) higher than market salary.


By being very good. Mostly the "uber-genius" thing, though I wouldn't call them geniuses. There's a bit of the harder-working part too, but it's quite minor, and of course sometimes you benefit from being in the right place at the right time (luck). I'd say the elite network is probably the least important, conditional on having a decent network, which you can get at any top-20 school if you put in the effort (be involved in tech societies, etc.).


Isn't his narrative that he is basically the only person in the world who has already done this?


No, Sutskever and colleagues did it. Sam sold it. Which is a lot, but is not doing it.


You mean the narrative of having bait-and-switched actual scientists into implementing the thing under the guise of a non-profit?


"I'll pay you lots of money to build the best AI" is a pretty good narrative.


The abrupt nature and accusatory tone of the letter makes it sound like more was going on than disagreement. Why not just say, “the board has made the difficult decision to part ways with Altman”?


> Why not just say, “the board has made the difficult decision to part ways with Altman”?

That's hardly any different. Nobody makes a difficult decision without any reason, and it's not like they really explained the reason.


There is a very big difference between publicly blaming your now-ex-CEO for basically lying ("not consistently candid") and issuing a polite parting message citing personal differences or whatever. To attribute direct blame to Sam like this, something severe must have happened. You only do this to your ex-CEO when you are very pissed.


From all accounts, Altman is a smart operator, so the whole story doesn’t make sense. Altman, the prime mover, doesn’t have sufficient traction with the board to protect his own position and lets a few non-techies boot him out?


Well connected fundraiser - obviously.

But…smart operator? Based on what? What trials has he navigated through that displayed great operational skills? When did he steer a company through a rocky time?


I have no problem with getting rid of people obsessed with profits and shareholder gains. Those MBA types never deliver any value except for the investors.


>I bet Sam goes and founds a company to take on OpenAI…and wins.

How? Training sources are much more restricted now.


Define "wins".



