It really puts into perspective how much of a meme economy we live in. I just read an article saying the global lithium market will be worth $15bn a year in 2030, when we're at peak battery demand. This company is planning to spend $1bn just this year to run some numbers through someone else's algorithm, and people have given them $1bn in cash for that.
Clearly it's all bullshit. There's no way they need that much and somebody will be siphoning it all off.
> This company is planning to spend 1bn just this year in order to run some numbers through someone else's algorithm.
“Just”? Reductive mischaracterizations like this are not useful. It looks like a rhetorical technique. What is the actual argument?
It doesn’t matter much “whose” algorithm it is or isn’t, unless IP is important. But in these areas, the ideas and algorithms underlying language models are out there. The training data is available too, for varying costs. Some key differentiators include scale, timeliness, curation, and liability.
> Clearly it's all bullshit. There's no way they need that much and somebody will be siphoning it all off.
There is plenty of investor exaggeration out there. But what percentage of your disposable money would you put on the line to bet against it? And on what timeframe?
If I had $100M of disposable wealth, I would definitely not bet against some organizations in the so-called AI arms race becoming big winners.
Again, I’m seeing the pattern of overreaction to perceived overreaction.
1. A business's value is related to profits or potential profits. If I put a dollar in, how many do I get out? What's the maximum number of dollars I can put in?
2. The farther away you are from an end customer, the lower your profits tend to be, unless you have a moat or demand for your product is inelastic.
Lithium is far from customers, and while demand for cheap lithium is high, there are lots of applications that will opt for some other way to provide power if the price gets too high (see the toy elasticity sketch below).
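To make point 2 concrete, here is a toy Python sketch of price elasticity. The constant-elasticity demand curve and all the numbers are mine, purely for illustration, not anything from an actual lithium market model:

```python
# Toy constant-elasticity demand: qty = base_qty * (price / base_price) ** -e
# With elasticity e > 1 (buyers readily switch to substitutes), raising the
# price *reduces* total revenue, which is what caps a commodity's pricing power.
def revenue(price: float, base_price: float, base_qty: float, elasticity: float) -> float:
    qty = base_qty * (price / base_price) ** -elasticity
    return price * qty

print(revenue(10.0, 10.0, 1000.0, 1.5))  # 10000.0 at the baseline price
print(revenue(12.0, 10.0, 1000.0, 1.5))  # ~9128.8: a 20% price hike loses revenue
```

With elasticity below 1 the same hike would raise revenue, which is why "inelastic demand" and "moat" are the two escape hatches in point 2.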
We can infer from publicly available information. BLOOM[0] was trained for four months on 384 A100 80GB GPUs, excluding architecture search. They specifically indicate (on the Hugging Face page):
> Estimated cost of training: Equivalent of $2-5M in cloud computing (including preliminary experiments)
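That quoted range is easy to sanity-check from the disclosed hardware. A quick back-of-the-envelope in Python; the $2-4 per A100-hour cloud rate is my assumption, not a figure from the model card:

```python
# Known from the BLOOM model card: 384 A100 80GB GPUs, ~4 months of training.
# ASSUMPTION (mine): on-demand cloud A100s at roughly $2-4 per GPU-hour.
gpus = 384
days = 4 * 30  # ~4 months
gpu_hours = gpus * days * 24  # ~1.1M GPU-hours

for rate in (2.0, 4.0):
    print(f"${rate:.0f}/GPU-hour -> ${gpu_hours * rate / 1e6:.1f}M")
# $2/GPU-hour -> $2.2M
# $4/GPU-hour -> $4.4M  -- consistent with the quoted $2-5M
```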
You can see from the training loss[1] that it was still learning at a good rate when it was stopped. Capability gains typically correlate well with decreases in perplexity.
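For anyone unfamiliar with the link between the two metrics: perplexity is just the exponential of the per-token cross-entropy loss, so the loss curve and the perplexity curve tell the same story. A minimal sketch, with illustrative loss values rather than BLOOM's actual numbers:

```python
import math

# Perplexity is exp(per-token cross-entropy loss), so small drops in loss
# compound into noticeable drops in perplexity.
def perplexity(cross_entropy_loss: float) -> float:
    return math.exp(cross_entropy_loss)

print(perplexity(2.3))  # ~9.97
print(perplexity(2.0))  # ~7.39 -- a 0.3 loss drop cuts perplexity by ~26%
```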
This leads many to believe that GPT-4 was trained for vastly more GPU-hours, as also suggested by OpenAI's CEO[2], especially since its training also included images, unlike BLOOM's.
You’re comparing single-year market value of a commodity that is dug straight out of an open pit with multi-year capital investment into one of the most advanced technologies the human race has created, currently offered by a single company? I’m not sure where to begin with that.
How much money do you think it takes to finance and build a lithium mine? How much capital investment is there in lithium right now? A lot.
The $ is distributed sovereignty - there's a constant tension between value per dollar, and money as denoting hierarchy. Did Louis XIV provide value equivalent to all those diamonds?
And AI is delivering on a lot of different planes right now. This shit is real on a practical and spiritual level. It's not every day that we get to participate in giving birth to a new form of life.