As long as the cost of Nvidia hardware makes up the majority of the cost of AI, the cost of AI is entirely dependent on how much money Nvidia needs to extract to meet stock expectations.
Everything else is a tiny factor when it comes to compute.
The same is true of GPT-3.5-turbo compared to the GPT-3 models that preceded it.
They want everyone on GPT-4-turbo. It may also be a smaller (or otherwise more efficient) but more heavily trained model that is cheaper to run inference on.
This is seriously the laziest kind of analysis there is. It's like a freshman who memorizes the formulas for a test and tries to pattern-match existing concepts rather than understand the fundamentals. In this case you're pattern-matching the least important features of the existing trend. If you don't understand the difference between AI and crypto, I don't know what to tell you.
Ha, you sound exactly like my own consciousness. There are few things in life I feel as resolute about as this one. Dynamic types were a mistake; let's learn and move on. I often joke that if null references were the billion-dollar mistake (or whatever it's called), dynamic typing was the 100-billion-dollar mistake. And we're still living with it, but thank god for TypeScript etc. for finally saving us.
I guess what I don't understand about this is that sum types represent a state so ubiquitous in our lives that it's hard for me to go a day without seeing one. It's simply the state where something is either this, or that, or the other. A language without sum types would be like a language without records: it's missing a part of reality that shows up all the time.
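To make that concrete, here's a minimal Swift sketch (the type names are made up purely for illustration): a struct is a record, a value that has all of its fields at once, while an enum is a sum type, a value that is exactly one of its cases at a time.

    // A record (product type): every value has ALL of these fields.
    struct Contact {
        let name: String
        let email: String
    }

    // A sum type: every value is EXACTLY ONE of these cases.
    enum Delivery {
        case pickup
        case shipping
        case digital
    }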
Love this. Not only is it my favorite feature of Swift, but I often say Swift enums are my favorite feature of any programming language. They are just so elegant and powerful.
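For anyone who hasn't tried them, the power comes from two things working together: each case can carry its own associated values, and a switch over the enum must be exhaustive. A small sketch (the NetworkResult type here is hypothetical, just to show the shape):

    // Each case carries exactly the data that makes sense for it.
    enum NetworkResult {
        case success(data: String)
        case failure(code: Int, message: String)
        case offline
    }

    func describe(_ result: NetworkResult) -> String {
        // The compiler enforces exhaustiveness: add a new case to
        // NetworkResult and this switch stops compiling until it's handled.
        switch result {
        case .success(let data):
            return "Got \(data)"
        case .failure(let code, let message):
            return "Error \(code): \(message)"
        case .offline:
            return "No connection"
        }
    }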