Hacker News

> I've started to realize that many people have been holding off on building things, waiting for "that next big update"

I’ve noticed this too — I’ve been calling it intellectual deflation. By analogy, why spend now when it may be cheaper in a month? Why do the work now, when it will be easier in a month?




Why optimise software today, when tomorrow Intel will release a CPU with 2x the performance?


Back when Intel regularly delivered 2x performance increases, people did make decisions based on the performance-doubling schedule.


Curiously, Moore's law was predictable enough over decades that you could actually plan for the speed of next year's hardware quite reliably.
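To make that concrete, here's a minimal sketch (with made-up numbers, assuming a fixed doubling period) of the kind of back-of-the-envelope projection that planning around Moore's law allowed:

```python
# Hypothetical illustration: projecting relative hardware performance under a
# Moore's-law-style doubling schedule. The doubling period is an assumption.
def projected_speedup(months_ahead: float, doubling_period_months: float = 24.0) -> float:
    """Relative performance after `months_ahead` months, assuming
    performance doubles every `doubling_period_months` months."""
    return 2.0 ** (months_ahead / doubling_period_months)

# Planning one year ahead with a 24-month doubling period:
print(round(projected_speedup(12), 3))  # ~1.414x, i.e. sqrt(2)
```

The point isn't the exact numbers, but that a stable exponent made the projection reliable enough to budget against; no comparable formula exists for LLM capability.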

For LLMs, we don't even know how to reliably measure performance, much less plan for expected improvements.


Moore's law became less of a prediction and more of a product roadmap as time went on. It helped coordinate investment and expectations across the entire industry, so everyone involved had the same understanding of timelines and benchmarks. I fully believe more investment would've 'bent the curve' of the trend line, but everyone was making money and there wasn't a clear benefit to pushing the edge further.


Or maybe it pushed everyone to innovate faster than they otherwise would have? I'm very interested to hear your reasoning for the other case, though; I'm not strongly committed to the opposite view, or to either view for that matter.


If Intel could do that, they would be the one with a 3 trillion market cap. Not Nvidia.


Call Nvidia, that sounds like a job for AI.




