
It's because you're using Geekbench for your numbers, which is a composite of misleading stats. It includes things like "encoding video" and "rendering HTML" - tasks the M1 has dedicated accelerators for, and which in the real world are handled by the NVIDIA RTX in my Dell with the CPU sitting at under 1% utilization. Yes, if you take tasks that in practice don't touch the CPU at all and run them on a CPU with special accelerators for them, the chip designed to game Geekbench's metrics will win. In the real world, I have a multi-gig dataset I need to process and run calculations on. Put a database on the M1 and see if it beats a Xeon. Or, for an easier test, load up Excel with lots of functions and lots of calculated rows and columns, and run that while also running a couple of VMs for virtual appliances on your laptop (128GB of ECC RAM helps with that).
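To make the point concrete, here's a minimal sketch of the kind of CPU-bound test I mean - a plain group-by aggregation over synthetic rows, with no media accelerator or GPU to offload to. The dataset shape and sizes are made up for illustration; scale `n` up to your actual multi-gig data to get a meaningful number.

```python
# Hypothetical CPU-bound microbenchmark: a group-by-key sum over
# synthetic rows, the kind of scan a database query or a spreadsheet
# recalculation actually performs. Nothing here can be offloaded to a
# video encoder or GPU, so it measures the CPU itself.
import random
import time


def make_rows(n, seed=42):
    """Generate n (value, key) rows deterministically for a repeatable run."""
    rng = random.Random(seed)
    return [(rng.random(), rng.randrange(100)) for _ in range(n)]


def aggregate(rows):
    """Sum values per key - a single-threaded full-table scan."""
    totals = {}
    for value, key in rows:
        totals[key] = totals.get(key, 0.0) + value
    return totals


if __name__ == "__main__":
    rows = make_rows(1_000_000)  # bump this toward your real dataset size
    start = time.perf_counter()
    totals = aggregate(rows)
    elapsed = time.perf_counter() - start
    print(f"aggregated {len(rows):,} rows into {len(totals)} groups "
          f"in {elapsed:.2f}s")
```

Run the same script on both machines and compare wall-clock times; unlike a composite benchmark score, there's no accelerator for either chip to hide behind.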

You're literally here saying the M1 is going to replace a server chip. Newsflash - the M1 can't even run most of the needed code, because it's an ARM chip, and ARM is still a small niche for that software.



