
The writing is on the wall. From ARM to incredibly powerful emerging GPUs to growing interest in custom ASICs, the Intel instruction set's days are numbered.



I don't care what GPU my mobile phone runs. It's fast enough; I'm OK with it.

I also don't care about servers. I'm not a dev-ops/admin person.

On my laptop/desktop, where I DO care, I don't see Intel going away anytime soon.

Even if we switched to some ARM devices, it would be a regression (if for nothing else, because most Windows/macOS software would have to run under some kind of Rosetta-like translation layer for many years).

Besides, as soon as ARM CPUs approach the limits of various manufacturing processes, they will face the same issues Intel does. Moore's law is not coming back.


Do you think general-purpose applications are all going to run on GPUs or ASICs in the future? I wouldn't think so.



