Hacker News new | past | comments | ask | show | jobs | submit login

I sometimes wonder why Apple would do this. I mean: i) the end users frankly don't care. Seriously. ii) 90% (if not 99.99%) of devs would not (be able to) care.

I mean, I doubt that more than perhaps 5% of Apple's internal devs can take advantage of "tight integrat[ion] of new hardware and software." This integration probably takes the form of either some specific app (think the Pixel's camera), some specific library, or compiler optimizations. The first (a specific app) can be accomplished much more cheaply through add-on chips (guess what, that's what the Pixel does). The second and third can be done much more effectively through generally available chips (like, well, Intel's), since more people, from the vendor's engineers to researchers to random open source ninjas, would be able to experiment and help out.

In other words, from a purely technical point of view, there is absolutely zero reason to do this. Whatever happens, Intel is among the most capable chip producers, if not the best. And Apple is not "disrupting" (i.e. focusing on an unaddressed aspect), but merely competing head-on with Intel's core competencies. It's not Amazon entering retail against Walmart. It's Target competing against Walmart, except without Target's existing competencies. Which, again, makes no technical sense.

On the other hand, if they want to completely lock in users...




I don't think they'd attempt it unless they had something that would disrupt the market. Intel has been resting on its laurels without credible competition, and if Apple manages to steal a march on them with a new technology, that would be the impetus to bring it in-house.



