Hacker News

The one area where I think Apple might be setting itself up well is the longer-term edge-device AI race. It's reasonable to think that AI will become more and more embedded in all aspects of computation and communication, and with that, it seems plausible that running inference at the edge — on devices with low power and compute capabilities relative to what's available in the cloud — will be a desirable deployment model for reasons of cost, bandwidth, latency, robustness, etc.

Obviously there's nothing of the sort in the works now, but it's not totally implausible that the expertise and infrastructure Apple is developing today could also position it to handle inference for things like home automation systems, industrial sensors and actuators, etc., all of which may need edge inference capabilities. Chips like the M series, or at the smaller end the Neural Engine/NPU, will presumably be coming to many devices outside the mobile/PC market, and it doesn't seem totally insane to me to think that Apple might want to get into that world.

Again, this is all just speculation about long-term pathways that may well turn out to be unlikely.




