Right... so this thing hooked up to a centralized AI, deciding what people see and when, how often they see it, what they can express, and how they feel... Do people really want to live in the Matrix? I have a visceral reaction to this idea.
"Where a user looks stays private while navigating Apple Vision Pro, and eye tracking information is not shared with Apple, third-party apps, or websites. Additionally, data from the camera and other sensors is processed at the system level, so individual apps do not need to see a user’s surroundings to enable spatial experiences."
Unless the code is fully free software that has been audited by enough people, and there is proof that the binaries on the device correspond to that source code, this means nothing.
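The "proof the binaries correspond to the source" part is what reproducible builds aim at: anyone can compile the audited source themselves and compare cryptographic digests with the binary actually shipped. A minimal sketch of that comparison (file names here are hypothetical placeholders, not anything Apple actually exposes):

```shell
# Reproducible-builds check, in miniature:
# "device_binary" stands in for the binary extracted from the device,
# "local_build" for the artifact you compiled from the audited source.
printf 'binary' > device_binary
printf 'binary' > local_build   # built deterministically from the same source

# Compare SHA-256 digests; a match means the shipped binary
# is byte-for-byte what that source produces.
sha256sum device_binary local_build
```

This only works if the toolchain is deterministic (no embedded timestamps, stable build paths), which is exactly what projects like Debian's reproducible-builds effort work toward, and what a closed platform gives you no way to do.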
Apple’s buzzwords yesterday were “on-device processing” and “on-device X” in general. They have invested heavily for years in shipping ML chips in their devices and don’t seem to be stopping; if anything, they’re expanding their use.
Is there anything definitively claiming this would be any different?