Except Apple has stated they will encrypt that information and not supply it to apps, to avoid targeting or fingerprinting. And second, how would that help someone with ALS?
> And second how would that help someone with ALS?
They won't share information on where you look, but they will share info on where you 'click', which is what's used to navigate apps. IIRC you are supposed to use your fingers for this and other actions, but I imagine that Apple's accessibility team will have an alternate mode for people with motor limitations. It could be a long blink or rapid blinking, for example.
Also donning the AVP headset might not be possible for someone with advanced ALS. A fixed outside-in apparatus (probably attached to the patient’s wheelchair) would make more sense.
There is a bunch of hardware out there for estimating gaze vectors; the problem is that it relies on a generalised model of the eye. With per-user fine-tuning (calibration) you can go from >5 degrees of error down to 1-2 degrees.
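To illustrate the kind of fine-tuning I mean: a minimal sketch, assuming the generic model's error is dominated by a per-user systematic bias (scale + offset), which a short calibration routine (user fixates a few known targets) can correct with a least-squares affine fit. The numbers and error model here are made up for illustration, not from any real eye tracker.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known fixation targets shown during calibration, in degrees (yaw, pitch).
targets = rng.uniform(-20, 20, size=(9, 2))

# Simulated generic-model output: a hypothetical per-user scale/offset
# error plus small noise, giving several degrees of systematic error.
true_A = np.array([[1.1, 0.05], [-0.03, 0.95]])
true_b = np.array([3.0, -2.0])
predicted = targets @ true_A.T + true_b + rng.normal(0, 0.2, size=targets.shape)

# Fit the correction: targets ~= predicted @ W + c, in homogeneous form.
X = np.hstack([predicted, np.ones((len(predicted), 1))])
coef, *_ = np.linalg.lstsq(X, targets, rcond=None)

def correct(gaze):
    """Apply the fitted per-user correction to raw model output."""
    g = np.atleast_2d(gaze)
    return np.hstack([g, np.ones((len(g), 1))]) @ coef

# Mean angular error before vs. after calibration.
err_before = np.linalg.norm(predicted - targets, axis=1).mean()
err_after = np.linalg.norm(correct(predicted) - targets, axis=1).mean()
```

In this toy setup the affine fit removes the systematic bias and the residual drops to roughly the noise floor; real calibration also has to handle nonlinear distortions and headset slippage, which is why dedicated calibration routines exist.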