You can play with this in the browser.[1][2][3] And there's OpenCV.js.[4] But more often I simply put a small square of yellow gaffer's tape between my eyebrows and do color tracking.[5] Accurate, reliable, and cheap on CPU and battery.
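For the color-tracking route, here's a minimal sketch along the lines of the tracking.js color-camera demo.[5] The element IDs, frame size, and smoothing factor are placeholders; 'yellow' is one of tracking.js's built-in colors, which is why a yellow tape marker works without registering a custom color.

    <video id="cam" width="320" height="240" autoplay></video>
    <script src="tracking-min.js"></script>
    <script>
      var tracker = new tracking.ColorTracker(['yellow']);
      var head = { x: 0.5, y: 0.5 };           // normalized marker position

      tracker.on('track', function (event) {
        if (!event.data.length) return;        // marker not found this frame
        var r = event.data[0];                 // first detected yellow blob
        var cx = (r.x + r.width / 2) / 320;    // blob center, normalized to 0..1
        var cy = (r.y + r.height / 2) / 240;
        head.x += 0.3 * (cx - head.x);         // cheap low-pass smoothing
        head.y += 0.3 * (cy - head.y);
      });

      tracking.track('#cam', tracker, { camera: true });
    </script>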
And for expert UIs, one doesn't have to emulate reality. So you might exaggerate parallax, be non-linear or discontinuous, etc. There might be a much larger vocabulary available than merely "look through the small portal" and "make secondary content conditional on viewing angle". Maybe.
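As a sketch of what "exaggerate parallax, non-linearly" could mean in practice, continuing from the head variable above: map the smoothed head position onto a CSS perspective transform. The #scene element, the gain, and the cubic response curve are made-up tuning knobs, not anything from the linked demos.

    var GAIN = 60;                                   // degrees of tilt at full head offset
    var scene = document.getElementById('scene');    // hypothetical layered container

    function curve(v) {                              // non-linear: small head moves do little,
      return v * v * v;                              // large moves swing the view hard
    }

    function render() {
      var dx = (head.x - 0.5) * 2;                   // -1 .. 1
      var dy = (head.y - 0.5) * 2;
      scene.style.transform =
        'perspective(800px)' +
        ' rotateY(' + (curve(dx) * GAIN) + 'deg)' +
        ' rotateX(' + (-curve(dy) * GAIN) + 'deg)';
      requestAnimationFrame(render);
    }
    requestAnimationFrame(render);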
Why might you want this? Consider your primary "desktop" environment shifting into HMDs. But you still need to mix in screen time, to give your eyes a break. It would hypothetically be nice to have similar UI idioms on screen. Especially as your tooling becomes less and less like today's 2D tools.
[1] Old similar demo (broken on FF): http://auduno.github.com/headtrackr/examples/targets.html ; video: https://vimeo.com/44049736
[2] Current tracking-only demo: https://trackingjs.com/examples/face_camera.html
[3] Eye/gaze tracking (demos at bottom; for me they require good lighting, and beard removal): https://webgazer.cs.brown.edu/ using https://www.auduno.com/clmtrackr/examples/clm_video.html
[4] OpenCV.js: https://docs.opencv.org/master/d5/d10/tutorial_js_root.html
[5] Color tracking: https://trackingjs.com/examples/color_camera.html