Head/eye tracking as user input isn't really new in VR, but in the past it hasn't felt very good. The pinch gesture on the HoloLens was very unreliable. It's not natural for your gaze to affect the world around you, other than making eye contact with someone, perhaps. I think reaching out and gripping things is far more _natural_.
That said, the sensors and software in this headset might finally be up to the task. And despite being unnatural it might be easy enough to pick up.
I remember reading a first impressions post where they mentioned that starting off, they were instinctively reaching out and pinching the UI elements. They said it worked fine, because you're also looking at whatever you're pinching, and it only took a few minutes to adjust to keeping their hands in their lap.
But I found it promising that the Vision Pro can also see your hands in front of you, which should really help ease the learning curve and prevent first-time users from getting frustrated with the UI.
> I think reaching out and gripping things is far more _natural_.
Like when you grab a virtual object and it feels like you're grabbing absolutely nothing at all? This isn't natural and it's actually quite jarring in my experience.