The problem is that even the best apps attach all sorts of weird behaviors to unexpected buttons.
I've handed a headset to plenty of novices, and it's a train wreck: they accidentally push buttons every which way and completely destroy whatever setup you've done. My favorite is that devs like to make the "grip" buttons (on the side, where your middle finger rests) shift the frame or scale of the world (e.g. OpenBrush does this). It's super natural once you know what you're doing, but hand that to a novice and it's incredibly hard for them to learn not to grip the controllers. They cling to them like crazy, the whole world spins around them, and they immediately tell you they're disoriented and nauseated and hate VR.
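To make that concrete, here is a minimal sketch of the grip-to-move-the-world interaction described above. All names are hypothetical; OpenBrush and similar apps implement this against their own engine's input API, but the core idea is the same: while the grip is held, each frame's controller motion is applied to the world's root transform, so a novice who never releases the grip drags the whole world with every hand movement.

```swift
import simd

// Hypothetical sketch of a grip-drag "grab the world" tool.
struct GripWorldMover {
    private var previousControllerPose: simd_float4x4?
    var worldTransform = matrix_identity_float4x4

    mutating func update(gripHeld: Bool, controllerPose: simd_float4x4) {
        guard gripHeld else {
            previousControllerPose = nil  // grip released: world stops moving
            return
        }
        if let previous = previousControllerPose {
            // The controller's frame-to-frame motion is applied to the world,
            // so the world follows the hand as if grabbed. Held continuously
            // (as novices tend to do), this spins and drags the entire scene.
            let delta = controllerPose * previous.inverse
            worldTransform = delta * worldTransform
        }
        previousControllerPose = controllerPose
    }
}
```

The disorientation follows directly from the code path: there is no "neutral" grip state, so every twitch of a clenched hand becomes world motion.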
This could be solved with well-designed interface guidelines, enforced by the company building the headset. If anyone is going to do that, it's Apple.
I'm happy the Vision Pro uses gestures to control the interface, but I also agree with the parent poster that we'll need hardware controls someday. No matter how good the gesture recognizers get, waving your hands in space is never going to be as good as having physical controls and buttons attached to your hands.
> This could be solved with well-designed interface guidelines, enforced by the company building the headset. If anyone is going to do that, it's Apple.
I've no doubt there will be tons of third-party, and possibly even first-party, peripherals for it, but treating gesture interaction as the baseline to design against was the right call. It's the number one thing that will help adoption right now, if done right. The callout in the keynote that each new Apple product is built around a particular interaction paradigm (mouse/keyboard, then clickwheel, then multitouch, now gesture) makes it seem obvious that this is the natural evolution of HCI.
> No matter how good the gesture recognizers get, waving your hands in space is never going to be as good as having physical controls and buttons attached to your hands.
I thought this as well 5 years ago, but pose detection and inverse kinematics have come light-years since then thanks to ML. I'm fully confident that what Apple is shipping with the Vision Pro will match or exceed the 6DOF tracking fidelity of the gen-1 Oculus constellation-based tracking. The only remaining problem is occlusion when reaching outside the cameras' FOV, and it's hard to imagine a solution to that without controllers or an outside-in tracking setup.
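As a rough illustration of why occlusion is the hard remaining case: when a hand leaves the cameras' view, the tracker's confidence drops and the app must decide what to do with the last known pose. This is a hedged sketch (not Apple's actual pipeline; all types and thresholds here are made up) of the usual mitigation, briefly dead-reckoning before giving up:

```swift
import simd

// Hypothetical tracking state for one hand.
struct TrackedHand {
    var pose: simd_float4x4
    var velocity: SIMD3<Float>  // estimated from recent frames
    var confidence: Float       // 0 = lost, 1 = fully tracked
}

// Returns a usable pose, or nil once the hand has been occluded too long.
func resolvePose(_ hand: TrackedHand, secondsSinceLost dt: Float) -> simd_float4x4? {
    if hand.confidence > 0.5 { return hand.pose }  // cameras still see it
    if dt < 0.25 {
        // Extrapolate along the last known velocity. This papers over short
        // occlusions but drifts quickly, which is why controllers with their
        // own IMUs still win once the hand is outside the FOV.
        var p = hand.pose
        p.columns.3 += SIMD4<Float>(hand.velocity * dt, 0)
        return p
    }
    return nil  // occluded too long: better to hide the hand than to guess
}
```

A controller sidesteps this entirely because its onboard IMU keeps reporting motion even when no camera can see it.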