
> This could be solved by having well-designed interface guidelines and having those guidelines enforced by the company building the headset. If anyone is going to do that, it's Apple.

I've no doubt there will be tons of third-party, and possibly even first-party, peripherals for it, but treating gesture interaction as the baseline to design against was the right call. It's the single biggest thing that will help adoption right now, if done right. The callout in the keynote about each new Apple product being built around a particular interaction paradigm (mouse/keyboard, then click wheel, then multitouch, now gesture) makes this seem like the natural evolution of HCI.

> No matter how good the gesture recognizers get, waving your hands in space is never going to be as good as having physical controls and buttons attached to your hands.

I thought this as well 5 years ago, but pose detection and inverse kinematics have come light-years since then thanks to ML. I'm fully confident that what Apple is shipping with Vision Pro will be equal to or better than the 6DOF control fidelity of the gen 1 Oculus constellation-based tracking. The one remaining problem is occlusion when your hands leave the cameras' FOV, for which it's hard to imagine a solution that doesn't involve controllers or an outside-in tracking setup.
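For anyone unfamiliar with the inverse-kinematics half of that pipeline: once a pose estimator gives you a fingertip or wrist position, IK works backwards to the joint angles that put the limb there. A minimal sketch of the classic closed-form solution for a two-link planar arm (link lengths and target point are hypothetical; real hand trackers solve a much higher-DOF, learned version of this):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Given a target (x, y) and link lengths l1, l2, return
    (shoulder, elbow) joint angles in radians that reach it."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # clamp: unreachable targets saturate
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to target minus the forearm's angular offset.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

The ML-based systems in modern headsets effectively learn a regularized version of this inversion for a ~26-DOF hand, which is why they degrade gracefully on partial occlusion instead of snapping to wrong solutions the way analytic IK does.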
