
> I’m not an apple person but I do think the eye tracking to focus / pinch the air to activate combo is very natural and significantly different to what Meta, HTC etc are doing. I think this might be the turning point for AR/VR.

From getting started in 2015 with the DK2, I've held from the very beginning that controllers are a dead end for VR. I've given countless demos where people are just completely confused because it's so unnatural and awkward, and it adds to the overall cognitive load to the point of destroying presence. Sure, competent gamers can figure it out in a few minutes, but that's not the point. You don't interact with the world using triggers and buttons. You interact with the world by interacting with it. The controllers need to go away.




You don't interact with the real world with gaze and pinching. Do you think this gesture is also a dead end?

For me, controllers are bad simply because your hands are full. You're giving up interacting with the real world to interact virtually. Figuring out a remote control isn't insurmountable for most of the population, but no one wants to hold a controller all day.

The lack of haptics and discrete button feedback is an issue with hand tracking, though.


Don’t we interact via gaze and pinching every time we pick up a small object?


Where you must look at the object directly and perform a pinch click away from it? No.


Reads a bit like when people would complain about having to use a mouse back in the late 80s / early 90s.

There’s a ton of stuff you’ll never be able to do in VR without some kind of controller with buttons and joysticks. Making the controllers smaller, lighter, more ergonomic, and fairly standardized is the natural path forward. People who grow up with VR and games won’t have a problem.


it's a huge adoption barrier though

The problem is the best apps attach all sorts of weird things to weird buttons.

I've let plenty of novices try a headset, and it's a train wreck: they accidentally push buttons every which way and completely destroy whatever setup you've done. My favorite is that devs like to make the "grip" buttons (on the side where your middle finger rests) shift the frame or scale of the world (e.g. OpenBrush does this). It's super natural once you know what you're doing, but give that to a novice and it's incredibly hard for them to learn not to grip the controllers. They cling to them like crazy, the whole world spins around, and they immediately tell you they're disoriented and nauseated and hate VR.


This could be solved by well-designed interface guidelines, enforced by the company building the headset. If anyone is going to do that, it's Apple.

I'm happy the Vision Pro uses gestures to control the interface, but I also agree with the parent poster that we'll need hardware controls someday. No matter how good the gesture recognizers get, waving your hands in space is never going to be as good as having physical controls and buttons attached to your hands.


> This could be solved by well-designed interface guidelines, enforced by the company building the headset. If anyone is going to do that, it's Apple.

I've no doubt there will be tons of third-party, and possibly even first-party, peripherals for it, but treating gesture interaction as the baseline to design against was the right call. It's the number one thing that will help adoption right now if done right. I think the callout made in the keynote about each new Apple product being built around a particular interaction paradigm (mouse/keyboard, click wheel, then multitouch, now gestures) makes it seem obvious that this is the natural evolution of HCI.

> No matter how good the gesture recognizers get, waving your hands in space is never going to be as good as having physical controls and buttons attached to your hands.

I thought this as well 5 years ago, but pose detection and inverse kinematics have come a long way since then thanks to ML. I'm fully confident that what Apple is shipping with the Vision Pro will be equal to or better than the 6DOF control fidelity of gen 1 Oculus constellation-based tracking. The only problem that remains is occlusion when reaching outside the FOV, for which it's hard to imagine a solution without controllers or an outside-in tracking setup.
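To make the "button replaced by pose detection" idea concrete, here's a minimal sketch of a pinch detector driven purely by tracked fingertip positions, the kind of thumb-tip/index-tip data a hand-tracking API (e.g. ARKit's HandTrackingProvider on visionOS) exposes per frame. This is an illustration only, not how Apple actually implements it; the distance thresholds are made-up values, and the hysteresis (separate engage/release distances) is just one common way to keep the state from flickering near a single cutoff:

```swift
import simd

/// Minimal pinch detector over tracked fingertip positions (world space, metres).
/// Two thresholds (hysteresis) prevent rapid on/off flicker when the measured
/// distance hovers near a single cutoff.
struct PinchDetector {
    // Illustrative values; a real system would tune these per user and hand size.
    var engageDistance: Float = 0.02    // pinch "down" when tips come within 2 cm
    var releaseDistance: Float = 0.035  // pinch "up" only once tips separate past 3.5 cm
    private(set) var isPinching = false

    /// Feed one frame of thumb-tip and index-tip positions; returns true
    /// exactly on the frame the pinch first engages (i.e. a "click").
    mutating func update(thumbTip: SIMD3<Float>, indexTip: SIMD3<Float>) -> Bool {
        let distance = simd_distance(thumbTip, indexTip)
        if !isPinching && distance < engageDistance {
            isPinching = true
            return true
        }
        if isPinching && distance > releaseDistance {
            isPinching = false
        }
        return false
    }
}

// Example with two frames of fake joint positions.
var detector = PinchDetector()
_ = detector.update(thumbTip: [0.0, 1.0, -0.3], indexTip: [0.05, 1.0, -0.3])          // 5 cm apart: no click
let clicked = detector.update(thumbTip: [0.0, 1.0, -0.3], indexTip: [0.012, 1.0, -0.3]) // 1.2 cm: click
print(clicked) // true
```

The interesting part is everything this sketch leaves out: which object the "click" applies to is decided by gaze, and the fingertip positions themselves depend on the pose-estimation and occlusion handling discussed above.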


yeah absolutely.

As it stands, it's a dilemma. The Vision Pro is $3500 and still only does half of what I want a headset for.

The only saving grace is that the Vision Pro is so expensive that I can pick up a Quest for only $500 to do the other half, and it hardly adds anything on top ;-)



