
Note: I'm speculating here, not saying Apple Vision Pro can do any of this.

If you could track hand movement 1:1, precisely enough that interacting with a 3D space felt as natural as manipulating objects actually in front of you, the input problem would essentially be solved. If Apple manages that at any point in this product line's lifecycle, it will be a significant breakthrough for this type of computing / experience.




But what replaces the buttons? You need precise true/false states. Pinching again? That gives you just one button. What about aiming and shooting? Just looking and pinching? What about character movement without a stick?

This all seems like playing Quake on a touchscreen.


All controls in visionOS also support something called "direct touch": if you reach out and press a button, the button gets pressed!

Further, you can absolutely make 3D objects grabbable, throwable, resizable, etc., and it's actually pretty simple to do.
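
For illustration, here's roughly what that looks like with RealityKit on visionOS: a minimal sketch that makes a cube draggable. The cube, its size, and the material are placeholder values, but the pattern is a collision shape plus an input target component, then a gesture targeted at the entity.

  import SwiftUI
  import RealityKit

  struct GrabbableCubeView: View {
      var body: some View {
          RealityView { content in
              // A simple cube entity; any ModelEntity works here.
              let cube = ModelEntity(
                  mesh: .generateBox(size: 0.2),
                  materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
              )
              // Two components make an entity respond to hands: a collision
              // shape for hit-testing and an input target so gestures reach it.
              cube.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
              cube.components.set(InputTargetComponent())
              content.add(cube)
          }
          // Drag the entity by writing the gesture's 3D location back
          // into its position, converted into the parent's space.
          .gesture(
              DragGesture()
                  .targetedToAnyEntity()
                  .onChanged { value in
                      value.entity.position = value.convert(
                          value.location3D,
                          from: .local,
                          to: value.entity.parent!
                      )
                  }
          )
      }
  }

Throwing and resizing follow the same shape: swap in MagnifyGesture for scale, or feed the drag velocity into a PhysicsBodyComponent on release.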


I'm very skeptical this is going to be nice to use. People barely tolerate touchscreens (phone: yes; laptop: maybe; desktop: no).

This is a button which has no physical presence at all.


Yeah. On a touchscreen you at least get the tactile resistance of touching something rather than air, even though you can't feel whether you've hit the virtual button or whether the press registered. With hand tracking you feel nothing at all. Controllers have a large advantage here.

Maybe Apple deliberately decided against including optional controllers, to make it clear that they are aiming at AR, not VR gaming. But even in AR the lack of tactile feedback could be jarring.


Apple has a 3D keyboard that you can reach out and type on. Tactile feedback is replaced with visual feedback: the letters glow more as your finger gets near, and they "pop" when you make contact.
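
Not Apple's implementation, obviously, but the effect is easy to sketch: map fingertip-to-key distance to a glow level. In practice the positions would come from hand tracking; the thresholds below are made-up values.

  import simd

  /// Returns 0 (far) ... 1 (touching), ramping up inside `glowRadius`.
  func glowLevel(fingertip: SIMD3<Float>,
                 keyCenter: SIMD3<Float>,
                 glowRadius: Float = 0.06,       // start glowing within 6 cm (assumed)
                 contactRadius: Float = 0.005) -> Float {  // "pop" within 5 mm (assumed)
      let d = simd_distance(fingertip, keyCenter)
      if d <= contactRadius { return 1 }   // contact: full "pop"
      if d >= glowRadius { return 0 }      // too far: no glow
      // Linear ramp between the two thresholds.
      return 1 - (d - contactRadius) / (glowRadius - contactRadius)
  }

Feed that value into the key's emissive material each frame and you get the near-glow / contact-pop behavior described above.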



