It seems to work great for 2D control, where it behaves much like a mouse or a touchscreen. But for 3D control outside floating screens, it's unclear whether it can compete with a proper 3D controller.
Note: I'm speculating here, not saying Apple Vision Pro can do any of this.
If you could track hand movement accurately enough that interacting with a 3D space felt as natural as manipulating something actually in front of you, the problem would be solved at that point. If Apple manages this at any point in the lifecycle of this product line, it'll be a significant breakthrough for this type of computing / experience.
But what replaces the buttons? You need precise true/false states. Pinching again? That would be just one button. What about aiming/shooting? Just locking and pinching? What about character movement without a stick?
This all seems like playing Quake on a touchscreen.
Yeah. On a touchscreen you at least get the tactile resistance of touching something rather than empty air, even if you can't feel whether you've hit the virtual button or successfully pressed it. With hand tracking you don't feel anything at all. Controllers have a large advantage here.
Maybe Apple deliberately decided against including optional controllers to make it clear that they are aiming at AR, not VR gaming. But even in AR, the lack of tactile feedback could be jarring.
Apple has a 3D keyboard that you can reach out and type on. Tactile feedback is replaced with visual feedback - the letters glow more as your finger gets near, and they "pop" when you make contact.
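The glow-then-pop behavior described above could be sketched roughly like this - a minimal illustration, not Apple's actual implementation; the function names, distance units, and thresholds are all assumptions:

```python
def glow_intensity(distance_mm: float, max_range_mm: float = 50.0) -> float:
    """Map fingertip distance to a 0..1 glow level (1.0 = touching).

    Glow ramps up linearly as the finger approaches the key; the
    50 mm activation range is an illustrative guess, not a known value.
    """
    if distance_mm <= 0.0:
        return 1.0
    if distance_mm >= max_range_mm:
        return 0.0
    return 1.0 - distance_mm / max_range_mm


def key_state(distance_mm: float, contact_threshold_mm: float = 1.0):
    """Return (glow, popped): popped would trigger the contact 'pop' animation."""
    popped = distance_mm <= contact_threshold_mm
    return glow_intensity(distance_mm), popped
```

The point of the continuous glow ramp is to give the user a substitute for the tactile depth cue: you can see how close you are before you commit, which is exactly the information a physical key would give your fingertip.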