
I'm not an Apple person, but I do think the eye-tracking-to-focus / pinch-the-air-to-activate combo is very natural and significantly different from what Meta, HTC, etc. are doing. I think this might be the turning point for AR/VR.



>I'm not an Apple person, but I do think the eye-tracking-to-focus / pinch-the-air-to-activate combo is very natural and significantly different from what Meta, HTC, etc. are doing. I think this might be the turning point for AR/VR.

Since getting started in 2015 with the DK2, I've held that controllers are a dead end for VR. I've given countless demos where people are just completely confused because it's so unnatural and awkward, and it adds to the overall cognitive load to the point of destroying presence. Sure, competent gamers can figure it out in a few minutes, but that's not the point. You don't interact with the world using triggers and buttons. You interact with the world by interacting with it. The controllers need to go away.


You don't interact with the real world using gaze and pinching. Do you think this gesture is also a dead end?

For me, controllers are bad simply because your hands are full. You're giving up interacting with the real world to interact virtually. Having to figure out a remote control isn't insurmountable for the majority of the population, but no one wants to hold a controller all day.

The missing haptics and discrete button feedback are an issue with hand tracking, though.


Don’t we interact via gaze and pinching every time we pick up a small object?


Where you must look directly at the object and perform the pinch somewhere away from it? No.


Reads a bit like when people would complain about having to use a mouse back in the late 80s / early 90s.

There's a ton of stuff you'll never be able to do in VR without some kind of controller with buttons and joysticks. Making the controllers smaller, lighter, more ergonomic, and fairly standardized is the natural path forward. People who grow up with VR and games won't have a problem.


It's a huge adoption barrier, though.

The problem is the best apps attach all sorts of weird things to weird buttons.

I've given plenty of novices a go with a headset, and it's a train wreck: they accidentally push buttons every which way and completely destroy whatever setup you've done. My favorite is when devs map the "grip" buttons (on the side, where your middle finger rests) to shifting the frame or scale of the world (e.g. OpenBrush does this). It's super natural once you know what you're doing, but hand that to a novice and it's incredibly hard for them to learn not to grip the controllers. They cling on like crazy, the whole world spins around them, and they immediately tell you they're disoriented and nauseated and hate VR.


This could be solved with well-designed interface guidelines, enforced by the company building the headset. If anyone is going to do that, it's Apple.

I'm happy the Vision Pro uses gestures to control the interface, but I also agree with the parent poster that we'll need hardware controls someday. No matter how good the gesture recognizers get, waving your hands in space is never going to be as good as having physical controls and buttons attached to your hands.


> This could be solved with well-designed interface guidelines, enforced by the company building the headset. If anyone is going to do that, it's Apple.

I've no doubt there will be tons of third-party, and possibly even first-party, peripherals for it, but treating gesture interaction as the baseline to design against was the right call. It's the number one thing that will help adoption right now, if done right. The callout in the keynote, that each new Apple product has been built around a particular interaction paradigm (mouse/keyboard, then click wheel, then multitouch, now gestures), makes this seem like the natural evolution of HCI.

> No matter how good the gesture recognizers get, waving your hands in space is never going to be as good as having physical controls and buttons attached to your hands.

I thought this as well 5 years ago, but pose detection and inverse kinematics have come light-years since then thanks to ML. I'm fully confident that what Apple is shipping with Vision Pro will be equal to or better than the 6DOF control fidelity of the gen 1 Oculus constellation-based tracking. The only remaining problem is occlusion when reaching outside the FOV, for which it's hard to imagine a solution without controllers or an outside-in tracking setup.


Yeah, absolutely.

As it stands, it's a dilemma. The Vision Pro is $3500 and still only does half of what I want a headset for.

The only saving grace is that the Vision Pro is so expensive, I can pick up a Quest for only $500 to cover the other half and it hardly adds anything ;-)


Yes, that makes sense.

A comparison could be drawn to the distinction between the BlackBerry keyboard and the iPhone touchscreen.

Apple’s design approach could well be aiming to make the Vision Pro feel as natural as possible.


Head/eye tracking as user input isn't really new in VR, and in the past it hasn't felt very good. The pinch gesture on the HoloLens was very unreliable. It's not natural for your gaze to affect the world around you, other than when making eye contact with someone, perhaps. I think reaching out and gripping things is far more _natural_.

That said, the sensors and software in this headset might finally be up to the task. And despite being unnatural it might be easy enough to pick up.


I remember reading a first impressions post where they mentioned that starting off, they were instinctively reaching out and pinching the UI elements. They said it worked fine, because you're also looking at whatever you're pinching, and it only took a few minutes to adjust to keeping their hands in their lap.

But I found it promising that the Vision Pro can see your hands in front of you as well, which should really help ease the learning curve, and prevent first-time users from feeling frustration with the UI.


> I think reaching out and gripping things is far more _natural_.

Like when you grab a virtual object and it feels like you're grabbing absolutely nothing at all? This isn't natural and it's actually quite jarring in my experience.


Definitely. I think controllers and haptics are really underrated in all this Apple news but I'm still keeping an open mind.


It seems to work great for 2D control, where it behaves like a mouse or a touchscreen. But for 3D control beyond floating screens, it's unclear whether it can compete with a proper 3D controller.


Note: I'm speculating here, not saying Apple Vision Pro can do any of this.

If you could track hand movement 1:1 well enough that interacting with a 3D space felt as natural as if it were actually in front of you, the problem would be solved at that point. If Apple manages to do this at any point in the lifecycle of this product line, it'll be a significant breakthrough for this kind of computing experience.


But what replaces the buttons? You need precise true/false states. Pinching again? That would be just one button. What about aiming and shooting? Just locking on and pinching? What about character movement without a stick?

This all seems like playing Quake on a touchscreen.


All controls in visionOS also support something called "direct touch": if you reach out and press a button, the button gets pressed!

Further, you can absolutely make 3D objects grabbable, throwable, resizable, etc., and it's actually pretty simple to do.
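For the curious, here's roughly what that looks like with the visionOS SDK, going off the docs and sample code (API names from memory, so treat it as a sketch rather than gospel): a RealityKit entity you can target with gaze + pinch, or reach out and drag directly.

    import SwiftUI
    import RealityKit

    struct GrabbableSphereView: View {
        var body: some View {
            RealityView { content in
                // A sphere the user can look at and pinch, or reach out and touch.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: false)])
                // Entities need input-target and collision components to receive gestures.
                sphere.components.set(InputTargetComponent())
                sphere.generateCollisionShapes(recursive: true)
                sphere.position = [0, 1.2, -0.6]  // roughly eye level, in front of the user
                content.add(sphere)
            }
            // One drag gesture covers both indirect (gaze + pinch) and direct touch;
            // the system decides which input style the user is using.
            .gesture(
                DragGesture()
                    .targetedToAnyEntity()
                    .onChanged { value in
                        value.entity.position = value.convert(
                            value.location3D, from: .local, to: value.entity.parent!)
                    })
        }
    }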


I'm very skeptical this is going to be nice to use. People barely tolerate touchscreens (phone: yes, laptop: maybe, desktop: no).

This is a button which has no physical presence at all.


Yeah. On a touchscreen you at least get the tactile resistance of touching something rather than thin air, even if you can't feel whether you've hit the virtual button or successfully pressed it. With hand tracking you feel nothing at all. Controllers have a big advantage here.

Maybe Apple deliberately decided against including optional controllers, to make it clear that they are aiming at AR, not VR gaming. But even in AR the lack of tactile feedback could be jarring.


Apple has a 3D keyboard that you can reach out and type on. Tactile feedback is replaced with visual feedback: the keys glow more as your finger gets near and "pop" when you make contact.
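To be clear, that's not Apple's actual implementation, but the feedback model itself is simple to sketch; all the names below are made up:

    import simd

    // Hypothetical sketch of proximity-based visual feedback for one key.
    struct KeyFeedback {
        var glow: Float      // 0 = idle, 1 = fully lit
        var isPressed: Bool  // the "pop" on contact
    }

    func feedback(fingertip: SIMD3<Float>,
                  keyCenter: SIMD3<Float>,
                  glowRadius: Float = 0.08,      // metres at which the glow starts
                  pressDistance: Float = 0.005)  // metres counted as contact
        -> KeyFeedback {
        let d = simd_distance(fingertip, keyCenter)
        // Glow ramps up linearly as the finger approaches, clamped to [0, 1].
        let glow = max(0, min(1, 1 - d / glowRadius))
        return KeyFeedback(glow: glow, isPressed: d <= pressDistance)
    }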



