Hand tracking in VR

Soldato
Joined
23 Jul 2009
Posts
14,083
Location
Bath
Full disclosure: I work at Ultraleap (we acquired Leap Motion last year) so I have strong opinions on this, but I'm keen to hear whether you guys think that well-implemented hand tracking and spatial interaction methods are something you'd be excited by.

I would have loved for games like Star Trek: Bridge Crew to just use my actual hands, as half the fun of that game was the social expression it afforded you, and all the interfaces were just touchscreens. Similarly, I'd love to be able to take my hands off the controls in Elite Dangerous and navigate the menus with my hands instead of having to click through them.

What do you guys think? Great way to enhance some/all games, or pointless gimmick? Let's assume it's done better than Oculus have /sassy
 
Soldato
Joined
14 Sep 2008
Posts
3,810
Location
Nottingham
Board games, most notably Tabletop Simulator. There's a control binding for the Index controllers that goes some way to replicating touch by using the natural grip and grab to pick up and move pieces, but with no finger support you're still reliant on pressing buttons to action detailed movement.

Hand tracking totally makes sense for touch interfaces: your controls in Elite, or working the menus in Big Screen, for example.
 
Soldato
OP
Joined
23 Jul 2009
Posts
14,083
Location
Bath
Board games, most notably Tabletop Simulator. There's a control binding for the Index controllers that goes some way to replicating touch by using the natural grip and grab to pick up and move pieces, but with no finger support you're still reliant on pressing buttons to action detailed movement.

Hand tracking totally makes sense for touch interfaces: your controls in Elite, or working the menus in Big Screen, for example.
TTS is a good shout actually. If anyone has any other ideas, it would be quite useful to hear them.

One of the things we've been working on is how you deal with interfaces when you don't have buttons. There are quite a few analogues with the work already being done in VR interfaces, of course, but I was just kinda brainstorming for use cases where it's already a good fit without loads of new concepts being introduced.

For anyone interested, this is our idea for dealing with the tricky issue of browser navigation in AR, but I think it would fit reasonably neatly in VR too for complex navigation.

https://www.ultraleap.com/company/news/blog/vr-ar-content-browser/

I live in this stuff so it's always useful to hear opinions from end users to make sure we aren't drinking the Kool-Aid too much!
 
Soldato
Joined
20 Apr 2015
Posts
4,064
Location
.
Probably most games and simulators would benefit from hand/finger tracking, and most games/sims have menus. So pulling a menu up in VR would need a specific hand gesture, like clicking your fingers or doing the Vulcan 'V' sign. Once you've finished finger-pointing at your menu choices, the finger click would make the menu disappear.
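
The core of that is just an edge detector with a bit of hysteresis. A minimal sketch in Python, assuming a 0-1 pinch/click strength coming from whatever tracking SDK you're using (the threshold values are made up):

```python
# Sketch of the gesture-toggled menu idea above. pinch_strength (0.0-1.0) is
# assumed to come from your tracking SDK; the thresholds are illustrative.

PINCH_ON = 0.85   # strength needed to count as a "finger click"
PINCH_OFF = 0.50  # must relax below this before the next click can fire

class MenuToggle:
    def __init__(self):
        self.visible = False
        self.armed = True  # stops one long pinch toggling the menu repeatedly

    def update(self, pinch_strength: float) -> bool:
        """Call once per tracking frame; returns current menu visibility."""
        if self.armed and pinch_strength >= PINCH_ON:
            self.visible = not self.visible  # the "click": show or hide the menu
            self.armed = False
        elif pinch_strength <= PINCH_OFF:
            self.armed = True  # fingers relaxed; re-arm for the next click
        return self.visible
```

The two thresholds stop a half-relaxed hand from rapidly toggling the menu on and off.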
 

Deleted member 66701

Hey devrij. I'm currently doing a PhD at Lancaster University, looking at novel interactions in VR. I'm about to start a small study comparing hand interactions to established interactions for speed, accuracy and efficiency, using the Oculus Quest. If you fancy collaborating on something, drop me a PM.
 
Soldato
Joined
14 Sep 2008
Posts
3,810
Location
Nottingham
Some DCS modules offer up a fully clickable cockpit already. As great as having every switch in the Black Shark working is, it's cumbersome having a controller in hand when you want to pick up the flight stick.
 
Soldato
OP
Joined
23 Jul 2009
Posts
14,083
Location
Bath
Some DCS modules offer up a fully clickable cockpit already. As great as having every switch in the Black Shark working is, it's cumbersome having a controller in hand when you want to pick up the flight stick.

That's precisely it for me. Or having to fumble for a controller that has idled out somewhere on your desk with your headset on. I like the idea of having my hands available, just in real life, without having to pick them up or put them down.
 
Man of Honour
Joined
30 Oct 2003
Posts
13,227
Location
Essex
That's precisely it for me. Or having to fumble for a controller that has idled out somewhere on your desk with your headset on. I like the idea of having my hands available, just in real life, without having to pick them up or put them down.
Full disclosure: I work at Ultraleap (we acquired Leap Motion last year) so I have strong opinions on this, but I'm keen to hear whether you guys think that well-implemented hand tracking and spatial interaction methods are something you'd be excited by.

I would have loved for games like Star Trek: Bridge Crew to just use my actual hands, as half the fun of that game was the social expression it afforded you, and all the interfaces were just touchscreens. Similarly, I'd love to be able to take my hands off the controls in Elite Dangerous and navigate the menus with my hands instead of having to click through them.

What do you guys think? Great way to enhance some/all games, or pointless gimmick? Let's assume it's done better than Oculus have /sassy

I would like it. Nothing would be more natural than just picking stuff up rather than fumbling around with VR controllers. The question is how you provide feedback to the hands, so you don't just keep jabbing your hands through things in the VR world, without some big old gloves?
 
Soldato
OP
Joined
23 Jul 2009
Posts
14,083
Location
Bath
I would like it. Nothing would be more natural than just picking stuff up rather than fumbling around with VR controllers. The question is how you provide feedback to the hands, so you don't just keep jabbing your hands through things in the VR world, without some big old gloves?
Absolutely, feedback is really important. With gestures and other spatial interaction interfaces you try to design them so they are inherently tactile (e.g. a pinch has its own haptics because your fingers touch each other), but the solution to creating virtual objects in reality is a tricky one.

Our solution is to project sensations onto the hands using ultrasound. You can't see it or hear it, but you feel it. Essentially we vibrate your skin so you can feel it on your fingers or palm, and we can modulate the vibration to simulate different textures or draw shapes on your hand to better represent what you're touching or doing (imagine a tap-tap for a button actuation and release).

It's never going to produce force, though. That's just physics. I think you're stuck with gloves/suits for that kind of experience.

What's cool to me is how much your brain does to match up a vibration with what you're seeing and expecting to feel. E.g. a random pattern of vibrations on the palm and running up the fingers, plus the animation of shooting lightning out of your hands, and your brain goes "yep, that feels like lightning". I guess what I'm saying is that it doesn't always have to be a perfect simulation, in the same way that the haptics in your phone don't have to feel like a physical keyboard to tell you that your key press worked. It just needs to be enough to tell your brain that the thing you were trying to do worked, without the mental load of confirming it visually.
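
For the curious, here's a very rough sketch of the kind of control loop involved: sweep an ultrasound focal point along a path on the palm while modulating its amplitude. To be clear, HapticDevice and its methods are hypothetical placeholder names for illustration, not our actual SDK:

```python
import math
import time

# Hedged sketch only: HapticDevice, set_focal_point and set_amplitude are
# hypothetical placeholder names, not the real Ultraleap SDK. Real hardware
# also updates the focal point tens of thousands of times per second, far
# faster than this Python loop.

class HapticDevice:
    def set_focal_point(self, xyz):
        pass  # aim the ultrasound focus at a 3D point above the array (metres)

    def set_amplitude(self, value):
        pass  # output strength, 0.0 to 1.0

def tap_tap(device, point):
    """Two short bursts at one spot: 'button pressed', then 'button released'."""
    for _ in range(2):
        device.set_focal_point(point)
        device.set_amplitude(1.0)
        time.sleep(0.03)   # 30 ms burst
        device.set_amplitude(0.0)
        time.sleep(0.08)   # gap between the two taps

def circle_on_palm(device, centre, radius=0.02, duration=1.0, steps=2000):
    """Sweep the focal point around a circle while modulating amplitude at
    ~200 Hz, roughly where skin is most sensitive to vibration."""
    for i in range(steps):
        t = duration * i / steps
        angle = 2 * math.pi * i / steps
        device.set_focal_point((centre[0] + radius * math.cos(angle),
                                centre[1] + radius * math.sin(angle),
                                centre[2]))
        device.set_amplitude(0.5 * (1.0 + math.sin(2 * math.pi * 200 * t)))
        time.sleep(duration / steps)
```

The modulation is the important bit: skin responds to changes in pressure rather than constant pressure, so an unmodulated focal point is much harder to feel.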
 
Soldato
Joined
9 Mar 2010
Posts
2,838
Massive fan of hand tracking - hate its limitations though :)

For me it comes down to a few things that need to be addressed to make it work really well:
1) Haptics for interacting with objects - While not ideal, I find that when pushing a button with a "virtual" hand using a controller, even just a buzz gives me enough of a tactile response to make it feel a little more real. It's especially convincing when the app shows both the controller you're holding and your hand, because it's a one-to-one representation of exactly what's going on.

i.e. pushing a button with the left index finger, like this: [image]


I had seen the Ultraleap STRATOS but written it off: without support for a VR headset I just don't see the need, as hand tracking without a VR headset is more of a chore than a benefit in my opinion.

I've always wondered why the haptics/hand-tracking people aren't working on a complementary system: instead of a glove trying to do ALL the work (haptics and tracking), make a glove that ONLY does haptics and complements something like the Leap's tracking. Maybe it has markers to give more robust tracking from the IR cameras, so you can use the same device with or without gloves that provide added value.

Oh, one thing I do like is a menu physically attached to one hand, so that you "press" its buttons with your fingers. It's annoying on the Quest at the moment, as tracking "gracefully" stops when your hands overlap, but it's a great interaction with good feedback on systems that support it (like the Leap). A rough sketch of the press/release logic is below.
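
In code, that buzz-on-press idea boils down to firing a short pulse on the actuation edge and a lighter one on release. A hedged sketch (controller.pulse() is a stand-in for whatever haptics call your runtime actually exposes):

```python
# Hedged sketch of the buzz-on-press idea: fire a short haptic pulse the moment
# the tracked fingertip crosses a virtual button's surface, and a lighter one on
# release. controller.pulse() is a stand-in for whatever haptics call your
# runtime actually exposes.

class VirtualButton:
    def __init__(self, surface_z: float, press_depth: float = 0.01):
        self.surface_z = surface_z      # height of the button face (metres)
        self.press_depth = press_depth  # travel past the face that counts as pressed
        self.pressed = False

    def update(self, fingertip_z: float, controller) -> bool:
        """Call once per tracking frame; returns True on the actuation frame."""
        if not self.pressed and fingertip_z < self.surface_z - self.press_depth:
            self.pressed = True
            controller.pulse(duration_ms=10, amplitude=0.8)  # the "it worked" buzz
            return True
        if self.pressed and fingertip_z > self.surface_z:
            self.pressed = False
            controller.pulse(duration_ms=5, amplitude=0.4)   # lighter release tick
        return False
```

Note the press and release planes differ slightly, so a fingertip hovering right at the surface doesn't chatter.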

2) Responsiveness - This is possibly a haptics thing again, but really... physical buttons are better than any hand tracking for a couple of reasons. Firstly, the amount of movement and precision you get (i.e. the ability to choose between an A and a B button on a controller to quickly do two possible functions) I'm not sure will ever be matched by hand tracking. And to be honest, it's not a hand-tracking problem, it's probably a human problem. How accurately can you really place your finger hovering in mid-air, or perform some kind of recognised gesture, without the support that comes from holding a physical object?

Additionally, people are lazy. When I hold a controller I'm able to put my hands behind my back and get as comfy as I want, and (even though there might be no controller movement tracking) I'm able to press a button without any hesitation over whether it will be picked up by the application. Hand tracking faces the additional problem that the tracking volume is going to be comparatively small (when we consider all-in-one devices, which is where I think everything is moving now), because it can't rely on additional sensors to cover tracking loss.

Which is where, again, I think a system that combines "dumb" gloves with physical buttons on the top of the index finger (not unlike PointCTRL - https://forums.eagle.ru/showthread.php?t=218861) would shine: you can use your Leap without gloves, but put these gloves on and you get an additional IMU, haptics and buttons that make it super good.

3) Reliability/Robustness - If I gave you a mouse whose left button only worked 90% of the time you clicked it, how quickly would you smash it?... That's how much I want to smash the Leap when it misses a pinch gesture or falsely detects one. The same goes for when I need to drag something around, or move my hands somewhere the sensor doesn't detect them - you're literally fighting against the system half the time and it's not productive. (One common mitigation is sketched below.)
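
A typical software-side band-aid for the flaky pinch problem is temporal debouncing: only act on a pinch once it's persisted for a few frames, and only release once it's been gone for a few frames. Illustrative sketch, frame counts made up:

```python
# Illustrative sketch: debounce a raw per-frame pinch flag so one noisy frame
# can't trigger or drop a grab. Frame counts are made up; tune to taste.

class DebouncedPinch:
    def __init__(self, frames_on: int = 3, frames_off: int = 5):
        self.frames_on = frames_on    # consecutive pinch frames needed to grab
        self.frames_off = frames_off  # consecutive open frames needed to release
        self.count = 0
        self.active = False

    def update(self, raw_pinch: bool) -> bool:
        """Call once per tracking frame with the tracker's raw pinch flag."""
        if raw_pinch == self.active:
            self.count = 0  # input agrees with current state; nothing pending
        else:
            self.count += 1
            needed = self.frames_off if self.active else self.frames_on
            if self.count >= needed:
                self.active = not self.active
                self.count = 0
        return self.active
```

You trade a couple of frames of latency for far fewer spurious grabs, which is usually the right trade for dragging things around.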

BUT... saying all that... I frickin' love hand tracking :D

The Oculus Quest now supporting both controllers and hand tracking at ALMOST the same time (with the limitation that it can only do one OR the other, so you need to put the controllers down to activate hand tracking) is a sign of things to come. I can easily see Oculus adapting their hand-tracking model to the Rift S at some point, and with that comes the potential power to do both hand and controller tracking at the same time.

As others have said, moving between real controls (HOTAS) and virtual controls (like an Elite Dangerous or DCS cockpit) is going to be frickin' amazing.

I also love it for its accessibility. Demoing the applications I've developed has shown me that some people don't even know what a trigger is, so putting buttons on a control panel that can just be reached out and touched is a game changer in terms of usability.
 
Soldato
Joined
11 Sep 2013
Posts
12,299
I personally don't find HOTAS games in VR to be especially limited... Indeed, the whole point of HOTAS to me is that you don't have to take your hands off to start using mice, punching keys or stabbing fingers at empty space.

Outside of that, again it comes down to implementation. Even floating menus would probably benefit from some kind of buzzy haptic feedback, so I'm guessing there's research into something like static electricity fields that cause a buzz when you put your hand into/near them?
My first question would be how you implement things like this for people with mobility/dexterity limitations in their hands.....
 
Soldato
Joined
9 Mar 2010
Posts
2,838
I personally don't find HOTAS games in VR to be especially limited... Indeed, the whole point of HOTAS to me is that you don't have to take your hands off to start using mice, punching keys or stabbing fingers at empty space.

Outside of that, again it comes down to implementation. Even floating menus would probably benefit from some kind of buzzy haptic feedback, so I'm guessing there's research into something like static electricity fields that cause a buzz when you put your hand into/near them?
My first question would be how you implement things like this for people with mobility/dexterity limitations in their hands.....

A HOTAS used for gaming comes from the HOTAS used in real planes, and in real planes pilots take their hands off the throttle and stick for a variety of reasons. I mean, give me a full sim cockpit that simulates this and I'm all for it, but ironically, even with a full sim pit, if you're using it with VR it's handy to have hand tracking to see where your hands line up with your real switches. Hell, even just using the switches on the base of a Warthog you have to feel around, or be really familiar with it, before you can use it at any kind of speed.
[image: Warthog HOTAS]


Of course, a lot of VR people fall back on voice-control stuff, which works great in a sim like Elite but does detract a little from the experience in a real-life sim like DCS.

Just as a wee input on the disability side as well: I actually like a lot of what Leap Motion has done in the past. Not only had they looked at the hardware, but they'd started looking at the software/UX side of things to complement their product. This means that while they're designing hardware that assumes you've got hands, their software only assumes you've got virtual hands. So if you can simulate a virtual hand, their work on UX design for usable interfaces helps anyone who can use their software.

As such, you've got some products that, instead of tracking hands, track the signals being sent to the muscles, which in turn generate virtual hands. It's super annoying because I can't find the video, but they showed a demo where it was tracking the user opening and closing their hand (just with muscle/signal sensors), and when he held his hand closed with the other hand, he was still able to open and close his virtual hand just by TRYING to open and close his real one. Really clever, and likely a great benefit to many disabled users in a variety of ways.
 
Soldato
OP
Joined
23 Jul 2009
Posts
14,083
Location
Bath
Thanks everyone, and especially @RoyMi6: those are some really valuable insights which I'll definitely be passing on. I started this thread mainly out of curiosity to gauge opinion, but it's actually been really helpful as well.
 
Soldato
Joined
14 Sep 2008
Posts
3,810
Location
Nottingham
Our solution is to project sensations onto the hands using ultrasound.

This is what Mike (the VR podcast guy) was talking about on his show a few months ago?

Everything he described at the time sounded extremely interesting, and the way you're describing it sounds fairly similar. Either way, it's awesome that people are working toward these types of goals; VR has such a bright future.
 
Soldato
Joined
11 Sep 2013
Posts
12,299
A HOTAS used for gaming comes from the HOTAS used in real planes, and in real planes pilots take their hands off the throttle and stick for a variety of reasons. I mean, give me a full sim cockpit that simulates this and I'm all for it, but ironically, even with a full sim pit, if you're using it with VR it's handy to have hand tracking to see where your hands line up with your real switches. Hell, even just using the switches on the base of a Warthog you have to feel around, or be really familiar with it, before you can use it at any kind of speed.
Yah, I do know.... but I mostly play space games and use CH kit, so not especially based off real planes aside from being a generic F16-type stick... and certainly not that CH throttle. Arguably the same for people flying with a Saitek X52 or X55, or a Thrustmaster T.16000M. Helicopter games are another completely different element, too.
If I'm playing a proper, grown-up flight sim, I'd rather have a full cockpit with a surround screen than just a desktop anyway. VR is of limited use at that point.

But having tried a couple of VR-only flight sims where I had to use the controllers for flight controls, it was hideously bad. You need the physical feedback of physical controls, and waving a wand around or trying to use my knee as a reference pivot was soul-destroying!
Using them as hands to operate instruments as well was absolutely game-breaking.
Not sure how it'd work with Leap-style hand (and presumably foot) tracking, as there are toggles, flip switches, buttons, dials, knobs, sliders and all manner of other controls in a cockpit... It ought to be more accurate, as the reference point for where the controls are isn't gonna be moving around like it does when your VR uses head tracking. But again, it comes down to how it feels to press, flip, twist, turn, slide, yank, trip and dial in what you want the aircraft to do. A gentle buzzing might not be good enough.

Don't get me wrong, I love VR, but there are still a lot of limitations...
 
Soldato
OP
Joined
23 Jul 2009
Posts
14,083
Location
Bath
This is what Mike (the VR podcast guy) was talking about on his show a few months ago?

Everything he described at the time sounded extremely interesting, and the way you're describing it sounds fairly similar. Either way, it's awesome that people are working toward these types of goals; VR has such a bright future.
I can't say I'm familiar with Mike the VR podcast guy, but if you have a link I'd love to check it out. We had our lead UX researcher on Radio 4 the other week so maybe some other people picked it up as well?

I got to try this demo before they packed it up for CES (apologies for the elevator music), and I have to say I was pretty stoked. I want that to be the future. I also found it pretty funny to watch visitors doing it, because you can tell when they get to the "relaxation" bit: they're all swirling the stars around for ages lol. This particular one had haptics as well, which makes a surprising difference to the accuracy rate of gestures. It's similar to having Cherry MX Brown rather than Cherry MX Red switches on a keyboard: it just kinda helps you know the press worked, rather than leaving you wondering whether your half-keypress registered or not.



For AR, this concept has me super excited as well. I think this interface could totally work.

 