VR Hands

Hello everybody,
Now that we know the Oculus Rift will ship with the Touch controllers (and the HTC Vive has its own hand controllers as well), I was wondering whether anyone has already rigged and configured a pair of hands for use in VR experiences.
From what I understand, with the Touch controllers in use, the information available to represent the user’s avatar is the HMD’s position and orientation, plus the Touch controllers’ position (rotation as well?). I was wondering about the best way to simulate the virtual hands from this information. IK is probably the best solution, but the problem is that with so little information about the arm pose (we only know the palm position; no elbow, no shoulder…) I’m not sure how to approach it.
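For reference, the core of a two-bone IK solve from exactly that information (a shoulder guess plus the controller-tracked wrist) is fairly compact. Here is a minimal sketch in Python, not tied to any engine; the function name, the segment lengths, and the "pole" hint that decides which way the elbow points are all my own assumptions, not anything Oculus ships:

```python
import math

def two_bone_ik(shoulder, wrist, upper_len, fore_len, pole):
    """Place the elbow of a two-segment arm reaching from shoulder to wrist."""
    to_target = [w - s for s, w in zip(shoulder, wrist)]
    raw = math.sqrt(sum(c * c for c in to_target)) or 1e-6
    # clamp so the target is always reachable (avoids math domain errors)
    dist = max(1e-6, min(raw, upper_len + fore_len - 1e-6))
    # law of cosines: angle between the upper arm and the shoulder->wrist line
    cos_a = (upper_len ** 2 + dist ** 2 - fore_len ** 2) / (2 * upper_len * dist)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    d = [c / raw for c in to_target]  # unit shoulder->wrist direction
    # project the pole hint onto the plane perpendicular to d
    p = [pc - sc for sc, pc in zip(shoulder, pole)]
    along = sum(a * b for a, b in zip(p, d))
    bend = [pc - along * dc for pc, dc in zip(p, d)]
    n = math.sqrt(sum(c * c for c in bend)) or 1.0
    bend = [c / n for c in bend]
    # elbow sits upper_len from the shoulder, rotated off d toward the pole
    return [sc + upper_len * (math.cos(angle) * dc + math.sin(angle) * bc)
            for sc, dc, bc in zip(shoulder, d, bend)]
```

The hard part, as the replies below discuss, is not this math but choosing the pole vector and the shoulder position, since neither is tracked.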
If anyone has already tackled this problem in the past, I would be very thankful for any insight you might share that could make the virtual hands feel as natural as possible.

Many many thanks

Will be doing this soon, but with my own custom-built controllers

I am currently building a Bluetooth interface jacket and gloves using inertial sensors and conductive thread. The benefits of this will be: (a) you can get precise arm coordinates using FK-style translations (I have not tried it yet, but my understanding is that you can manually manipulate bones/rigs/avatars, in which case this will be easy); (b) you can open/close the fingers to manipulate things (guns, doors, grabbing objects, swinging swords or baseball bats, etc.).
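The FK side of this really is the easy part: if each IMU reports its segment's world-space orientation, the hand position is just each segment's rotated rest vector accumulated down the chain. A rough Python sketch of the idea (the +X rest direction, the row-major 3x3 matrix convention, and the function name are my own assumptions):

```python
def fk_hand_position(shoulder, segment_rotations, segment_lengths):
    """Forward kinematics: each IMU gives its segment's world-space 3x3
    rotation matrix; the segment's rest direction is assumed to be +X."""
    pos = list(shoulder)
    for rot, length in zip(segment_rotations, segment_lengths):
        # rotating the rest vector (length, 0, 0) picks the first column of rot
        pos = [p + length * rot[i][0] for i, p in enumerate(pos)]
    return pos
```

With upper-arm and forearm sensors, `segment_rotations` has two entries and the result is the wrist position, no IK guessing required.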

This doesn’t really answer your question, but only to say:

  1. Yes, I would imagine with hand-only positioning, you’d have to go with IK
  2. I don’t understand why these companies are choosing that route; it doesn’t seem like the best way of doing things (unless you plan on using a thumbstick controller at all times). The problem I’ve found is that this style of interaction is what makes VR unnatural, vertigo-inducing, and nauseating. We need to rethink VR games and how we interact with them so that players’ brains don’t go haywire and leave them feeling queasy. In the games I am developing, for example, walking/running around is not even an option…

So… I’m not going to invest too heavily in their interfaces because my belief is that there will be better ones… (not just a belief; I know this for a fact. Some already exist, and I’m designing another myself.)

Hi Actiview,

we have rigged and configured a number of hands for use in computer games and animations, but also in VR (for a public version, you can see our demo based on the Oculus Tuscany scene here:

).

While we are working with available hardware such as the Razer Hydra, Leap Motion, RealSense, Sony Move, etc., I am also quite curious about the Oculus Touch. As far as I can tell, it will probably give you the orientation and position of the controller (similar to the Hydra), but not the finger positions (as the Leap Motion or RealSense do).

We have developed our own IK that can work with a target position and orientation, but we have noticed some disadvantages to IK-based hands. Among them: you have to calibrate the device to adjust your arm length to the avatar’s arm length - otherwise it is very annoying not to be able to reach things in VR; your arms are not tracked, so the IK has to guess your elbow position - and it’s weird when your avatar’s elbow is not where yours is; constraining hand orientation with kinematics can be a good thing to avoid overstretched (virtual) wrists, but then again, your avatar’s hand might be constrained where your real hand isn’t.
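On the calibration point: one simple approach (a sketch of the general idea, not our actual code) is to measure the user's arm length once, e.g. with the arm held out straight, and then scale the controller's offset from the shoulder so that every point the user can reach maps to a point the avatar can reach:

```python
def map_wrist_target(shoulder, real_wrist, user_arm_len, avatar_arm_len):
    """Scale the controller offset about the shoulder so the avatar's
    reach matches the user's (both lengths measured once at calibration)."""
    scale = avatar_arm_len / user_arm_len
    return [s + scale * (w - s) for s, w in zip(shoulder, real_wrist)]
```

The trade-off is that the virtual hand no longer sits exactly where the real hand is, which is its own source of weirdness when the user looks closely.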

So, from my current experience and the feedback we got demoing our stuff, there are clear advantages in terms of user experience in not using IK in VR but showing “flying hands” instead - at least until you have the equipment to really measure and project the real arm configuration into the virtual world …
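For what it's worth, the "flying hands" version is also almost trivial to drive: the hand mesh simply follows the tracked controller pose, usually with a fixed local offset so the virtual palm lines up with the real grip. A minimal sketch (the w-first quaternion layout, the helper names, and the offset are assumptions on my part):

```python
def _cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, qv = q[0], q[1:]
    t = [2.0 * c for c in _cross(qv, v)]
    return [vi + w * ti + ci for vi, ti, ci in zip(v, t, _cross(qv, t))]

def flying_hand_pose(ctrl_pos, ctrl_rot, palm_offset):
    # no IK: the hand follows the controller directly, offset so the
    # virtual palm sits where the real grip is
    off = quat_rotate(ctrl_rot, list(palm_offset))
    return [p + o for p, o in zip(ctrl_pos, off)], ctrl_rot
```

No calibration, no elbow guessing - which is exactly why it tends to feel more correct than a wrong IK arm.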

/kai

You might want to have a look at Valve’s “Virtual Reality in 2015” talk. They did some experiments with IK and it just didn’t work, for the reasons you outlined (basically, to show the player’s arm you really need full arm motion tracking; if you want to show the player’s body, you need full body motion tracking … if you try to fake it, the player will notice and not like it very much). So, if you only have hand controllers, it’s actually better to just show the hands - or whatever the controllers represent (which often will not be a hand but something else).

Also, if you look at the Valve / SteamVR / HTC Vive demos and have actually experienced them, this kind of controller really works quite well. In particular, Tilt Brush is a pretty good example of why having a controller with a touch area and/or joystick and buttons is actually quite useful in VR. They also have an archery demo that works really well with those controllers. Of course, there are certainly also use cases for having actual hands in the game (Job Simulator being one actual example).