Full Body Immersion

I’m working on some OVR demos to be added to Final IK and it would be nice to get some feedback on the way…

Here is a very basic demo (Win, DK2), mapping a standing animated 3D character to the positional tracking of DK2, using Full Body Biped IK.
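The knee bending comes from the IK solver alone: only the head is tracked, and the solver bends the legs when the pelvis drops. As a rough illustration of the core geometry (not Final IK’s actual solver, and with made-up segment lengths), a two-bone leg can be sketched with the law of cosines:

```python
import math

def knee_angle(hip_height, thigh=0.45, shin=0.45):
    """Interior knee angle (radians) for a leg whose foot stays planted
    while the hip sits hip_height above it. pi = straight leg; smaller
    values = more bend. Segment lengths are made-up, in metres."""
    d = min(hip_height, thigh + shin)        # the leg cannot overextend...
    d = max(d, abs(thigh - shin) + 1e-6)     # ...nor fold completely
    # law of cosines on the hip-knee-ankle triangle
    cos_knee = (thigh ** 2 + shin ** 2 - d ** 2) / (2 * thigh * shin)
    return math.acos(max(-1.0, min(1.0, cos_knee)))

print(knee_angle(0.90))   # standing tall: knee nearly straight
print(knee_angle(0.60))   # crouching: knee bends
```

Lowering the tracked head lowers the hip, and the knee angle falls out of the triangle automatically.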

Don’t forget to look down. :wink:

I’m planning to add movement and interactions, and to see if it makes any sense to aim weapons like that later…

Cheers,
Pärtel

This is rather cool, mate! Works pretty well for me. Actually rather immersive, as the knees bend when I bend mine! Good job so far. Let’s see more :slight_smile:

Thanks for the feedback, Thomas! :slight_smile:

I’ve been working on some interactions.

It’s quite convenient, I’d say. I wish I could operate stuff by looking at it in real life. :slight_smile:

Note that when you have picked up the phone, you can still keep interacting with other stuff with the left hand.
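Look-based interaction like this usually boils down to a ray cast along the head’s forward vector. A minimal sketch, with hypothetical names and simple bounding spheres standing in for colliders (the demo itself uses Final IK’s Interaction System):

```python
def gaze_pick(head_pos, head_fwd, targets, max_dist=3.0):
    """Return the nearest target whose bounding sphere the gaze ray hits.
    targets: list of (name, centre, radius) tuples. Purely illustrative."""
    best, best_t = None, max_dist
    for name, centre, radius in targets:
        # vector from head to target centre
        to_c = [c - p for c, p in zip(centre, head_pos)]
        # projection of that vector onto the (unit) gaze direction
        t = sum(a * b for a, b in zip(to_c, head_fwd))
        if t < 0 or t > best_t:
            continue  # behind us, too far, or farther than current best
        # point on the ray closest to the sphere centre
        closest = [p + t * f for p, f in zip(head_pos, head_fwd)]
        d2 = sum((a - b) ** 2 for a, b in zip(closest, centre))
        if d2 <= radius * radius:
            best, best_t = name, t
    return best

# looking straight ahead (+z): the phone on-axis wins, not the lamp off-axis
print(gaze_pick((0, 1.7, 0), (0, 0, 1),
                [("phone", (0, 1.7, 1.0), 0.1),
                 ("lamp",  (1.0, 1.7, 1.0), 0.1)]))
```

Whatever the ray hits becomes the current interaction target, leaving both hands free for effectors.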

Cheers,
Pärtel

Holy Moly, take my money! ( Ok, you do do sales, don’t you? :wink: )

Is it possible to do IK from any body point, or just the joints?
I’m asking because this would open up a whole universe of tracking device options, like the Sixense STEM or the Myo.
Even two Myos + a Rift would be enough then to do full body IK.

Edit: yeah, okay … who cares about feet. :wink:

Wow! Great demos, can’t wait to see more!

Hi Partel,

I am very interested in testing your code. This is exactly what I purchased Final IK for! I am mainly interested in mapping a standing animated 3D character to the positional tracking of the DK2.

Is that solution something you could share here?

Best,
zipper

Really wonderful! And I must say that your IK is better than what I use in my asset InstantVR (of course :)). I would stay away from using animations on the avatars that are not derived from the real body movements. It may decrease immersion.
Be careful, this will drag you into something very exciting and time-consuming!

To Binary42: As far as I know, Myos won’t do the job. They can detect gestures, but do not have positional or rotational information about the hands/arms. Currently, the best option is the Razer Hydra, and yes: I’m really looking forward to the STEM!

Pascal, yes, it has positional and rotational data for the lower arm; that’s why I’m interested in whether the IK would work on bones.
Edit: As Pascal pointed out via PM, there is indeed no world position from the Myo. Thanks!
Basically, if you don’t need gestures, you can strap a mobile phone to your arm as well. :wink:

Hi, sorry for the delay,

It’s basically possible to add any number of controllers to any body part. Perhaps not “out of the box”, but essentially it’s possible. I’m also hoping STEM, Perception Neuron and other controllers will not take much longer. Can’t wait to prove my point. :slight_smile:

Hi, Zipper,
I had to change some things in Final IK to make that possible, mostly the Interaction System. It will be included in Final IK 0.5, which I was planning to upload as soon as I’m done with this VR stuff. But if you send me your invoice number in a private conversation, I’ll send you what I have right away.

Thanks! About the idle animation: I agree the current one has too much motion, but without any animation the dude would be completely static, which would also decrease immersion. Actually, a very interesting thing I noticed: when you have the headset on for long enough and you keep looking at your virtual self, you’ll start to unknowingly mimic the animation, which makes it feel quite cool. It might be just me, but it’s something I noticed… interesting… :slight_smile:

It is time-consuming alright, but as you said, also very interesting. :slight_smile:
At the moment I’m working on aiming weapons. There are a couple of solutions: one is basically having the gun “parented” to the headset and aiming with your head, which is quite accurate and comfortable. Another is to aim the gun with the mouse; then you will actually have to move your head if you wish to look through the scope. It’s not as quick and comfortable, but neither is aiming a real rifle. It feels almost like the real thing, though. :slight_smile:

I’m worried about health risks, though: as you’ll have to keep one of your eyes closed for aiming and the other exposed to the Rift, it can’t be good for long periods of time. It feels worse afterwards than normal Oculus use does. As the rifle is always very close to the camera, it is also very important to have the headset properly calibrated.

But I’ll post some aiming demos soon. :slight_smile:

Cheers,
Pärtel

That will make things A LOT easier :smile:

May be an interesting way to learn how to dance :slight_smile:
But seriously: the physiological effects of immersion are one of the most interesting things in VR development. The most interesting thing I have now is virtual weight. I have an experimental implementation which calculates the physics effect of heavy objects on your body movements. If you handle a heavy sledgehammer, for example, you won’t be able to lift it above your head easily. So you see the hammer acting like it is heavy, you know it is heavy, and somehow… you start to feel that it is heavy; your arms get tired from lifting it. Sort of a reverse Occam’s Razor. Magical.
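One way to sketch that virtual-weight idea (purely an assumption about the approach, not Pascal’s actual implementation) is to cap how fast a hand effector may chase its tracked target, with the cap shrinking as mass grows:

```python
def weighted_follow(effector, target, mass, dt, strength=20.0):
    """Move the effector toward the tracked hand target, but cap the
    speed by the object's mass so heavy props lag behind the real hand.
    One illustrative step; positions are (x, y, z) tuples."""
    max_step = (strength / mass) * dt          # heavier -> slower
    delta = [t - e for t, e in zip(target, effector)]
    dist = sum(d * d for d in delta) ** 0.5
    if dist <= max_step or dist == 0.0:
        return tuple(target)                   # light enough to keep up
    scale = max_step / dist
    return tuple(e + d * scale for e, d in zip(effector, delta))

# a light prop keeps up with the hand; a sledgehammer lags behind
print(weighted_follow((0, 0, 0), (0, 1, 0), mass=1.0,  dt=0.1))
print(weighted_follow((0, 0, 0), (0, 1, 0), mass=40.0, dt=0.1))
```

Seeing the prop trail your real hand is exactly the mismatch that the brain apparently reinterprets as heaviness.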

With the gun, I have tried your options too, but nothing beats actual hand tracking: having a Razer Hydra following the hand (and gun), pulling the trigger. Unfortunately not everyone has one, so I have a fall-back mechanism which uses mouse movements (or the right stick on an Xbox controller, if they choose to use that) for aiming.

I never did close one eye like you said; maybe because I have no experience with real rifles and how to use them?

Pascal.

Very cool! Would be great if it had improved support for sitting down, since that’s what most people will be doing in VR. I did see that simply sitting causes the avatar to move into a seated position, so it’s already part way there. A few things that would make it even better:

  1. Being able to recenter to a seated rather than standing position.

  2. When seated, having the hips be glued to the seat rather than swaying.

  3. A general reduction or elimination of swaying when seated. When I sat down and remained motionless, the avatar was moving quite a lot.

  4. An option to have the hands on a table or on a controller instead of by the side?

I’m sure people could do most of this themselves using Final IK (which I’ve purchased already), but it would be useful to have it as a built in demo like you’ve done with various other aspects of Final IK.

When do you expect to release this?

Indeed, VR sure opens new doors for psychologists and neuroscientists; I bet there will be a bunch of papers written soon. :slight_smile: There are a number of phenomena regarding weight perception and estimation, such as the Charpentier illusion. I wonder how they apply to VR…

Hand controllers of course are great, but as you said, not many people have them yet, so I’ll be concentrating on just the head at this time.

I have one more cool idea for aiming guns… If you take off the headset, turn it sideways, cover one of the lenses, and look at the other from a short distance, it’s exactly like looking through the scope of a sniper rifle. Even the lens distortion adds to the effect. Makes me want to attach a trigger to the headset. :smile:

In a similar fashion, why not use the headset as a steering wheel for a driving game if you don’t have one. :wink: It’s shaped almost like an F1 steering wheel anyway.

Hi, I can make a sitting demo, no problem; it’s just attaching the body/thigh effectors to the seat.
Also, the hands can be pinned to anything you need, as you can see from the interaction demo.
I don’t know yet how long it will take; it’s a lot of inventing. But I plan to do weapon aiming and movement before it’s finished. Then, once it’s clear the main package won’t need any more changes to support VR, I’ll pack it up and upload it to the Store. The VR demos will probably be a separate (free) package, because I don’t want to include the OVR assets and plugins; they are updated by Oculus.
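Effector pinning of this kind is, at its core, a weighted blend between the animated position and the pin target. A toy sketch (hypothetical function, mirroring the idea behind an effector’s position weight):

```python
def pin_effector(bone_pos, pin_pos, weight):
    """Blend a body part's animated position toward a pin target (a seat,
    a prop). weight 0 leaves the animation untouched; weight 1 glues the
    part to the target. Positions are (x, y, z) tuples."""
    return tuple(b + (p - b) * weight for b, p in zip(bone_pos, pin_pos))

# hips glued to the seat regardless of the idle animation's sway
print(pin_effector((0.1, 0.95, 0.0), (0.0, 0.55, 0.1), weight=1.0))
```

With the hips pinned at full weight, the idle animation can keep playing without the seated avatar sliding around.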

Cheers,
Pärtel

Hi all,

I got a shooting demo almost ready, have a look. :slight_smile:
I went for the “simulator mode”, where you aim the gun with your mouse, and if you want to take an accurate shot, you move your head and close the other eye so you can see down the sight as you would with a real rifle. I think it’s more fun, challenging and realistic than just having the gun parented to the head.

I actually had to configure the dioptre of the gun model for precision. Shit is getting real! :slight_smile:

If you are left-handed like myself, hit “H” to switch.

You can press “R” at any time to recenter. You’d better sit/stand straight while re-centering, so you can move your head down a bit to take a shot.

And of course you can shoot with LMB.

You can also move around with WASD to see what you hit, but the legs are not moving along yet, I’ll address that later.
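The recenter step bound to “R” above can be sketched as capturing the current raw head pose as the new origin (a hypothetical, yaw-only version, so pitch and roll are left alone):

```python
import math

def make_recenter(raw_pos, raw_yaw):
    """Capture the current raw head pose as the new origin (yaw-only,
    so looking up/down at the moment of recentring has no effect).
    Returns a function mapping later raw poses into recentred space."""
    origin, yaw0 = raw_pos, raw_yaw
    def apply(pos, yaw):
        # translate into the captured frame, then undo the captured yaw
        dx, dy, dz = (p - o for p, o in zip(pos, origin))
        c, s = math.cos(-yaw0), math.sin(-yaw0)
        return (c * dx + s * dz, dy, -s * dx + c * dz), yaw - yaw0
    return apply

recenter = make_recenter((2.0, 1.6, 5.0), math.pi / 2)
pos, yaw = recenter((2.0, 1.6, 5.0), math.pi / 2)
# at the moment of recentring the head sits at the origin, facing forward
```

Sitting or standing straight when the origin is captured is what leaves you the headroom to lean down onto the sight afterwards.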

Cheers,
Pärtel

Updated the build.

Now you can walk around with the feet moving along, and do ratchet rotations with Q/E. I also limited the up/down aim angle and added some motion to the gun when walking, so it doesn’t feel magically locked to the universe. :slight_smile:
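The ratchet turning and the aim limit are simple enough to sketch (the step size and limits here are my own guesses, not the demo’s values):

```python
def ratchet_turn(yaw_deg, direction, step=30.0):
    """Snap-turn the body by a fixed increment (Q/E style) instead of
    smooth rotation, which tends to be more comfortable in VR."""
    return (yaw_deg + direction * step) % 360.0

def clamp_aim(pitch_deg, lo=-60.0, hi=60.0):
    """Keep the gun's up/down aim inside a comfortable cone."""
    return max(lo, min(hi, pitch_deg))

print(ratchet_turn(0.0, +1))   # one press of E
print(clamp_aim(80.0))         # over-rotation gets clamped
```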

Cheers,
Pärtel


Another update.

Made some things work better and now you can hold RMB to aim with your head.


Do you plan on releasing this in any way for people to use with their Oculus VR projects?

Hi, yes it will be a free add-on to Final IK.


It’s done! :slight_smile:

You can download the OVR demos from here:

Unity 4
Unity 5

They work with the new Final IK 0.5

Cheers,
Pärtel

@partel …do you have any problem with people using this for commercial projects, whether freelance or other?