Roomscale vs. Seated vs. Standing

How do we specify that an Oculus/Vive app is meant to be a seated experience instead of roomscale? I’ve found questions with snippets of answers from older versions of Unity but nothing official and no documentation.

In my case, I’m working on an “in the cockpit” experience. I can’t have the player’s camera “stand up and walk a meter to the left” because that puts them outside the cockpit, looking at their own avatar from the outside.

So I’ve tried calls like:

XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary);
InputTracking.Recenter();

The former isn’t really documented at all and the documentation for the latter teases the reader with hints of unrevealed knowledge:

Not that it tells you how to set up a seated, standing, or roomscale experience. Or links to anything. Or mentions anything at all.

The call to SetTrackingSpaceType appears to have no effect at all on Oculus Rift.

So I’ve implemented my own VR camera “constraint” system that forces the VR camera to stay within 0.25 meters of the cockpit player’s virtual head. This “works” - but it feels like I’m fighting the system out of simple ignorance. It also pins the player to an initial “roomscale” position that may not be what they wanted, and it keeps the guardian system flashing at you constantly if you try to play seated at a desk near the edge of the play area.
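For reference, a minimal sketch of what I mean by a “constraint” system - this script and its field names (`vrCamera`, `seatAnchor`, `maxRadius`) are my own, not Unity API; it just shifts the rig so the tracked head stays near the seat:

```csharp
using UnityEngine;

// Sits on a parent ("rig") transform of the VR camera. Each frame, if the
// tracked head has drifted past maxRadius from the seat anchor, the whole
// rig is pulled back so the head stays inside the radius.
public class CockpitCameraClamp : MonoBehaviour
{
    public Transform vrCamera;   // the camera the XR system moves
    public Transform seatAnchor; // where the pilot's head should be
    public float maxRadius = 0.25f;

    void LateUpdate()
    {
        Vector3 offset = vrCamera.position - seatAnchor.position;
        if (offset.magnitude > maxRadius)
        {
            // Move the rig by the excess so the head lands on the boundary.
            transform.position -= offset - offset.normalized * maxRadius;
        }
    }
}
```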

So what is the proper, correct way to do a “seated” experience? How do we even specify that we WANT a seated experience with the latest 2017.3 XR API’s?

Any guidance on this would be greatly appreciated. Thanks!


I also tried many of the shenanigans in this thread - which is just a series of wild experiments undertaken by developers without documentation. That was a joyride.

Also, this does almost exactly what a cockpit game needs, but restricts even the minute head movements that make higher end VR so polished:

InputTracking.disablePositionalTracking = true;

I was hoping that something like SetTrackingSpaceType would be the richer version of this, but as mentioned, it seems to do nothing at all on Oculus Rift.

You probably don’t want to use disablePositionalTracking. What that does is hard-remove the position element of the camera update; it’s intended for 360 video scenarios. I need to double-check what actually gets set with regard to seated/standing.

Changing the seated/standing setting changes how the HMD-relative space is reported. For seated, (0,0,0) in HMD-relative space is the position the HMD was calibrated at; for roomscale, (0,0,0) is on the floor, and the HMD is offset from that point.

Changing to “Seated” does not stop the HMD from reporting lateral position changes away from the “center” position, i.e. the position the HMD was calibrated at using recenter. All it does is change whether your camera should start at the calibrated head height (seated) or on the floor (roomscale) to get the right result.
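In other words, with the 2017.3 API the setup is just the two calls from the original post, and the origin semantics are the only thing that changes - a sketch:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Requests the stationary (seated) tracking space and re-centers on it.
// Note the origin semantics: Stationary puts (0,0,0) at the calibrated
// head position; RoomScale puts (0,0,0) on the floor.
public class SeatedSetup : MonoBehaviour
{
    void Start()
    {
        // SetTrackingSpaceType returns false if the runtime refuses the
        // request (which, per the thread, appears to happen on Oculus Rift).
        if (XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary))
            InputTracking.Recenter(); // make the current pose the new origin
    }
}
```

Neither call constrains the user: the HMD will still report lateral movement away from the origin, which is why the options below exist.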

The problem you actually want to solve is “how do I constrain the user to a particular space?”. You have a few different options here.

You could use the Tracked Pose Driver and set the tracking to rotation only. However, this is likely to introduce comfort issues when the player looks around the cabin and moves their head but only the rotation is applied - especially when used in conjunction with joystick-style locomotion.
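A sketch of that first option, assuming a Tracked Pose Driver component already sits on the camera object (you can also just pick “Rotation Only” in the inspector):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

// Switches an existing TrackedPoseDriver to apply only the HMD's rotation,
// discarding positional tracking entirely.
public class RotationOnlyCamera : MonoBehaviour
{
    void Start()
    {
        var driver = GetComponent<TrackedPoseDriver>();
        driver.trackingType = TrackedPoseDriver.TrackingType.RotationOnly;
    }
}
```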

Or you could set up some form of “bounding volume,” so that if the user translates far enough away from their calibrated seated location, you fade the screen to black and ask whether they want to re-center at their current location and/or return to their previous position to continue the game. This avoids the comfort/dislocation problems of removing the position component of the headset motion (and also means that any head modelling on mobile devices will be correct), while driving home that translating outside the prescribed volume is not something you want the user doing.
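A minimal sketch of that second option. The distance check is concrete; the fade and the re-center prompt (`FadeToBlack`, `ShowRecenterPrompt`) are hypothetical hooks you would implement with your own UI:

```csharp
using UnityEngine;

// Watches the tracked head against a seated "bounding volume" and reacts
// when the user leaves it. Field names here are illustrative, not Unity API.
public class SeatedBoundsWatcher : MonoBehaviour
{
    public Transform head;        // the tracked VR camera
    public Transform seatCenter;  // the calibrated seated position
    public float allowedRadius = 0.5f;

    bool wasOutside;

    void Update()
    {
        bool outside = (head.position - seatCenter.position).sqrMagnitude
                       > allowedRadius * allowedRadius;
        if (outside && !wasOutside)
        {
            // FadeToBlack();        // hypothetical: fade the screen out
            // ShowRecenterPrompt(); // hypothetical: "re-center here?" UI
        }
        wasOutside = outside;
    }
}
```

Using squared magnitude avoids a square root per frame; the edge-trigger (`wasOutside`) keeps the prompt from firing every frame while the user is outside the volume.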
