How to translate VR/XR InputTracking.GetLocalRotation into WorldSpace?

I’d like to know the origin rotation — the (0, 0, 0) of a VR/AR tracking space — that’s used as the reference rotation for GetLocalRotation. Is there a way to get this in UnityEngine.XR?


I have the position working correctly, meaning I can get the world-space position of the reference point used for all InputTracking.GetLocalPosition() calls. I have the rotation working for only a single case: when there is no rotation on any of the VR/AR head node’s parents. Get the tracking spaces origin position and rotation · GitHub This doesn’t work if the VR/AR head node’s parent rotation changes to anything else, say 60 degrees in X.


I want to do this so my tracked objects don’t need a common parent. I can have a VRNode.Head that’s a child of something called “feet”, while other objects not part of that hierarchy at all can still make use of the VR/AR positioning space. For example, I can keep a sword lying on the floor outside of the VR/AR hierarchy, yet have its position be correct when it’s held by a tracked hand.

I’m not sure I understand your question, but I think you’re confusing the scene’s world space with the tracking space of whatever XR input system you’re using. There is no world tracking coordinate because the tracking system isn’t part of the scene. It’s apples and oranges.

The easiest way I know to integrate tracking is to create an empty game object that represents your tracker origin (if you’re using a Rift or Vive with a standing/room-scale experience, this is usually some point on the floor where you initialized your system) in world scene coordinates. Then add empty game objects for each XR tracked node, and create a script to update those node objects with tracker positions. You can then query those objects for world position. If you want to navigate the scene beyond your tracking area, implement some sort of motion model that moves that tracking-origin object through the virtual world.
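A minimal sketch of that node-update script, assuming the older UnityEngine.XR InputTracking API from the question. The component is meant to sit on an empty child of the tracking-origin object; the class name is my own:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Copies XR tracking data into this object's local transform each frame.
// Because the object is a child of the tracking-origin object, its
// Transform.position is then already in world scene coordinates.
public class TrackedNodeFollower : MonoBehaviour
{
    public XRNode node = XRNode.Head;

    void Update()
    {
        transform.localPosition = InputTracking.GetLocalPosition(node);
        transform.localRotation = InputTracking.GetLocalRotation(node);
    }
}
```

Moving the tracking-origin object (your motion model) then carries all of the tracked children along with it for free.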

I’d re-examine why you don’t want to use parenting. The easiest way to grab the sword in your example is to parent it to one of your hand game objects with a zero offset; it’ll move with your hand. But if you really don’t want to maintain a hierarchy, a script that sets the sword’s world position from the hand game object’s world position is easy.
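A sketch of that no-hierarchy option, assuming a `hand` Transform that is already being driven by tracking data (the field name is illustrative):

```csharp
using UnityEngine;

// Copies the hand's world pose onto this object (e.g. the sword)
// each frame, with no parent/child relationship required.
public class FollowWorldPose : MonoBehaviour
{
    public Transform hand;

    void LateUpdate()
    {
        if (hand == null) return;
        transform.SetPositionAndRotation(hand.position, hand.rotation);
    }
}
```

Running this in LateUpdate ensures the hand has already been updated with the frame’s tracking data before the sword copies it.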

No problem. My suggestion may not be what you’re looking for.

Absolutely, you can do the calculations manually, but it’s a lot harder. When you create a parent/child relationship between game objects in Unity, you’re establishing a local coordinate system for the child in the parent’s Transform coordinate frame. To get the child’s world position, you first multiply together the Transforms of all the ancestors from the scene root down to the child, then decompose the result into a translation, rotation, and scale.
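For position and rotation only (ignoring scale), that composition can also be done directly with vectors and quaternions. A sketch, assuming you already know the tracking space’s world pose as `originPos`/`originRot` (names are mine):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Composes a tracked node's local pose with the tracking space's world
// pose by hand. Scale is ignored; originPos/originRot are assumed to
// describe the tracking-space origin in scene world coordinates.
public static class ManualPoseMath
{
    public static Vector3 WorldPosition(Vector3 originPos, Quaternion originRot, XRNode node)
    {
        // Rotate the local offset into world orientation, then translate.
        return originPos + originRot * InputTracking.GetLocalPosition(node);
    }

    public static Quaternion WorldRotation(Quaternion originRot, XRNode node)
    {
        // Rotations compose parent-first: origin rotation, then local.
        return originRot * InputTracking.GetLocalRotation(node);
    }
}
```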

Last time I looked, you couldn’t create Transforms outside the scene structure (except by creating game objects). However, you can certainly create a Matrix4x4 (SetTRS) that contains the transform of the tracking space in scene coordinates, and another Matrix4x4 for the local tracker offset. Multiply them together and decompose the resulting Matrix4x4 into a TRS. I don’t think Unity has a method to do that, but you can find one on the internet.
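A sketch of the matrix route, assuming uniform unit scale so the decomposition stays simple. The basis-vector trick for pulling the rotation back out is a common one, not a built-in Unity decompose method:

```csharp
using UnityEngine;

// Composes a tracking-space pose with a local tracker offset using
// Matrix4x4, then extracts translation and rotation from the product.
public static class TrsExample
{
    public static void Compose(
        Vector3 originPos, Quaternion originRot,
        Vector3 localPos, Quaternion localRot,
        out Vector3 worldPos, out Quaternion worldRot)
    {
        Matrix4x4 origin = Matrix4x4.TRS(originPos, originRot, Vector3.one);
        Matrix4x4 local  = Matrix4x4.TRS(localPos, localRot, Vector3.one);
        Matrix4x4 world  = origin * local;

        // Translation is the fourth column; rotation can be rebuilt from
        // the matrix's forward (column 2) and up (column 1) basis vectors.
        worldPos = world.GetColumn(3);
        worldRot = Quaternion.LookRotation(world.GetColumn(2), world.GetColumn(1));
    }
}
```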

It’s a lot easier (and more efficient) to create a game object for your tracking space in the scene, and then child another game object to it to represent your hand, for instance. Then update the hand’s local position with XR tracking data. The Transform.position property of the hand is then its position in world coordinates. You don’t have to do any math.