Converting external 3D position and orientation to Unity world coordinates

Hi everyone,

This is my first time posting on this forum, so I may well be posting in the wrong place.
I know 3D passage (change-of-basis) matrices are well known, but I'm new to 3D development and I really couldn't find a proper answer to my problem on the net or on this forum.

So here's my problem:

I'm using a HoloLens 1, which means that when I launch an app, the 3D scene initializes and sets the world origin and orientation at the HoloLens: the initial rotation is (0, 0, 0) in yaw/pitch/roll and the world origin is at my camera position.
I'm also using an external tracking tool (OptiTrack), which gives me the position (x, y, z) and orientation (quaternion or yaw/pitch/roll) of the HoloLens in its own coordinate system.

I need a way to transform these coordinates so they match the ones in Unity. That is, when I launch the app, I send a network message to the OptiTrack system, get its current coordinates for the HoloLens, and then calculate a passage matrix from that.
My main goal is to know where other objects tracked by the OptiTrack system are in the Unity space and to place them accordingly (probably with cubes at first, to see their positions and orientations). I think I need this passage matrix so that every object in the OptiTrack coordinate system can be transformed into the Unity coordinate system, and then I just have to do something like:
O = object in OptiTrack coordinates (position and orientation)
O’ = object in Unity coordinates (position and orientation)
P = passage matrix

O’ = PO.
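
In Unity terms, I imagine something like the sketch below (all the names are mine, nothing from an SDK, and I'm not sure this is the right way to build P): use the HoloLens pose, known in both systems at launch, to build the passage matrix and then apply it to every other tracked object.

```csharp
using UnityEngine;

// Hypothetical sketch (my own naming, nothing from an SDK): at launch the
// HoloLens pose is known both in Unity (the camera) and in OptiTrack
// (streamed over the network), and from those two poses I would build P.
public class OptiTrackToUnity : MonoBehaviour
{
    // Pose of the HoloLens as reported by OptiTrack at launch,
    // already converted to Unity's handedness (see the PS below).
    public Vector3 optiHoloPos;
    public Quaternion optiHoloRot = Quaternion.identity;

    // P maps OptiTrack coordinates into Unity world coordinates.
    Matrix4x4 passage;

    void Start()
    {
        // Unity pose of the HoloLens at launch: the main camera.
        Matrix4x4 unityHolo = Matrix4x4.TRS(
            Camera.main.transform.position,
            Camera.main.transform.rotation,
            Vector3.one);

        // OptiTrack pose of the same HoloLens at the same instant.
        Matrix4x4 optiHolo = Matrix4x4.TRS(optiHoloPos, optiHoloRot, Vector3.one);

        // O' = P * O  =>  P = unityHolo * optiHolo^-1
        passage = unityHolo * optiHolo.inverse;
    }

    // Place any other tracked object (e.g. a cube) from OptiTrack data.
    public void PlaceTracked(Transform cube, Vector3 optiPos, Quaternion optiRot)
    {
        Matrix4x4 o = passage * Matrix4x4.TRS(optiPos, optiRot, Vector3.one);
        cube.position = o.GetColumn(3);
        cube.rotation = o.rotation;
    }
}
```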

Does anyone know how I can do this?

PS: in case the X/Y/Z axes of OptiTrack are not the same as in Unity (-Z instead of Z, or left- vs right-handed), will the solution still work? How would it need to change?

Thanks in advance for your answers, and sorry if it's a trivial question (as far as I looked on the net, I couldn't find an easily understandable answer for the total noob I am in 3D and with Unity).

Unity uses a left-handed coordinate system, with X to the right, Y straight up, and Z going away from you when you are at the identity rotation (0, 0, 0).

Whatever coordinate system the stuff you're talking about uses, you should be able to remap and/or rescale it, but you have to know the actual data.

Produce meaningful test data (like one single point), figure out what it should be in Unity, derive the mapping, write the code and test it.

For an example of this, Blender3D uses a right-handed coordinate system. When a Blender file is imported into Unity3D, Unity's importer transmogrifies the base GameObject so that it more or less works just right, basically by swapping around some axes and then applying a (-90, 0, 0) rotation to the base object.
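
For instance, if your tracker streams right-handed, Y-up data (check what yours actually sends!), one common remap into Unity's left-handed frame is to mirror a single axis, roughly like this sketch:

```csharp
using UnityEngine;

public static class HandednessRemap
{
    // One common right-handed (Y-up) to left-handed (Unity) remap:
    // mirror the X axis. Which axis to mirror depends on what the
    // tracker actually streams, so verify with real test data first.
    public static Vector3 Position(Vector3 rh)
    {
        return new Vector3(-rh.x, rh.y, rh.z);
    }

    public static Quaternion Rotation(Quaternion rh)
    {
        // Mirroring X means negating the Y and Z quaternion components.
        return new Quaternion(rh.x, -rh.y, -rh.z, rh.w);
    }
}
```

Which axis you mirror (and therefore which quaternion components you negate) depends entirely on the tracker's convention, which is why you want that single-point test data first.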

Hi,

First of all, thank you for your answer.
The solution you're proposing is the first thing I tested. Just nullifying the initial position/rotation offsets does not work correctly (or I don't understand what "derive the mapping" means, which is also totally possible).

Even if I tell my system that whatever the initial position and orientation are, they count as (0, 0, 0), when I rotate my object its new values will depend on the base axes of my OptiTrack coordinate system and thus will not align properly with the Unity ones. The same goes for position: when I move my object 10 cm forward in Unity, this may come out as something like 3.4 on X and -5.4 on Y in the OptiTrack coordinate system.

So even if I synchronize the initial position of my object in both coordinate systems, I can't find its correct position/orientation after any movement.

Another idea I had was to use two objects (the HoloLens and my smartphone, for example) to define a forward vector in Unity. That way, in my OptiTrack coordinate system, I have the forward vector of Unity. However, I don't know how to use this information to find the correct mapping from OptiTrack coordinates to Unity coordinates.
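
To make that idea concrete, here is a rough sketch of what I'm thinking (my own naming, and it assumes both systems share the same up axis so that only a yaw offset has to be found), but I have no idea whether this is the right approach:

```csharp
using UnityEngine;

public static class ForwardVectorAlignment
{
    // Given the same physical direction measured in both systems
    // (e.g. HoloLens -> smartphone), return the rotation that turns
    // OptiTrack directions into Unity directions. This assumes both
    // systems are gravity aligned (Y up), so only the yaw differs.
    public static Quaternion YawOffset(Vector3 forwardInOpti, Vector3 forwardInUnity)
    {
        // Project onto the horizontal plane to isolate the yaw difference.
        Vector3 a = Vector3.ProjectOnPlane(forwardInOpti, Vector3.up).normalized;
        Vector3 b = Vector3.ProjectOnPlane(forwardInUnity, Vector3.up).normalized;
        return Quaternion.FromToRotation(a, b);
    }
}
```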

Concerning the left/right-handed coordinate system, thank you for your answer, that's what I'll do :)

That indicates problems other than just the different coordinate systems. Did you properly calibrate and align the OptiTrack system? I believe there is a right-angle-looking tool included to set your origin after the "wanding" step. The problem you describe sounds like your origin is diagonally aligned (or not set at all) or something similar.

I'm pretty sure that for my thesis we put all tracked objects into a MotiveSpace parent object, which was rotated by 90° around the Y axis. We put a lot of calibration effort into making sure the different sensors we used worked together in the same virtual space, so I'm not entirely sure whether that's what the 90° were for, but given that the object is called MotiveSpace (Motive being the OptiTrack software we used), it should be worth a try. Only after fixing the diagonal movement issue, though.


Hi Yoreki, thanks for your answer.

It is correctly calibrated and aligned with my physical environment.
The main problem lies in the fact that the Unity space orientation and (0, 0, 0) position are initialized when I start the Unity app on the HoloLens, and I can't ensure that the HoloLens is physically aligned with my OptiTrack axes at that moment (the HoloLens is on my head, and I can't position myself and orient my head exactly so that it matches the OptiTrack axes when I start the app). There are small deviations, so looking at x = 0.10 in Unity does not mean I'm looking exactly along the X axis in OptiTrack, but probably at something like 0.8 < x < 1.2 with some non-zero values on the Y and Z axes.

I'm also using Motive, and after Kurt-Dekker's answer that is what I did to compensate for the left/right-handed problem.
However, as stated before, this does not compensate for the initial deviation from the OptiTrack axes.

We used an Oculus Quest and ran into similar problems. We ended up with a somewhat complex system.
For the real-time tracking we relied on the sensors in the Quest itself (since they were superior to OptiTrack in our setup), but the initial position of the Quest was not guaranteed to be perfect in relation to the Motive-space objects or the Unity scene. So, like @Kurt-Dekker mentioned, we attempted to derive the precise mapping.

Our solution was rather specific, so I'm not sure how much of it will help with your setup. In our case we realised that the hand tracking of the Quest is more or less accurate relative to the headset. So we came up with an easy-to-re-create scene in which the hands touch specific parts of a table, with the head position known relative to it. We then manually re-positioned the GameObject to fit the real scene. The (opti-)tracked position of the headset at that location served as our ground truth, so to speak. The hands here mostly serve to make the "object" larger, which minimized errors, since humans are not perfect and this is a manual step.

[Attachment: optiHands.png]

Afterwards we took several measurements of poses (different positions and rotations of the HMD in 3D space), which resulted in two sets of data: the virtual positions and rotations in Unity, as well as the positions and rotations returned by OptiTrack. We then used a heuristic approach (read: brute force) over the sum of distances and differences in angles to map the two datasets onto each other. From then on we were able to apply this mapping to the HMD object, which corrected the offset, and from that point we relied solely on the Quest sensors for movements and rotations.
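
Roughly, the brute-force part looked something like the sketch below (reconstructed from memory and simplified to positions only, with a yaw-plus-translation search, so treat it as an illustration rather than our exact code):

```csharp
using UnityEngine;

public static class PoseSetAlignment
{
    // Brute-force search for the yaw + translation that best maps
    // the OptiTrack sample positions onto the Unity sample positions.
    // Assumes both systems already share the same up axis and scale.
    public static void Fit(Vector3[] opti, Vector3[] unity,
                           out Quaternion bestRot, out Vector3 bestOffset)
    {
        bestRot = Quaternion.identity;
        bestOffset = Vector3.zero;
        float bestError = float.MaxValue;

        for (float yaw = 0f; yaw < 360f; yaw += 0.1f)
        {
            Quaternion rot = Quaternion.Euler(0f, yaw, 0f);

            // With the rotation fixed, the best translation aligns the centroids.
            Vector3 offset = Centroid(unity) - rot * Centroid(opti);

            float error = 0f;
            for (int i = 0; i < opti.Length; i++)
                error += (rot * opti[i] + offset - unity[i]).sqrMagnitude;

            if (error < bestError)
            {
                bestError = error;
                bestRot = rot;
                bestOffset = offset;
            }
        }
    }

    static Vector3 Centroid(Vector3[] points)
    {
        Vector3 sum = Vector3.zero;
        foreach (Vector3 p in points) sum += p;
        return sum / points.Length;
    }
}
```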

[Attachment: optiPointMatch.jpg]

We ended up with a deviation of roughly 0.6 mm between the data points, which is a lot better than trying to position everything manually each time. A perfect mapping is not possible, due to small inaccuracies in the measurements (both ground truth and OptiTrack).

I hope this helps in some way. The explanation may be a bit superficial, but I was not the one mainly responsible for this mapping, so that's what I remember :)


Thanks for all the detail on your process.

OK, so if I understand correctly, you used multiple objects whose respective positions you knew both in Unity and in OptiTrack, and then you computed a mapping that gives the rotation/position offsets with relatively high precision.

I think this solution would totally work in my setup. It's not a perfect (mathematically exact) solution, but it would be good enough for me.

Thanks again for your help!

That's what I meant when I wrote that a perfect mapping is (likely) not possible. If we had perfectly accurate data sets, we could use some kind of mathematical solver for this. However, since we use real-life data, there are always some inaccuracies, even if only the ones from OptiTrack itself. A perfect solution should be impossible. I think.

It may be possible to achieve higher accuracy than described above with different mapping algorithms, but for practical intents and purposes the deviation was absolutely unnoticeable in the virtual environment. So if there is an actual user behind the HMD, I doubt more accuracy would be needed.

I have a similar problem: I have an external tracking device that produces 3D points in its own coordinate system, and I want to use a simple calibration procedure with three reference points to calculate the transform that maps tracker points into Unity's world space. Does Unity provide functions to do that easily, or does it have to be done manually?
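
What I have so far is a manual sketch along these lines (my own naming, and it assumes the tracker data is already converted to Unity's handedness and scale); I'd be happy to learn there is a built-in way:

```csharp
using UnityEngine;

public static class ThreePointCalibration
{
    // Build the rigid transform (rotation + translation) that maps
    // tracker-space points onto Unity-space points, from three
    // non-collinear reference points measured in both systems.
    public static void Solve(Vector3[] tracker, Vector3[] unity,
                             out Quaternion rotation, out Vector3 translation)
    {
        rotation = BasisRotation(unity) * Quaternion.Inverse(BasisRotation(tracker));
        translation = unity[0] - rotation * tracker[0];
    }

    // Orthonormal basis built from the three points (p[0] as origin).
    static Quaternion BasisRotation(Vector3[] p)
    {
        Vector3 forward = (p[1] - p[0]).normalized;
        Vector3 up = Vector3.Cross(forward, (p[2] - p[0]).normalized);
        return Quaternion.LookRotation(forward, up);
    }

    // Apply the calibration to any further tracker point.
    public static Vector3 ToUnity(Vector3 trackerPoint,
                                  Quaternion rotation, Vector3 translation)
    {
        return rotation * trackerPoint + translation;
    }
}
```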