# About attaching the unity camera to a gyro + magnetometer

My hardware sensor outputs data like this:

```
sensorfused-yaw,sensorfused-pitch,sensorfused-roll,gyro-x,gyro-y,gyro-z,mag-x,mag-y,mag-z
```

The sensor-fused data is rubbish; it makes no sense at all.

mag is in m/s² => mag-value * dT² = delta distance in m

I wanted to attach this sensor to my VR helmet and drive the Unity camera with it. But all I get is rubbish values. Rubbish meaning that the rotation jumps all over the place and doesn't correspond to the input at all.

The goal is to put my VR helmet on, walk around my room, and see the scenery change depending on where I am.

So in general, what is wrong with this code and how could I improve it?

wx etc. hold the total (accumulated) movement.
gx etc. hold the total (accumulated) rotation.

```csharp
bool erstes = true;        // "erstes" = "first": flag for the first reading
...

if (serialPort1.IsOpen)
{
    string[] werte = ausgabe.Split(',');    // "werte" = "values"
    if (werte.Length == 9)
    {
        if (erstes)
        {
            a = DateTime.Now;
            erstes = false;
        }
        float.TryParse(werte[6], out magnetometer.x);
        float.TryParse(werte[7], out magnetometer.y);
        float.TryParse(werte[8], out magnetometer.z);

        // subtract gravity: low-pass filter to isolate the constant component
        gravity[0] = alpha * gravity[0] + (1 - alpha) * magnetometer.x;
        gravity[1] = alpha * gravity[1] + (1 - alpha) * magnetometer.y;
        gravity[2] = alpha * gravity[2] + (1 - alpha) * magnetometer.z;

        DateTime b = DateTime.Now;
        TimeSpan diff = b - a;
        a = b;
        linear_acceleration[0] = magnetometer.x - gravity[0];
        linear_acceleration[1] = magnetometer.y - gravity[1];
        linear_acceleration[2] = magnetometer.z - gravity[2];

        // "Weg" = "distance": s = a * dT²
        WegX = linear_acceleration[0] * (float)(diff.TotalSeconds * diff.TotalSeconds);
        WegY = linear_acceleration[1] * (float)(diff.TotalSeconds * diff.TotalSeconds);
        WegZ = linear_acceleration[2] * (float)(diff.TotalSeconds * diff.TotalSeconds);

        wx += WegX;
        wy += WegY;
        wz += WegZ;

        transform.Translate(WegX, WegY, WegZ);
        transform.rotation = Quaternion.Euler(gx, gy, gz);
    }
}
```

Don’t everybody speak up at once.

If your sensor readings are “not comparable to the input,” you can stop there. You’re either misinterpreting the outputs, or the sensor is damaged.

If your sensor readings are just six axes of acceleration and rotation rates, you're not going to be able to integrate them back into a room-space position with any accuracy. You'd have to apply a Kalman filter to the sensor outputs, and filtering introduces some lag. Without feedback from another method of position detection, your calculated room-space position and orientation will drift away from the real position and orientation within a few seconds. Depending on the sensors' ratings, if you ever shake the device too vigorously, or even knock on it gently with your hand, the readings may spike and saturate, causing even more drift.
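To get a feel for how quickly pure double integration drifts, here is a toy, non-Unity sketch (plain C#; the bias value is made up for illustration, but is in the range of a typical consumer accelerometer offset):

```csharp
// Toy sketch, NOT Unity code: double-integrate acceleration that carries
// a small constant bias, while the device actually never moves.
double bias = 0.05;   // m/s², assumed constant sensor bias
double dt = 0.01;     // 100 Hz sample rate
double v = 0, x = 0;  // integrated velocity and position

for (int i = 0; i < 500; i++)   // simulate 5 seconds
{
    double accel = 0.0 + bias;  // true acceleration is zero
    v += accel * dt;            // first integration: velocity
    x += v * dt;                // second integration: position
}
// Position error grows quadratically, roughly ½·bias·t²:
// after only 5 s the estimate is off by about 0.6 m
// even though the device sat perfectly still.
```

That quadratic error growth is why an inertial-only approach needs periodic correction from some absolute position reference.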

To really pinpoint location in a room, you will probably need camera-and-marker style sensors first, optionally combined with the accelerometer readings to reduce jitter and lag. Augmented-reality phone apps usually detect and then track visual features from the camera for the same benefit.

We use a Kinect with the Oculus Rift to track body position. Works well.

Yeah, I am doing this as well. Can you answer my question in http://forum.unity3d.com/threads/162075-Connecting-Kinect-Unity-with-Official-SDK?p=1368767&viewfull=1#post1368767 ?

halley, how do I find out what my sensor outputs?

Collect some data under controlled circumstances. Create a large array and a button; gather readings into the array every Update(), then write the values out to a file or something. The key is to confirm the readings you actually get against what is documented for the sensor.
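A minimal logging sketch, assuming a MonoBehaviour that receives each raw serial line (the class name, Record() method, and key binding here are illustrative, not from your code):

```csharp
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Illustrative sketch: buffer raw readings each frame, dump them on a key press.
public class SensorLogger : MonoBehaviour
{
    private readonly List<string> samples = new List<string>();

    // 'ausgabe' stands in for the raw serial line you already read elsewhere.
    public void Record(string ausgabe)
    {
        samples.Add(Time.time + "," + ausgabe);
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.L))
        {
            File.WriteAllLines("sensorlog.csv", samples.ToArray());
            samples.Clear();
        }
    }
}
```

Then rotate the sensor exactly 90° about one axis, press L, and compare the logged numbers against the datasheet's stated units and ranges.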

In re-reading your code and docs above, I noticed that you state the gyro reads in rad/s, but Unity’s Quaternion.Euler() function defaults to angles in degrees. I don’t see any obvious radians-to-degrees conversions there. Perhaps you’re getting values in a small -6 to +6 radians range, even if you are twisting the gyro sensor like mad… where you were expecting values like -360 to +360 degrees.