
About attaching the unity camera to a gyro + magnetometer

Discussion in 'Scripting' started by Sounds-Wonderful, Sep 25, 2013.

  1. Sounds-Wonderful

    Sounds-Wonderful

    Joined:
    May 15, 2013
    Posts:
    106
    My hardware sensor outputs data like this:

    sensorfused-yaw,sensorfused-pitch,sensorfused-roll,gyro-x,gyro-y,gyro-z,mag-x,mag-y,mag-z

    The sensorfused data is rubbish; it makes no sense at all.

    gyro is rad/s => gyro-value * dT = delta-angle-in-rad
    mag is m/s² => mag-value * dT² = delta-distance-in-m

    I wanted to attach this sensor to my VR helmet and drive the camera in Unity with it. But all I get is rubbish values: the rotation is all over the place and doesn't correspond to the input at all.

    The goal is to put the VR helmet on, walk around my room, and see the scenery change depending on where I am.

    So in general, what is wrong with this code and how could I improve it?

    wx etc. means the total (accumulated) movement.
    gx etc. means the total (accumulated) rotation.

    Code (csharp):
    bool erstes = true;
    ...

    if (serialPort1.IsOpen)
    {
        string ausgabe = serialPort1.ReadLine();
        string[] werte = ausgabe.Split(',');
        if (werte.Length == 9)
        {
            if (erstes)
            {
                a = DateTime.Now;
                erstes = false;
            }
            float.TryParse(werte[6], out magnetometer.x);
            float.TryParse(werte[7], out magnetometer.y);
            float.TryParse(werte[8], out magnetometer.z);
            // subtract gravity
            gravity[0] = alpha * gravity[0] + (1 - alpha) * magnetometer.x;
            gravity[1] = alpha * gravity[1] + (1 - alpha) * magnetometer.y;
            gravity[2] = alpha * gravity[2] + (1 - alpha) * magnetometer.z;

            DateTime b = DateTime.Now;
            TimeSpan diff = b - a;
            a = b;
            linear_acceleration[0] = magnetometer.x - gravity[0];
            linear_acceleration[1] = magnetometer.y - gravity[1];
            linear_acceleration[2] = magnetometer.z - gravity[2];

            WegX = linear_acceleration[0] * (float)(diff.TotalSeconds * diff.TotalSeconds);
            WegY = linear_acceleration[1] * (float)(diff.TotalSeconds * diff.TotalSeconds);
            WegZ = linear_acceleration[2] * (float)(diff.TotalSeconds * diff.TotalSeconds);

            wx += WegX;
            wy += WegY;
            wz += WegZ;

            float.TryParse(werte[3], out GradX);
            float.TryParse(werte[4], out GradY);
            float.TryParse(werte[5], out GradZ);

            GradX = GradX * (float)diff.TotalSeconds;
            GradY = GradY * (float)diff.TotalSeconds;
            GradZ = GradZ * (float)diff.TotalSeconds;

            GradX = GradX / 10;
            GradY = GradY / 10;
            GradZ = GradZ / 10;

            gx += GradX;
            gy += GradY;
            gz += GradZ;

            transform.Translate(WegX, WegY, WegZ);
            transform.rotation = Quaternion.Euler(gx, gy, gz);
        }
    }
     
  2. Sounds-Wonderful

    Sounds-Wonderful

    Joined:
    May 15, 2013
    Posts:
    106
    Don't everybody speak up at once.
     
  3. halley

    halley

    Joined:
    Aug 26, 2013
    Posts:
    2,451
    If your sensor readings are "not comparable to the input," you can stop there. You're either misinterpreting the outputs, or the sensor is damaged.

    If your sensor readings are just six axes of inertial data (angular rates plus accelerations), you're not going to be able to integrate them back into a room-space position with any real accuracy. You'll have to apply a Kalman filter to the sensor outputs, but filtering introduces a bit of lag. Without feedback from another method of position detection, your calculated room-space position and orientation will start to drift away from the real position and orientation within a few seconds. Depending on the ratings of the sensors, if you ever shake the device too vigorously or even knock on it gently with your hands, your readings may spike and saturate and cause even more drift.
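
    To give a flavour of what such filtering does, here is a minimal single-axis sketch. It is not a real Kalman filter; a complementary filter is the simplest stand-in for the same idea. The inputs gyroYawRateRad (rad/s from the gyro) and magHeadingDeg (an absolute heading derived from the magnetometer) are hypothetical names for this sketch; the point is that the integrated gyro value is continually pulled back toward an absolute reference instead of drifting freely.

    Code (csharp):
    using UnityEngine;

    public class YawComplementaryFilter : MonoBehaviour
    {
        // Blend factor: close to 1 trusts the integrated gyro, close to 0 trusts the magnetometer heading.
        public float alpha = 0.98f;

        // Filtered yaw in degrees.
        public float yawDeg;

        // Call once per sensor sample. gyroYawRateRad is in rad/s, magHeadingDeg is an
        // absolute heading in degrees; both are hypothetical inputs for this sketch.
        // Angle wrap-around at 0/360 degrees is ignored here for brevity.
        public void Step(float gyroYawRateRad, float magHeadingDeg, float dt)
        {
            // Short-term: integrate the gyro rate, converted from radians to degrees.
            float gyroYaw = yawDeg + gyroYawRateRad * Mathf.Rad2Deg * dt;

            // Long-term: blend toward the absolute heading so the estimate cannot drift without bound.
            yawDeg = alpha * gyroYaw + (1f - alpha) * magHeadingDeg;

            transform.rotation = Quaternion.Euler(0f, yawDeg, 0f);
        }
    }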

    To really pinpoint location in a room, you will probably need camera-and-marker style tracking first, optionally combined with the accelerometer readings to reduce jitter and lag. Augmented-reality phone apps usually try to detect and then track visual features from the camera for the same benefit.
     
  4. lmbarns

    lmbarns

    Joined:
    Jul 14, 2011
    Posts:
    1,628
    We use a Kinect with the Oculus Rift to track body position. Works well.
     
  6. Sounds-Wonderful

    Sounds-Wonderful

    Joined:
    May 15, 2013
    Posts:
    106
    halley, how do I find out what my sensor outputs?
     
    Last edited: Oct 2, 2013
  7. halley

    halley

    Joined:
    Aug 26, 2013
    Posts:
    2,451
    Collect some data under controlled circumstances. Create a large array and a button; gather readings every Update() into the array, then write the values out to a file. The key is to confirm that the readings you actually get match what is documented for the sensor.
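
    Something like this would do it (a rough sketch, using a List rather than a fixed array; the port name "COM3", the baud rate and the file name are placeholders, not values from your setup; SerialPort.ReadLine() blocks, which is acceptable for a quick offline logging test):

    Code (csharp):
    using System.Collections.Generic;
    using System.IO;
    using System.IO.Ports;   // requires the full .NET 2.0 API compatibility level in Unity
    using UnityEngine;

    public class SensorLogger : MonoBehaviour
    {
        // Port name and baud rate are placeholders; match them to your device.
        SerialPort port = new SerialPort("COM3", 115200);
        List<string> samples = new List<string>();

        void Start()
        {
            port.Open();
        }

        void Update()
        {
            // Gather one raw line per frame while the port is open.
            if (port.IsOpen)
            {
                samples.Add(Time.time + ";" + port.ReadLine());
            }

            // Press L to dump everything collected so far to a file.
            if (Input.GetKeyDown(KeyCode.L))
            {
                File.WriteAllLines("sensorlog.csv", samples.ToArray());
            }
        }

        void OnDestroy()
        {
            if (port.IsOpen)
            {
                port.Close();
            }
        }
    }

    Then open the file in a spreadsheet, rotate or move the device one axis at a time, and compare what the columns do against the sensor's datasheet.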

    In re-reading your code and docs above, I noticed that you state the gyro reads in rad/s, but Unity's Quaternion.Euler() function expects angles in degrees. I don't see any obvious radians-to-degrees conversion there. Perhaps you're getting values in a small -6 to +6 radian range, even if you are twisting the gyro sensor like mad... where you were expecting values like -360 to +360 degrees.
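
    If that turns out to be the problem, the smallest change to your code above (keeping your variable names) would be to convert each integrated delta to degrees before accumulating it, along these lines:

    Code (csharp):
    // GradX/GradY/GradZ are the per-sample deltas in radians (rate * dt);
    // Quaternion.Euler() expects degrees, so convert before accumulating.
    gx += GradX * Mathf.Rad2Deg;
    gy += GradY * Mathf.Rad2Deg;
    gz += GradZ * Mathf.Rad2Deg;

    transform.rotation = Quaternion.Euler(gx, gy, gz);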