
Stock Unity HMD tracking: Y offset different Oculus vs. Vive

Discussion in 'AR/VR (XR) Discussion' started by plmx, Jan 19, 2017.

  1. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    I am using stock/native Unity VR integration. On one PC I have OpenVR/SteamVR with a Vive, on another I have the Rift with the Oculus Runtime. No plug-ins.

    On the Vive, when I place the HMD on the floor, the camera Y offset is zero. On the Rift, when I place the HMD on the floor, it is -1.67f.

     A wild guess would be that on the Rift, y = 0 is actually at eye level, because during Rift setup the user is asked to hold the controller at eye level.

     This, however, creates a major problem for me now: how can I make sure that the camera is at the same height for both HMDs?
    • Question 1: Shouldn't Unity report the same Y for both HMDs if the headset is on the floor (or at any other height in the room)?
     • Question 2: If we have to deal with different Ys ourselves, is there any way to get that 1.67 offset from the floor (I guess it is different for each user) somewhere in stock Unity, so I can compensate for it myself?
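     For reference, a minimal sketch of how one might log the head height Unity reports, assuming the stock InputTracking API (UnityEngine.XR from 2017.1, UnityEngine.VR with VRNode in Unity 5.x); the component name is illustrative only:

     Code (CSharp):
         using UnityEngine;
         using UnityEngine.XR; // UnityEngine.VR (with VRNode) in Unity 5.x

         public class HeadHeightLogger : MonoBehaviour
         {
             void Update()
             {
                 // Head position relative to the tracking origin. With a floor-level
                 // origin this Y is roughly the user's eye height; with an eye-level
                 // origin it hovers around zero.
                 Vector3 head = InputTracking.GetLocalPosition(XRNode.Head);
                 Debug.Log("Head Y relative to tracking origin: " + head.y.ToString("F2"));
             }
         }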
    Thanks!

    Philip
     
    Last edited: Jan 19, 2017
    ROBYER1 likes this.
  2. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    Am I missing something here? Has anyone else run into this issue? Any insights?

    Thanks!
     
  3. PatHightree

    PatHightree

    Joined:
    Aug 18, 2009
    Posts:
    297
    I'm having the same problem.
    My Vive tracking space is floating in the air, while the Rift tracking space is working fine.
    I found that setting VRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale) on the Rift causes its tracking space to float as well.
    Getting both in a consistent state is one step closer to finding a solution, I'd say.
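     A minimal sketch of forcing both runtimes into the same tracking space and checking what you actually got, assuming the 2017.1+ UnityEngine.XR.XRDevice API (the same calls live on UnityEngine.VR.VRDevice in Unity 5.x):

     Code (CSharp):
         using UnityEngine;
         using UnityEngine.XR; // UnityEngine.VR.VRDevice in Unity 5.x

         public class TrackingSpaceCheck : MonoBehaviour
         {
             void Start()
             {
                 // Ask the runtime for a floor-level (room-scale) origin...
                 XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale);
                 // ...and verify which tracking space the device actually ended up in.
                 Debug.Log("Tracking space: " + XRDevice.GetTrackingSpaceType());
             }
         }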
     
  4. Ikik

    Ikik

    Joined:
    Oct 19, 2012
    Posts:
    23
     Did you try setting the tracking origin on the Oculus one? If I recall correctly, it defaults to floor, and changing it to 'eye' makes it match.
     
  5. PatHightree

    PatHightree

    Joined:
    Aug 18, 2009
    Posts:
    297
    I found that when I position the camera at floor height before enabling VR, it works correctly.
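     A minimal sketch of that ordering, assuming the camera sits under a rig object this script is attached to; the device name string and the coroutine are illustrative:

     Code (CSharp):
         using System.Collections;
         using UnityEngine;
         using UnityEngine.XR;

         public class VRStartup : MonoBehaviour
         {
             IEnumerator Start()
             {
                 // Put the rig origin at floor height before VR is enabled, so the
                 // runtime's floor and the scene floor coincide.
                 Vector3 p = transform.position;
                 transform.position = new Vector3(p.x, 0f, p.z);

                 // Load and enable the VR device afterwards.
                 XRSettings.LoadDeviceByName("OpenVR"); // or "Oculus"
                 yield return null;                      // the device activates next frame
                 XRSettings.enabled = true;
             }
         }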
     
  6. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
     Having the same issue here with the Tracked Pose Driver: the Vive HMD ends up quite a bit taller than the Oculus HMD, despite me setting room-scale tracking correctly before testing in the editor. The Oculus headset we are building for is the Oculus Quest, and on that the player height is spot-on; it's only the Vive on the PC that makes our head height so much taller!
     
  7. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
     The original issue I was having was indeed what @PatHightree reported above, and it was resolved by the instructions I received in this thread. You really just need to call SetTrackingSpaceType with TrackingSpaceType.RoomScale in some Awake() method.
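     A minimal sketch of that call, assuming the UnityEngine.XR names used in 2017.1+ (attach it to any object in the first scene):

     Code (CSharp):
         using UnityEngine;
         using UnityEngine.XR;

         public class ForceRoomScale : MonoBehaviour
         {
             void Awake()
             {
                 // Request a floor-level tracking origin; SetTrackingSpaceType returns
                 // false if the runtime cannot provide one (e.g. no floor calibration).
                 if (!XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale))
                     Debug.LogWarning("Room-scale tracking space not available on this device.");
             }
         }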

     If that does not solve your issue, you are looking at a different problem and should create a new thread, mentioning that you have already tried the above.
     
  8. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
     That solved it indeed, thank you!
     
    plmx likes this.
  9. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
     I'm thinking about this, since it's an easy thing to get caught up on when first setting things up.
     I'm not sure of the best approach to solving it in the new Subsystem architecture.

     Maybe a 'startup' option in the new XR Settings, so you can decide at startup time which tracking origin is preferred.
     
    hippocoder and plmx like this.
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
     That would be really useful, I think. Are you able to quickly summarise the ideal VR setup process using the Unity 2019.3 beta and just native Unity VR?

     Currently I am just using the Tracked Pose Drivers; my VR inputs go through the Unity Input Manager in Player Settings (I know this can also be done with Unity XR Input in code, but it doesn't seem necessary to me right now), and any platform-specific support I have downloaded through the Package Manager, without using the brand-new XR Subsystem manager that is in preview.

     We are trying to avoid platform-specific SDKs or plugin imports, for the sake of future-proofing our application.
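     For reference, a minimal sketch of the Tracked Pose Driver setup described above, assuming the XR Legacy Input Helpers package (UnityEngine.SpatialTracking); adding the component in the Inspector is equivalent:

     Code (CSharp):
         using UnityEngine;
         using UnityEngine.SpatialTracking; // TrackedPoseDriver (XR Legacy Input Helpers)

         public class HmdPoseSetup : MonoBehaviour
         {
             void Awake()
             {
                 // Drive the main camera from the HMD's centre-eye pose.
                 var driver = Camera.main.gameObject.AddComponent<TrackedPoseDriver>();
                 driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice,
                                      TrackedPoseDriver.TrackedPose.Center);
                 driver.trackingType = TrackedPoseDriver.TrackingType.RotationAndPosition;
                 driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
             }
         }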
     
  11. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
     I've been on a mission to do that since 2016 (see my various other posts) :D AFAIK, some things are still only possible with the native SDKs (off the top of my head: async scene loading without hiccups visible in the compositor, or at least the ability to set a fixed loading screen in the HMD; retrieving the size of the room-scale play area; being notified when the user switches to the VR store/home screen).

    I have high hopes for the new subsystem but I've just not found the time to look at it in more detail.