
On native Unity VR support: Camera vs. Controllers

Discussion in 'AR/VR (XR) Discussion' started by plmx, Jan 5, 2017.

  1. plmx


    Joined:
    Sep 10, 2015
    Posts:
    308
    While implementing support for motion-controller VR (Vive, Oculus+Touch), I noticed a discrepancy in the implementation that seems rather weird to me and that I'd like to discuss.

    When enabling VR support in the Player settings, ANY camera object is automatically driven by the headset's tracking.
    But there is no similar mechanism for the controllers; i.e. I have to manually add a script to a "controller" object and write some code to achieve tracking.
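    For illustration, the kind of script I mean looks roughly like this (just a sketch, assuming the UnityEngine.VR InputTracking API; a real script would also handle the right hand, tracking loss, and so on):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.VR; // this namespace becomes UnityEngine.XR in later versions

    // Attach this to the left-hand controller object; it copies the tracked
    // device pose into the object's local transform every frame.
    public class LeftControllerTracking : MonoBehaviour
    {
        void Update()
        {
            transform.localPosition = InputTracking.GetLocalPosition(VRNode.LeftHand);
            transform.localRotation = InputTracking.GetLocalRotation(VRNode.LeftHand);
        }
    }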

    I realize that cameras and controllers are different kinds of objects, and I also know that by selecting "Target Eye" > "None (Main Display)" on the camera I can opt it out of head tracking.
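    The scripting equivalent of that Inspector option, as far as I can tell, is Camera.stereoTargetEye (sketch):

    Code (CSharp):
    using UnityEngine;

    // Sketch: opts this camera out of VR rendering/tracking, the same as
    // choosing "Target Eye" > "None (Main Display)" in the Inspector.
    public class DisableVrOnThisCamera : MonoBehaviour
    {
        void Awake()
        {
            GetComponent<Camera>().stereoTargetEye = StereoTargetEyeMask.None;
        }
    }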

    My issue: I feel that having to treat the camera and the tracked controllers differently is wrong from a (software) architectural point of view. Shouldn't "tracking" be handled in the same way in both instances, i.e. either both with a built-in mechanism or both by requiring developers to write scripts?

    Personally, I feel that the controller approach is the "right" way to go: yes, I have to write a script and copy the values myself, but then again I control that script and can include custom logic (fallback behaviour, or whatever).

    But then again, there may be good reasons for having the camera adjusted automatically that I'm simply not aware of.

    Thoughts?
     
    Last edited: Jan 5, 2017
  2. NickAtUnity


    Unity Technologies

    Joined:
    Sep 13, 2016
    Posts:
    84
    I tend to agree. There are two good reasons for having the camera automatically track the head:
    1. For simpler (especially non-game) VR experiences, it reduces the work for the developer. This is easy to see with 360 video or theater-type experiences, where the developer just needs a single camera tracking the user's head. The built-in support requires no work from the developer for this.
    2. It ensures (as best it can) that the in-game view matches the user's head rotation and position. If a game did not do this correctly, users would likely experience some uneasiness. The defaults here help ensure that VR experiences made in Unity feel good to customers.
    That said, we definitely understand that this imposes some limitations and inconsistencies, and we are discussing various ways of resolving that while keeping some of the ease-of-use benefits of our current system.
     
    plmx likes this.
  3. plmx


    Joined:
    Sep 10, 2015
    Posts:
    308
    Thanks for your answer! Those are good reasons. Perhaps a middle way would be a built-in component/script which can be added on demand by developers to trackable objects like the head and controllers.
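    Something like a single generic component with a selectable node, roughly like this sketch (essentially the controller script from my first post, generalized; all names are made up, and for the head this is only hypothetical today, since a camera that renders to the HMD is always driven by it automatically):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.VR; // UnityEngine.XR in later versions

    // Hypothetical "add on demand" tracking component: attach it to any
    // GameObject and choose which tracked node should drive its pose.
    public class TrackedObjectPose : MonoBehaviour
    {
        public VRNode node = VRNode.LeftHand; // Head, LeftHand or RightHand

        void Update()
        {
            transform.localPosition = InputTracking.GetLocalPosition(node);
            transform.localRotation = InputTracking.GetLocalRotation(node);
        }
    }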
     
  4. plmx


    Joined:
    Sep 10, 2015
    Posts:
    308
    I just noticed another discrepancy: when Time.timeScale = 0 (i.e. the game is paused), the camera still keeps following the HMD (i.e. keeps updating). The hand controllers, by contrast, might or might not, depending on which callback you use to implement the tracking.

    Interestingly, the Oculus Utilities update the hands in FixedUpdate(), which isn't called while timeScale is 0, so no hands in paused mode. The OpenVR plugin, by contrast, uses OnPreCull() for this, so the hands keep tracking during pause :)
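    A quick way to see the difference (illustrative sketch; note that OnPreCull() only fires on scripts attached to a Camera, which is how the OpenVR plugin hooks it as far as I can tell):

    Code (CSharp):
    using UnityEngine;

    // Attach to a Camera, set Time.timeScale = 0, and watch the console to
    // see which callbacks keep firing while the game is "paused".
    public class PauseCallbackProbe : MonoBehaviour
    {
        void Update()      { Debug.Log("Update");      } // still called at timeScale 0
        void FixedUpdate() { Debug.Log("FixedUpdate"); } // not called at timeScale 0
        void OnPreCull()   { Debug.Log("OnPreCull");   } // still called, once per render of this camera
    }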
     
    Last edited: Feb 12, 2017
    NeatWolf and DrBlort like this.