Animation and Timestep

Discussion in 'Animation' started by wfoxall, Feb 22, 2017.

  1. wfoxall

     Joined:
     Oct 23, 2015
     Posts:
     7
    Hi there,

    I'm a bit unclear about the connection between animations/animation controllers and the TimeManager settings.
    I am working on a video-based app and had keyframed a bunch of time-based cues using animations. I have since had to change settings in the TimeManager, and all animations are now out of sync. It's like the animation timings have all dilated.
    Maybe I'm mistaken, but I thought that the event timings in animations would be independent of this change? Was I wrong to assume this?

    Old settings:
    Fixed Timestep = 0.01111, Maximum Allowed Timestep = 0.3333333
    New settings:
    Fixed Timestep = 0.01, Maximum Allowed Timestep = 0.01

    Thanks!
     
    Last edited: Feb 22, 2017
  2. wfoxall
    To add to this: I mocked up a test with three cubes that jump up and down according to what each considers a second to be.

    The one on the left uses Time.realtimeSinceStartup as its timebase.
    The middle one uses ctime += Time.fixedDeltaTime
    The one on the right uses an animation with 4 samples a second (8 samples in total. 4 down, 4 up)

    What I'm finding is that when I use the Unity default TimeManager settings:
    Fixed = 0.01, Max = 0.3333, Time Scale = 1.
    They all jump up and down at almost exactly the same time. However, they are ever so slightly faster than a digital metronome.

    When I change the settings to:
    Fixed = 0.01, Max = 0.01, Time Scale = 1.
    The left (realtime) one ticks at nearly 1 jump a second while the other two are slower and they fall out of sync.

    This is a problem because my video playback and audio sources in Unity run in real time. I need a max timestep of 0.0111 to maintain 90fps, but this throws all my animation scripts out.
    Surely all these three timebases should match unless time scale is altered?
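The drift above can be reproduced outside Unity. This is a minimal sketch (Python, not Unity code) of the simplified model I'm describing: each frame, fixed time advances by at most Maximum Allowed Timestep, and anything beyond that is dropped, so fixed-time clocks dilate whenever the real frame delta exceeds the max. The 85 fps figure and frame count are just assumptions for illustration.

```python
def fixed_time_after(frames, frame_delta, max_step):
    """Fixed time accumulated after `frames` frames of `frame_delta` seconds each.

    Simplified model: each frame contributes min(frame_delta, max_step)
    of fixed time; the clamped-off remainder is lost, not carried over.
    """
    fixed_time = 0.0
    for _ in range(frames):
        fixed_time += min(frame_delta, max_step)
    return fixed_time

frame_delta = 1.0 / 85.0   # ~85 fps, as observed with the Rift on
frames = 850               # 10 seconds of real time

# Max = 0.3333: no clamping at 85 fps, so fixed time keeps up (~10.0 s)
print(fixed_time_after(frames, frame_delta, 0.3333))

# Max = 0.01: only 0.01 s of fixed time per ~0.0118 s frame -> ~8.5 s,
# i.e. the fixed-time cubes run at about 85% speed and drift out of sync
print(fixed_time_after(frames, frame_delta, 0.01))
```

That matches what I see: the realtime cube keeps pace with a metronome while the fixed-time cubes fall behind.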
     
  3. wfoxall
    Okay. So after doing more research... I think I was misunderstanding what's happening. Perhaps someone could confirm or correct my interpretation of what I've read across the internet for the benefit of me and others:

    I think Maximum Allowed Timestep basically limits the amount of game time per frame that Unity can advance through physics (and therefore animation) calculations.
    So if I'm demanding a Fixed Timestep and Maximum Allowed Timestep of 0.01 but the frame rate is slower than that due to the other computation it needs to do, the physics calculations have to wait until they have time in the next frame. Therefore anything physics-based gets delayed frame by frame, so it's slowed down and falls out of sync?
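To check my own understanding, here's a rough sketch (Python, my reading of the docs, not actual Unity source) of the loop behind FixedUpdate: each frame the engine clamps the real frame delta to Maximum Allowed Timestep, adds it to an accumulator, and runs as many fixed steps as fit. The clamped-off time is simply lost to fixed time, which is where the dilation comes from. The 85 fps numbers are just my measured case.

```python
def run_frames(frame_deltas, fixed_dt, max_dt):
    """Return (fixed_time, number_of_fixed_updates) after the given frames."""
    accumulator = 0.0
    fixed_time = 0.0
    fixed_updates = 0
    for dt in frame_deltas:
        accumulator += min(dt, max_dt)   # clamp: time beyond max_dt is dropped
        while accumulator >= fixed_dt:   # run the FixedUpdate steps that fit
            fixed_time += fixed_dt
            fixed_updates += 1
            accumulator -= fixed_dt
    return fixed_time, fixed_updates

deltas = [1.0 / 85.0] * 100  # 100 frames at ~85 fps

# Max = 0.3333: nothing is clamped, fixed time tracks real time (~1.17 s)
print(run_frames(deltas, 0.01, 0.3333))

# Max = 0.01: exactly one 0.01 s step per frame -> 1.00 s of fixed time
# for ~1.18 s of real time, i.e. roughly 15% slow
print(run_frames(deltas, 0.01, 0.01))
```

If that model is right, the slowdown factor is just (frame rate demanded by Fixed Timestep) vs (actual frame rate): 85/100, which matches the drift I measured.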

    So the problem for me is actually that my frame rate turns out to be slower than the Unity profiler is reporting. It's more like 85 than 100.
    This is a problem because it's a Rift game, which needs to run at 90fps. Weirdly, this frame rate always pings to about 85fps when I put the Rift on in Unity, even with a blank scene. So that's now what I need to figure out.