I'm having trouble definitively determining whether or not I should always multiply my relevant values by deltaTime. I've heard so many conflicting opinions on this. The way I understand it, deltaTime is the amount of time it took to render the last frame. So, for the sake of argument, if your game is running at 60fps (a frame rendered every ~16.7ms), everything's peachy, but if your game drops to, say, 30fps (a frame rendered every ~33.3ms), then in order to maintain the same sense of time scale, multiplying by deltaTime pushes your values to where they would normally be under a fixed time scale. Kind of like travelling by car at a steady speed: you should be able to calculate exactly where the car will be after a certain amount of time has passed. If this is correct, then using deltaTime simply creates the effect of frame-skipping at lower framerates, whereas not using deltaTime has the effect of slowing down your game at lower framerates. In some games frame-skipping is preferred, whereas in other games slowdown is the preferred effect at lower framerates. If I prefer slowdown, why would I want to use deltaTime? I've heard some people suggest you should ALWAYS multiply by deltaTime, but I can't wrap my head around why. Also, if I'm understanding this right, it could be a problem at faster framerates too: if the game runs over the expected 60fps, it will run faster. How do I limit the game's framerate to <=60?
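For anyone landing on this thread later, the difference described above comes down to a single multiplication. A minimal sketch, assuming a plain MonoBehaviour moving an object (the class name and speed value are just for illustration):

```csharp
using UnityEngine;

public class Mover : MonoBehaviour
{
    public float speed = 5f; // units per second

    void Update()
    {
        // Frame-rate independent: covers 5 units per real second,
        // no matter how many frames are rendered in that second.
        transform.Translate(Vector3.right * speed * Time.deltaTime);

        // Frame-rate dependent alternative (commented out): moves
        // 5 units *per frame*, so the object travels twice as far
        // per second at 60fps as it does at 30fps.
        // transform.Translate(Vector3.right * speed);
    }
}
```

With the deltaTime version, a dropped framerate shows up as bigger jumps per frame (the "frame-skipping" effect); with the commented-out version, it shows up as everything moving slower.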
You got it right. You should always use deltaTime, unless you have a specific reason not to. To be honest, I really can't imagine why one would prefer slowdown. A smoother look on slower machines at the cost of the intended gameplay? You can set a target frame rate: http://docs.unity3d.com/ScriptReference/Application-targetFrameRate.html
Awesome. Thanks for your answer. As for why have slowdown... this game is probably the best example I could imagine: Go to the 4 minute mark for an example:
You probably want more sophisticated control than just dropping Time.deltaTime. Frames are never exactly the same length, so dropping it will add all sorts of little inconsistencies.
OK, and just like that I'm back to hearing two conflicting views. BoredMormon: this is what I often hear, some allusion to a vague problem that may occur if I don't use deltaTime, but what I'm trying to do is wrap my head around the precise problem that could occur. Even one simple example would probably set me on the right path to understanding this issue better. I even once heard someone say "make your own deltaTime". Oh... OK... what?! Any guidance would be greatly appreciated, as I feel I'm treading water on this issue or going in circles.
Honestly, I would do a prototype. What you seem to be looking for is an effect different from the normal perfectly smooth gameplay experience. You could wrap deltaTime and then easily turn it on and off and see if there is an effect on gameplay. Another option to explore is smoothDeltaTime.
You could always use FixedUpdate() if you want your gameplay logic to operate at a specific rate. It may run more than once per frame if the computer can't keep up, but your game is more likely to be deterministic.
ROFL. All I have to say is WTF, and thank god there are other people to pass games like that!

My understanding works on the idea that there are two "threads" running your game. They may not actually be implemented as threads, but I think it's handy to think of them this way. One thread polls game state and draws frames. The other runs the physics engine and changes the game state.

The physics thread is given a higher priority, so it gets as much CPU time as necessary to complete its work. The framework attempts to keep this work at fixed intervals, so every time FixedUpdate is called, the same amount of real-life time is supposed to have passed. However, if things are not going well, the amount of real-life time may actually be much more. Regardless, the game engine pretends that the fixed amount of time has passed and continues on. This is why Time.deltaTime always reports the same value inside FixedUpdate methods.

The drawing thread, meanwhile, gets the leftover CPU cycles. When the physics engine needs to ramp up, it starts taking time away from the drawing thread, resulting in lost frames. When the physics engine is idle, there is more time for the drawing thread, so you might end up with multiple frames where you were expecting only one. This is why Time.deltaTime is important: it tells you how much time has passed since the last frame. Maybe it was exactly 1/60 second as expected, or maybe 1/30 second because the CPU is taxed... or maybe it is 1/120 second because there is so little happening that the game can crank up the frame rate. In practice, I think Unity tries really hard to keep your preferred frame rate, but sometimes the nature of computer systems gets in the way and, even in optimal situations, you end up with something wacky like 1/60.012345.

The rule, then, depends on where you are coding. Are you in a function that can get called by the drawing thread or the physics thread?
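You can watch the two loops described above behave differently with a tiny probe script (a sketch of my own, the class name is just illustrative):

```csharp
using UnityEngine;

public class DeltaTimeProbe : MonoBehaviour
{
    void Update()
    {
        // Varies frame to frame: the real time since the previous
        // rendered frame. Expect jitter, spikes, and drift here.
        Debug.Log("Update dt: " + Time.deltaTime);
    }

    void FixedUpdate()
    {
        // Inside FixedUpdate, Time.deltaTime returns the fixed
        // timestep (0.02s by default) every single call, regardless
        // of how the real framerate is fluctuating.
        Debug.Log("FixedUpdate dt: " + Time.deltaTime);
    }
}
```

Dropping this on any object and watching the console makes the "two threads" mental model above concrete.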
For the example video you showed, I think you are going to run into trouble if you ignore deltaTime in the drawing thread functions because the physics thread has no concept of skipping time like this. If you really want to get the behavior in the video, you might have to play around with dynamically changing the game speed or rolling your own physics engine.
It's really quite simple: Update runs every frame, so the code inside is processed once per frame (including the little, subtle movements you make). When the framerate of your game drops (and it will, eventually), those movements become fewer per second too, so the actual measurable movement of objects in your game will slow down. If it's just a difference of 5fps, it's not noticeable, but when it's more than that?

deltaTime is literally "the time it took for the LAST frame to be processed", so it's a kind of slightly delayed reaction. By multiplying by deltaTime, you make the stuff that happens in your Update (movements mostly, but other things too) happen at a consistent rate in real time, so that during a lag spike your character (or whatever) actually moves a bit further, or does a bit more work, per frame. This helps make things smoother, but more importantly it makes them PREDICTABLE, even in unpredictable environments.

FixedUpdate() basically has something like deltaTime already applied to it as a whole (automatically): the updates occur at completely predictable intervals, so your calculations don't require multiplying by deltaTime anymore. That's why it's highly suggested that you do any physics calculations in FixedUpdate, so that "forgetting to add it" is no longer a problem. There's no need to make it more complicated than that, IMO.
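A minimal sketch of that last piece of advice, assuming a Rigidbody2D-based object (the class name and thrust value are just illustrative):

```csharp
using UnityEngine;

public class Thruster : MonoBehaviour
{
    public float thrust = 10f;
    Rigidbody2D rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody2D>();
    }

    void FixedUpdate()
    {
        // No Time.deltaTime multiplication needed here: FixedUpdate
        // runs on a fixed timestep, so the force is already applied
        // at a consistent rate in game time.
        rb.AddForce(Vector2.up * thrust);
    }
}
```

Putting the same AddForce call in Update instead would make the applied force depend on the framerate, which is exactly the "forgetting to add it" problem mentioned above.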
Lots of interesting stuff, thanks everyone. I set up a test environment and simulated a lower framerate by sticking this code in Awake():

void Awake() {
    Application.targetFrameRate = 10;
    QualitySettings.vSyncCount = 0;
}

And everything seems to be fine (I get the same kind of slowdown you see in the video above) so long as there is no code using the physics engine. Everything is in Update() and it's all working deterministically. However, once I introduced controls based on Rigidbody2D (physics engine) and handled them via FixedUpdate(), the results were off: stuttery and laggy. Sort of the worst of all possible worlds, where you get frame-skipping ugliness and laggy inputs. So I tried to fix this by putting the physics stuff in Update() instead of FixedUpdate(), and that just messed a lot of stuff up. So now I'm wondering if I should do what Eisenpony suggested and come up with my own physics simulation to avoid FixedUpdate() altogether. Using Unity's built-in physics is just too nice to give up on, however, so I may have to compromise elsewhere.

BoredMormon: by "wrap deltaTime", do you mean write a static class that refers to deltaTime and use that reference instead of Time.deltaTime? Like:

static class MyTime {
    static float deltaTime = Time.deltaTime;
}

And then simply use MyTime.deltaTime in the code instead? And then set it to 0 to turn it off for testing?
Worth noting that you could get the effect you want in any update method you wish (multiplying by Time.deltaTime is required in a normal Update) by simply using Time.timeScale and making certain things immune to the change by multiplying them by Time.unscaledDeltaTime instead. You could also make a wrapper that switches between deltaTime and unscaledDeltaTime as needed based on the context. Alternatively, you can adjust the frequency with which FixedUpdate occurs and use that in some way. *shrugs*
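The wrapper idea could look something like this minimal sketch (the class and method names are my own invention, not a Unity API):

```csharp
using UnityEngine;

// Hypothetical wrapper: each caller chooses whether it follows
// Time.timeScale slowdown or stays locked to real time.
public static class GameTime
{
    public static float Delta(bool immuneToSlowdown)
    {
        // Time.deltaTime shrinks when Time.timeScale < 1;
        // Time.unscaledDeltaTime always reflects real elapsed time.
        return immuneToSlowdown ? Time.unscaledDeltaTime : Time.deltaTime;
    }
}
```

An enemy affected by the slowdown would move by `speed * GameTime.Delta(false)`, while the player's ship or the music fade would pass `true` and keep running at real-time speed.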
"Rolling your own" would be my last resort. I think if you look at this page and pay careful attention to Maximum Allowed Timestep and Time Scale sections, you may find what you need. EDIT: I just noticed this bit on the same page. To me this says use Time.deltaTime everywhere you want affected by Time.timeScale, even in FixedUpdate.
Did you adjust your fixed timestep (Time.fixedDeltaTime) to 1/10 of a second to match your target Update framerate? If your FixedUpdate loop is running at the default 50 FPS and you've capped your Update to 10 FPS, of course the game is going to feel horrible.
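Concretely, a test setup like the one earlier in the thread would also need the fixed timestep lowered to match. A sketch, assuming the same 10 FPS cap:

```csharp
using UnityEngine;

public class FrameRateTest : MonoBehaviour
{
    void Awake()
    {
        // targetFrameRate is ignored while vSync is on, so disable it.
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 10; // cap rendering at 10 FPS

        // Default is 0.02 (50 physics steps/sec). Matching it to the
        // render cap keeps physics from stepping 5x per rendered frame.
        Time.fixedDeltaTime = 1f / 10f;
    }
}
```

Without the last line, each rendered frame bundles several physics steps, which is one plausible source of the stuttery, laggy feel described above.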
Right, I think you actually want to do the opposite: keep your frame rate high and reduce the speed of the game engine. You should be able to do this with Time.timeScale, so long as you use Time.deltaTime wherever you need the slowdown.
Didn't know about unscaledDeltaTime, so I've been dividing deltaTime by timeScale when needed. Good to know. But what would need to be immune in this case, since the point is to simulate slowing down? Shouldn't it just use deltaTime everywhere and do the slowdown:

if (fps < targetFps) timeScale = fps / targetFps;

@SkillBased, if you are making the kind of game in that video, or any fast-paced game, you do realise it makes the game easier for players with slow machines whenever the fps drops under the fixed fps. Can't you make use of quality settings for slow machines? I think you should let people with slower machines see some jerky movement, and let people with better machines enjoy the benefits that come with higher frame rates.
Nope, the music and sound effects, for instance, will need to be immune. Also the player itself, usually; iirc in that video the slowdown happened to everything except the player's ship, the music, and the background. There are usually a lot more things that need to be immune than it seems.

Back to the topic, though: I disagree with the premise that FPS-based speed is advantageous to the player (though I'm sure an exception exists somewhere, even if I can't think of what that exception might be right now). In that video in particular, and in all bullet-hell games in general, FPS-based speed would kill you constantly and repeatedly until you finally turned to computer-defenestration for stress relief. It would be unpredictable from one second to the next, and though you might think it makes gameplay smoother, the difference between a section of gameplay taking 1 second and 1.5 seconds, when you're doing it over and over to get the timing down, is disastrous. One may think that the extra half second gives players an advantage of some kind over momentarily jerky movements, but it really doesn't in that case. If someone has bad fps, they'll likely have bad fps in a very unpredictable way (momentary lags, not a constant and smooth 15fps). As there will be no consistency in the lag, the only consistency you can hope for is game time versus real time: making the timings in general predictable, and you can't do that by basing the timings on FPS.
I've had to reconsider this, along with a great many things, since I first posted. Thankfully I'm still in prototype, so ironing out the issues discussed shouldn't be a disaster. I don't think it's quite as bad as you say, though. Most of these games aren't about knowing exactly where/when things occur but about reacting to where you currently are in relation to what's happening on the screen (most bullet patterns are aimed at where the player is, so it's different on every playthrough). However, uncontrolled slowdown would be too unpredictable, and it actually benefits players on slower machines or those using lag tactics. That could be bad if I ever do some sort of leaderboard, etc. In the end, I don't even intend for uncontrolled slowdown like you see in the video above, but rather controlled slowdown (Time.timeScale) on command. So yeah, I guess I'm favoring deltaTime again...