
Game Speed impacted by CPU frequency multiplier setting

Discussion in 'Scripting' started by Lost-in-the-Garden, Apr 27, 2017.

  1. Lost-in-the-Garden

    Lost-in-the-Garden

    Joined:
    Nov 18, 2015
    Posts:
    176
    Hi everyone,

    I have something odd here... We are currently developing for consoles, and to be a bit closer to the target spec, I reduced the clock speed of my AMD cpu via the bias frequency multiplier setting. Oddly enough, this also reduced the game speed. Not in the sense of "the frame rate drops and it runs slow" but as in "an ingame second now takes 2 seconds instead of one".

    I tried it with a standalone build, and the same build ran at different speeds depending on the CPU settings. We do use our own custom timer class, but it is just a wrapper on top of Unity's Time class. Increasing the game speed via Time.timeScale brought the game back up to speed.

    On my colleague's overclocked computer (Intel CPU) this effect was not observed though.

    Has anyone experienced something like this before? It seems like one of the system clocks Unity uses is affected by the change...
     
  2. lordofduct

    lordofduct

    Joined:
    Oct 3, 2011
    Posts:
    8,528
    Hrmm... never seen this myself.

    If you can create a simple project that demonstrates this, I'd love to check it out.

    Might be worth issuing a bug report to Unity over.
     
  3. StarManta

    StarManta

    Joined:
    Oct 23, 2006
    Posts:
    8,775
    I haven't tried running Unity on a system like this, but you should be able to easily work around and counteract it by setting Time.timeScale to the inverse of your system clock multiplier.
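
    A minimal sketch of that workaround, assuming the effective multiplier is known and entered by hand — the `clockMultiplier` field here is hypothetical, since no Unity API exposes the CPU clock setting:

    ```csharp
    using UnityEngine;

    public class ClockCompensation : MonoBehaviour
    {
        // Hypothetical: the effective clock multiplier relative to stock,
        // e.g. 0.5 if the CPU was underclocked to half speed. This value
        // is not exposed by any API and would have to be set manually.
        public float clockMultiplier = 0.5f;

        void Awake()
        {
            // If the system clock runs at half speed, run game time at
            // 2x to cancel it out.
            Time.timeScale = 1f / clockMultiplier;
        }
    }
    ```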
     
  4. Lost-in-the-Garden

    Lost-in-the-Garden

    Joined:
    Nov 18, 2015
    Posts:
    176
    I am not sure the clock multiplier is exposed somewhere in the system, so trying to compensate for that might not be a reliable solution for production.

    Another thing is that it also seems to affect the editor. According to the profiler, the game ran at about 100fps which is even faster than running it at the stock clock speeds.
     
  5. WarmedxMints

    WarmedxMints

    Joined:
    Feb 6, 2017
    Posts:
    1,035
    Did you just adjust the multiplier, or did you also adjust the base clock?

    You could try running this in an elevated command prompt:
    Code (csharp):
    bcdedit /set {current} useplatformclock Yes
     
    Lost-in-the-Garden likes this.
  6. Lost-in-the-Garden

    Lost-in-the-Garden

    Joined:
    Nov 18, 2015
    Posts:
    176
    No idea; I just went into the BIOS and changed the frequency multiplier.

    It's not so much a problem for development here, I am just concerned about players cheating in the game that way. It's a racing game, so I can easily do faster lap times if everything is running at half speed. Is this a common exploit in games, and how could I defend against it?
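
    One hedged idea for detecting this kind of speed hack is to cross-check two independent clock sources and flag drift. A rough sketch — the 5% threshold and 10-second warm-up are arbitrary assumptions, and DateTime.UtcNow can itself be tampered with, so this is detection-in-depth at best:

    ```csharp
    using System;
    using UnityEngine;

    public class ClockSkewCheck : MonoBehaviour
    {
        float unityStart;
        DateTime wallStart;

        void Start()
        {
            unityStart = Time.realtimeSinceStartup;
            wallStart = DateTime.UtcNow;
        }

        void Update()
        {
            // Compare engine time against wall-clock time. If the timer
            // Unity relies on has been slowed down, the two drift apart.
            float unityElapsed = Time.realtimeSinceStartup - unityStart;
            float wallElapsed = (float)(DateTime.UtcNow - wallStart).TotalSeconds;

            if (wallElapsed > 10f &&
                Mathf.Abs(unityElapsed - wallElapsed) > wallElapsed * 0.05f)
            {
                Debug.LogWarning("Clock skew detected - possible speed hack.");
            }
        }
    }
    ```

    For lap times specifically, validating results on a server (or at least sanity-checking them against physically possible times) is probably more robust than any client-side clock check.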
     
  7. WarmedxMints

    WarmedxMints

    Joined:
    Feb 6, 2017
    Posts:
    1,035
    What OS are you testing on?

    I seem to recall this being a bug in Windows 8. Since it was designed as a tablet OS, it uses the base clock of the processor as the real-time clock (RTC), so when you install it on a desktop it doesn't use the integrated RTC on the motherboard. If you then overclock (or underclock) via the base clock, the time in Windows gets thrown out of sync.
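
    For what it's worth, you can at least see which kind of timer backs .NET's high-resolution clock on a given machine; a quick standalone sketch:

    ```csharp
    using System;
    using System.Diagnostics;

    class TimerInfo
    {
        static void Main()
        {
            // On Windows, IsHighResolution == true means Stopwatch is
            // backed by QueryPerformanceCounter; which hardware source
            // QPC uses (TSC, HPET, ACPI PM timer) depends on the
            // useplatformclock setting and the machine.
            Console.WriteLine("High resolution: " + Stopwatch.IsHighResolution);
            Console.WriteLine("Frequency: " + Stopwatch.Frequency + " ticks/sec");
        }
    }
    ```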
     
    Lost-in-the-Garden likes this.
  8. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Honestly?

    Sounds like you missed multiplying by Time.deltaTime :p Have you tried the game on other people's *slower* computers, such as an old laptop or something without BIOS changes?

    If it's definitely not that, it's worth filing a bug IMHO, unless what you are doing is really odd. Also, underclocking a computer to get it near console performance almost never works, because half of the problem comes from OS overhead, different memory timings and drivers; you are best off just testing on the dev kit.

    Finally, have you tried using C# and setting up your own system frame timing? You said you abstracted the timer class, so it might be whatever Unity is doing. I'd file a bug report to get some feedback at least...
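
    A sketch of what "your own frame timing" might look like, using System.Diagnostics.Stopwatch as an independent clock — though note that Stopwatch is typically backed by the same high-resolution counter, so it may drift in exactly the same way:

    ```csharp
    using System.Diagnostics;
    using UnityEngine;

    public class OwnFrameTimer : MonoBehaviour
    {
        readonly Stopwatch watch = new Stopwatch();

        // Elapsed seconds for the last frame, measured outside Unity's
        // Time class.
        public float OwnDeltaTime { get; private set; }

        void Update()
        {
            if (watch.IsRunning)
            {
                OwnDeltaTime = (float)watch.Elapsed.TotalSeconds;
                // Compare against Unity's measurement to see if they diverge.
                Debug.Log($"own: {OwnDeltaTime:F4}s  unity: {Time.unscaledDeltaTime:F4}s");
            }
            watch.Restart();
        }
    }
    ```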
     
  9. Lost-in-the-Garden

    Lost-in-the-Garden

    Joined:
    Nov 18, 2015
    Posts:
    176
    @hippocoder this is what we are working on right now


    ...so yes, we know how to properly use timers for our game code. Thanks for the effort, but the comment was not very helpful.

    Following the useplatformclock hint, it seems that this is in fact a problem/exploit in many games. The larger ones defend against it with dedicated cheat-detection software. Does anyone have experience with that at an indie level?
     
  10. WarmedxMints

    WarmedxMints

    Joined:
    Feb 6, 2017
    Posts:
    1,035
    It is an old and well-known bug/exploit. As it comes down to the timer that Windows uses, I believe you would get the same issues if you used .NET's timer APIs.

    Looks like a fun game though; I will have to check it out on my consoles when it is released. Looks like it might work quite well in VR too. Would be fun to play it on my Vive.
     
    Lost-in-the-Garden likes this.