
Is unity FPS count wrong or am i missing something?

Discussion in 'Editor & General Support' started by ronan-thibaudau, Sep 5, 2012.

Thread Status:
Not open for further replies.
  1. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    I posted this first in the Volume Grass asset thread, where I noticed it, but in an empty project, simply counting seconds and frames doesn't match the FPS that Unity displays. Which one is right?
    On one hand:
    - The Unity docs say Update is called once per frame, so I can't see how my frame count could be wrong (it's simply incremented in Update).
    - The time displayed matches the time as I see it pass.
    On the other hand:
    - I'd think that if Unity's FPS counter in the editor were wrong (and wrong by a 20x factor), it would have been noticed earlier!
    Can someone shed some light on this for me? Empty scene: Unity indicates > 2000 FPS, but I've let it run for 250 seconds and it's only at 5600 frames, so the FPS should be around 22.5. Obviously my setup can't be getting 22.5 fps on an empty scene, but then is Update not actually called each frame, as the docs say? If so, this severely misrepresents many assets whose FPS counters tag them as "slow".
     
    littledwarf and Bryarey like this.
  2. Eric5h5

    Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    Update is called once every frame. Use this to measure frames per second. An empty scene will probably run at 60 fps (depending on your monitor), since vsync is enabled by default.

    --Eric
     
  3. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    Thanks Eric, but I understand this perfectly, as I point out in my post. Yet I do get wildly different results from Unity (telling me 2K fps, which seems reasonable on an empty scene without vsync) and from checking the update count vs. time elapsed (giving me 22.5 fps, and surely my monitor doesn't have a 22.5 Hz refresh rate). So one way or the other something is going weird: either Update isn't being called once per frame, which seems really weird, or it is being called once per frame AND I get 22.5 fps on an empty scene AND Unity's built-in FPS display shows the wrong FPS.
    Edit: here's a sample of the issue:

    Edit 2: the FPS counter provided with the demo shows 30 fps, Unity's built-in shows 700 fps, and my own counter, incremented on each Update, shows 13400 frames over 2800 seconds.
    Edit 3: an even simpler example below. As you can see, the code is as trivial as can be, and Unity tells me I push 2K FPS, but my counter says 1700 frames over 10 secs (170 fps instead of 2000!).


    Code (csharp):
    using UnityEngine;

    public class FPSDisplay : MonoBehaviour {

        public static int frames = 0;

        // Update is called once per frame
        void Update () {
            frames++;
        }

        void OnGUI () {
            GUI.Label(new Rect(200, 100, 300, 50), "started since: " + Time.realtimeSinceStartup);
            GUI.Label(new Rect(200, 150, 300, 50), "frames: " + frames);
        }
    }
     
    Last edited: Sep 5, 2012
  4. Alastair Callum

    Alastair Callum

    Joined:
    Sep 3, 2012
    Posts:
    71
    You don't want to rely upon the frame rate to manage any mechanics in your game. You should use elapsedTime and Time.

    The number of frames per second is an average of successful calls over a period of time. Your counted frames and the displayed frame rate won't match exactly because the frame rate is not constant.

    Some frames get dropped and others take longer to render.

    Eg.

    call frame 1
    - frameCount = 1
    call frame 2 (frame dropped)
    - frameCount = 1
    call frame 3
    - frameCount = 2

    Although you are keeping track of all completed frames, the rate at which they are being called is never constant. Even when you see it sitting at a constant 60.0, remember that the update that tells Unity to print the frame rate is itself running at a different rate from the pure rate of frames per second.
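    The arithmetic behind that variance can be sketched like this (a plain-Python illustration, not Unity code; the frame times are invented for the example):

```python
# Illustration of the point above: one slow frame among fast ones makes the
# instantaneous FPS and the windowed average disagree. Frame times invented.
frame_times = [1/60, 1/60, 1/15, 1/60, 1/60]  # seconds per frame

instantaneous = [1.0 / dt for dt in frame_times]    # per-frame readings
windowed_avg = len(frame_times) / sum(frame_times)  # frames / elapsed time

print(round(instantaneous[2]))  # 15 -- the slow frame alone
print(round(windowed_avg, 1))   # 37.5 -- the average over the window
```

    Depending on whether a counter reports the last frame or an average over a window, the two readings can differ a lot even on the same run.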
     
  5. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    I'm not trying to use this for game mechanics but to measure performance. So I'm still confused: which one represents "Unity, on this scene, on my system, can push X frames like this one per second", so I can figure out which should represent a 60ish FPS goal?
     
    Bryarey likes this.
  6. Eric5h5

    Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    The script on the wiki is the "true" frames per second; the stats in the editor are not.

    --Eric
     
    Bryarey likes this.
  7. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    Is this a bug? If not, what do the stats in the editor represent, if not the "true" FPS?
     
    OfficialHermie likes this.
  8. Alastair Callum

    Alastair Callum

    Joined:
    Sep 3, 2012
    Posts:
    71
    You'd be better off learning how to use the Profiler to measure performance.
     
  9. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    But all of this still reads as "off topic" to me. I know how to use the built-in profiler, and regular profilers even better, having been a full-time C# dev (outside the game industry) for years now. But I'm still confused about why I can get two different "fps" results unless there is a bug. Which one is right doesn't really answer my question, and neither do alternatives. Regardless of any alternatives, I'm still interested in knowing why one FPS shows a constant (it's not varying; I didn't hop on a nice lucky frame) 1700-2200 FPS while the other shows a constant 60ish FPS, both displayed at the same time. Either there is a bug, or one of the "FPS" values isn't FPS but another measure, and I'd like to know what it actually represents.

    There are 2 FPS numbers, both recommended by Unity (one built in, one based on basic functionality), and they display extremely different results. Why does this happen?
     
  10. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Because the editor measures it with vsync on and vsync in general isn't that stable in the editor?
     
  11. manutoo

    manutoo

    Joined:
    Jul 13, 2010
    Posts:
    522
    I'm with ronan.thibaudau on this one. The FPS reported by the Statistics window can be _wrong_.

    Here's my case.

    I have 2 parts in my scene: one requiring 6 million polys to be rendered, the other 15 million polys.
    I put them far apart from each other, so I can see one without interference from the other. I can check it's working correctly with the poly count reported in the Stats.
    When I go to the more complex part, I can visually see the FPS drop with my own eyes, because it's jerky; and my FPS counter (taken from the wiki) confirms that.
    But the Statistics report a higher FPS when viewing the complex part!

    Hard Numbers :
    On "low" Part :
    - Stats : 700 fps
    - Real : 74 fps
    On "high" Part :
    - Stats : 720 fps
    - Real : 47 fps
    On empty part (ie: nothing on screen)
    - Stats : 2000fps
    - Real : 280 fps

    The "low" part contains twice as many objects as the "high" part, but requires far fewer polys to be rendered because it uses an alternate low-poly mesh for shadow casting.

    Moreover, if I display the Profiler window, then the FPS reported by the Stats screen becomes correct (ie: quite close to the real one, about 20 or 30% above, instead of ~1000%).

    If the real FPS gets lower while the Stats report it got slightly higher, when the other involved processes didn't change, then for me it means it's bugged.

    My guess:
    the Stats report the time it took to queue the rendering operations, but not the actual rendering time.

    Thus the reported FPS number is meaningless (at least in this case; it may depend on the way the GPU works).

    I just tested this with Unity 3.5.6. (It wasn't working with 3.5.3 either.)
    My GPU: Radeon HD 5850.
     
    Last edited: Oct 6, 2012
  12. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    Aye, that's the feeling I get: if Update is really being called once per frame, then Unity's profiler version simply cannot be right.
    Vsync is not the issue: not only is it disabled for me, but I get scenes with 300+ FPS in both, and the numbers still vary a lot.
    Also, both the GUI version and the profiler version are "stable", so it's not an issue with smoothing over time.
    I'd really like an answer from Unity staff on this, as it's a pretty big issue IMHO.
     
    OfficialHermie likes this.
  13. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    Hmm I'm wondering if there's an actual answer to ronan's question too. All the answers that translate to "Shut up and use something else" are dodging the actual question. Is what you are all saying actually that the Unity "Statistics" window is completely broken and useless? If so, that really should be pointed out with big red warning signs in the documentation, because I've been relying on it being at least somewhat accurate.

    Also, if the FPS and timing parts of it are broken, does that mean the Draw Calls / Tris / etc are all broken as well? Or does no one know if any of it is actually working?
     
    Last edited: Oct 7, 2012
    buttmatrix, gamedev42 and Khena_B like this.
  14. manutoo

    manutoo

    Joined:
    Jul 13, 2010
    Posts:
    522
    makeshiftwings,
    I think all the numbers in the Stats screen are correct.

    But what is misleading is that the thread rendering times are the CPU time used for rendering (including the wait for VSync if it's turned on), while the real bottleneck is the GPU rendering, so that is the time that should be reported.

    As it is, the reported FPS is mostly useless and very misleading.

    Note: for people going into the details of optimizing their game, knowing the CPU rendering time is interesting as well, but it shouldn't be put front and center. I spent months thinking I was seeing the actual FPS of the GPU rendering; fortunately, I didn't take any development/optimization decisions from it.
     
  15. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    But the FPS shouldn't be different unless it's doing some sort of nonsensical calculation that has nothing to do with frames or seconds. "Frames Per Second" means the number of frames you get in one second, not the number of VBO transfers from CPU to GPU or anything weird like that. And like the wiki shows, it's pretty simple to calculate; you just take the time since your last tick and average out across a second or so; you don't have to know anything about what the CPU or GPU are doing.
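    The calculation described here (accumulate ticks for roughly a second, then divide) can be sketched in a few lines. This is a plain-Python illustration with invented tick deltas, not the actual wiki script:

```python
# Sketch of the averaging described above: accumulate frame deltas for about
# a second, then report frames / elapsed. The deltas are invented.
deltas = [0.016, 0.017, 0.016, 0.018] * 15  # ~1 second of frame times

elapsed = 0.0
frames = 0
for dt in deltas:
    elapsed += dt
    frames += 1

fps = frames / elapsed  # roughly 60 for these deltas
```

    Nothing here needs to know what the CPU or GPU are doing; it only observes when frames actually complete.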
     
  16. manutoo

    manutoo

    Joined:
    Jul 13, 2010
    Posts:
    522
    makeshiftwings,
    I agree with that.

    I guess the Unity devs wanted to provide the FPS produced by the rendering alone, apart from any other game part (ie: user scripts, physics, etc...), but since the timing doesn't seem to take the actual GPU render time into account, it doesn't work... Maybe it worked in the past, but not anymore due to structural changes (multi-threading comes to mind).

    We'd need a Unity Dev to confirm or invalidate that.
     
    OfficialHermie likes this.
  17. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    No, it's not that; I get the same thing when practically zero CPU time is used (CPU time near 0 while GPU time is around 5).
     
  18. Dreamora

    Dreamora

    Joined:
    Apr 5, 2008
    Posts:
    26,601
    Frames per second is a simple thing, defined as 1.0 / max(frametimeCPU, frametimeGPU).

    This uses the real frame time, not the 'vsync capped' one, so if the call to render the frame takes 1 ms, you will get 1000 FPS even if your settings enforce vsync and limit it to 60 or whatever, because the renderer technically runs at 1000 fps.

    That's the relevant stat for developers.

    For gamers playing the game, the 'ingame fps counter' is relevant, as it shows the physical FPS they are really getting; but to you as a developer that one is pretty much useless, as it 'washes out' any information you could use to draw conclusions and identify / fix problems, since it's never going to be accurate.


    At best you use Unity Pro's profiler anyway, as it gives you the frame time etc. in detail, not some strange value based on it and other things.
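    Under that definition, the gap between the reported number and a vsync-capped counter is simple arithmetic. A plain-Python sketch with made-up frame times (not Unity's actual code):

```python
# Reported FPS per the definition above: 1 / max(CPU frame time, GPU frame time),
# ignoring any vsync cap. An effective in-game counter additionally hits the
# monitor's refresh interval. All numbers are invented.
cpu_frame_time = 0.001     # 1 ms of CPU work per frame
gpu_frame_time = 0.0008    # 0.8 ms of GPU work per frame
vsync_interval = 1.0 / 60  # 60 Hz monitor with vsync enabled

reported_fps = 1.0 / max(cpu_frame_time, gpu_frame_time)                   # 1000
effective_fps = 1.0 / max(cpu_frame_time, gpu_frame_time, vsync_interval)  # 60
```

    Both numbers describe the same frame; they just answer different questions (how fast the work is vs. how often a frame reaches the screen).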
     
    Last edited: Oct 7, 2012
  19. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    The issue is unrelated to vsync; we are talking about tests where neither counter is using vsync (with both largely over 60, and I have examples with both over 200!).
    One thing is 100% sure and shouldn't be raised again in this thread: VSYNC has nothing to do with the issue.
    And yes, frames per second is a simple thing, which is why we're so surprised to see 2 such different numbers!
     
  20. jerotas

    jerotas

    Joined:
    Sep 4, 2011
    Posts:
    5,572
    What's the target platform of your game? I ask because if it's iOS or Android, and you deploy to a device using the built-in Unity GUI controls like that, you won't get reliable numbers on the device, especially on older devices. I did extensive tests that way on my iPhone 3GS, and the Profiler showed me that a single GUI Text was actually cutting my FPS in half. A scene with absolutely nothing but the GUI Text could not reach 25FPS! Unbelievable but I assure you it's true. If you want to display it that way on a mobile, use NGUI or other such mobile-optimized package.

    I agree with the poster above, use the Profiler for a better idea of FPS and what your bottlenecks are.
     
  21. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    The target is PC standalone, and the issue happens just the same inside "play" mode, where both the built-in and the separate FPS are displayed at the same time. We're not looking for a way to profile; as I said, I know how to do that. The question is NOT "how can I work around this", it IS "what the hell is happening here". No matter how you look at it, FPS is a single metric, and two different measures of the same information at the same time, with an order of magnitude or more of difference, definitely deserve an explanation.
     
  22. Dreamora

    Dreamora

    Joined:
    Apr 5, 2008
    Posts:
    26,601
    Play mode is absolutely unreliable for any kind of performance measurement unless you close the scene view and maximize the game view, because otherwise you effectively render the scene twice, which heavily impacts the framerate.
    Your in-play-mode frame counter in this case is totally wrong because it does not take the editor instance into account, while the stats one does.
    You should only measure performance in standalone builds, never inside the editor.

    This is a change / enhancement / fix / however we want to call it that was introduced with Unity 3.5.0.
     
  23. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    I will echo ronan and manutoo yet again and ask you to please look at the numbers they are getting. It's showing a HIGHER fps in the stats window while running in play mode than it shows when actually counting the frames by Update calls. It doesn't matter if there is vsync, double rendering, triple rendering, six instances of Call of Duty running at the same time, and a squirrel sitting on the CPU fan. Any of those things would only LOWER the fps if they were affecting it. The stats window shows a HIGHER fps than the game is actually running at.
     
    Khena_B likes this.
  24. Dreamora

    Dreamora

    Joined:
    Apr 5, 2008
    Posts:
    26,601
    Prior to 3.5 you would have seen exactly the behavior you mention.
    But as mentioned, 3.5 added code to the editor to compensate for the editor's impact.

    Since 3.5, the logic in editor play mode works differently and the stats window 'reverts the impact of the editor', showing what the value would likely be if the editor were not around.
    As such, measuring performance in the editor became meaningless for anything beyond 'getting a feel', as it no longer returns the in-editor play mode performance.
    The new value is still more useful than the old one, because the old value was always meaningless: the impact of the editor is pretty heavy (30%+ of performance is eaten by the editor; depending on the machine, that can make up to 70% of the effective frame time), due to which a 'play mode code' based FPS counter is ALWAYS incorrect relative to a standalone build really running on the machine.
    People tend to forget that play mode in the editor also runs inside the editor's Mono VM, with all its overhead, garbage, etc.

    That's the reason why I said you should never try to measure performance in the editor.
    This holds for any kind of performance measurement, be it an FPS counter, the profiler, or whatever.
     
    Last edited: Oct 7, 2012
  25. manutoo

    manutoo

    Joined:
    Jul 13, 2010
    Posts:
    522
    ronan.thibaudau
    Where do you see CPU time / GPU time?
    I can see only "Main Thread" and "Renderer", and I'm not sure what either means. I'd say "Renderer" is the command enqueuing (ie: "draw this there"). "Main Thread" seems to rise to ~15.5 ms when VSync is on, so it likely contains the final SwapFrontBuffer() call. Other than that, I don't know what it contains; being called the main thread, I used to think it was the total time used for the whole game, but now I guess it's not... And there's no explanation of these 2 terms here: http://docs.unity3d.com/Documentation/Manual/RenderingStatistics.html .

    dreamora,
    just to be sure, I re-did my test outside the editor, and I got the _exact_ same FPS.
    There's no reason to have different FPS between the editor and the build, as long as C# isn't involved. In both cases, I'm pretty sure the core rendering and object handling of Unity uses the same optimized libs (and my test just confirmed that).

    I do run my game within the editor with the editor view hidden, hence the matching results.

    And it's fortunate it works like this, else it would require doing a build to test each new optimization, and that would be much more time consuming.

    And I would find it very strange if "the stats window 'reverts the impact of the editor'"; that would make little sense, and just bring confusion, approximation, and bad results. Although that's what we've got right now, I don't think it's the reason behind these wrong results. And if it was done like this, then it would have been way better to just remove the FPS counter and the timings that come with it, as they'd have no meaning. In other words: either you can time only the game rendering without the editor part, or you don't show any number.

    Maybe you should read my earlier post in this topic; seeing the Stats FPS rising when the GPU rendering actually takes longer is not a good sign.
     
    Last edited: Oct 8, 2012
  26. Morning

    Morning

    Joined:
    Feb 4, 2012
    Posts:
    1,141
    I tried a test of my own and I am really confused right now.
    The script reports the highest FPS, about 180,
    the editor reports 110 fps,
    and just for the sake of it, FRAPS reports 90.
    I am not even sure who to trust now.

    I tried a standalone build, and the script reports twice as much FPS as FRAPS (300 in script, 150 in FRAPS). It did the same in the editor, so I assume the script is doing something twice.

    Note this was not done in an empty scene; I had to actually spam it to get reduced FPS, otherwise the numbers were way too high for any testing.

    I am wondering how the editor is calculating FPS, because the results are wrong and thus rather useless. A 20 fps difference is rather important, especially if it's higher than reality.
     
    Last edited: Oct 7, 2012
  27. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    I agree that "reverting the impact of the editor" makes no sense. First off, I don't see how that's possible at all; second, the FPS is very similar in and out of the editor for me, and still way different from the "stats" FPS.
     
  28. Morning

    Morning

    Joined:
    Feb 4, 2012
    Posts:
    1,141
    I don't see how editor overhead is relevant at all when all you're doing is counting the framerate in the game view. If you get 30 fps in the game view, it should say that. No weird numbers.
    Unless Update() is not actually executed once per frame, but that would make no sense. So I don't see how or why the editor is reporting these numbers.

    There was a similar thread some time ago but I don't remember what it turned into.
     
    Last edited: Oct 7, 2012
  29. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    Aye, this was my first guess too, that Update wasn't called once per frame, but I was told no, and the documentation explicitly states it is called on a per-frame basis.

    I'm completely clueless as to what could be happening here, other than a monstrously huge bug that no one caught until now!
     
  30. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    Is there any documentation on this change they made in 3.5? How do they fake the FPS count to compensate for the editor? Do they just display twice the FPS you're actually getting, because they assume it will be about twice as fast in a separate build? That seems pretty sketchy. I'm not sure how they could calculate the imagined editor-free framerate without random guessing, which makes a fake FPS guess much more useless than an actual FPS display. If they are doing some funky math that has nothing to do with frames or seconds to display the frames-per-second value, they should document it.

    The feeling I'm getting from the overall answers in this thread is that no one actually knows what the stats window is showing, or how it works, but everyone seems to agree it's broken and useless. The only difference is that some people are making excuses for Unity under the pretense that the brokenness is intentional and/or that it's ok for it to be broken because there's a profiler that (allegedly) displays actual valid data. Is that correct?

    To be honest, I'm starting to doubt that I can trust the profiler... If Unity can't figure out how to display FPS or basic tick time, why should it be able to do even more complex calculations based on performance timing across multiple objects?
     
    Last edited: Oct 7, 2012
  31. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    I don't know if they do any of that, but in any case, if they do, it's bugged: the numbers just don't match; they're flat-out otherworldly compared to the actual FPS (both in and out of the editor), so there definitely is a bug here.
     
  32. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    Bump; I can't let this die, as other people seem interested too. Could someone from the Unity team shed some light on this issue?
     
  33. Morning

    Morning

    Joined:
    Feb 4, 2012
    Posts:
    1,141
    I made my own FPS script to count FPS, and it reports the same numbers FRAPS does, not the editor.
    Code (csharp):
    using UnityEngine;

    public class FPSCounter : MonoBehaviour {

        int framesAccumulated;
        int framesFinal;
        float oneSecondTimer;
        public Vector4 RectData = new Vector4(0, 0, 100, 35);

        void Update () {
            if (oneSecondTimer < 1) {
                oneSecondTimer += Time.deltaTime;
                framesAccumulated += 1;
            }
            else {
                // A second has elapsed: publish the count and restart the window.
                UpdateFPSCounter();
                oneSecondTimer = 0;
                framesAccumulated = 0;
                oneSecondTimer += Time.deltaTime;
                framesAccumulated += 1;
            }
        }

        void UpdateFPSCounter () {
            framesFinal = framesAccumulated;
        }

        void OnGUI () {
            GUI.Box(new Rect(RectData.x, RectData.y, RectData.z, RectData.w), framesFinal.ToString());
        }
    }
    Only updates once a second but that's enough for testing.
     
  34. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    But the issue still is not "how to measure fps"; that is beyond trivial. The issue is what on earth Unity is reporting if not FPS, and why, and why they haven't fixed it, and why no one is answering about a 2-3 second fix if it's a bug.
    On an even sillier note, I have a scene that worries me because it reports up to 40-50 ms of CPU time per frame in the perf window; open up the profiler and THAT SAME perf window now states 20 ms. What kind of compensation code is in there, and who thought it would be of any help to anyone? Please remove this; this is horrible!
     
  35. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Hey guys, let me try to shed some light on what's behind the Stats window and why that means there are some inherent limitations -- within which, however, it is quite useful.

    The key issue here is an architectural one: when playing in the editor, the editor and the game use the same single engine/runtime. Which means they run their stuff together and batch their stuff together to the GPU.

    This is why profiling a standalone player is the only way to do fully accurate profiling. Then you have isolation and can rely on only running your game stuff.

    So, for profiling in the editor, the best Unity can do is to separate work done for the editor from work done for the game as best as possible and only run the timers when doing stuff for the game.

    Code (csharp):
    // *Conceptually* it looks like this:
    // This is not actual code :)

    StartProfiling();
    UpdateGame();
    RenderGame();
    EndProfiling();

    UpdateEditor();
    RenderEditor();
    In practice, this is quite a bit more complicated but IMO we're doing a pretty decent job at it.

    However, only timing the game stuff doesn't mean the editor stuff isn't there. Which is why you are seeing the issues that this thread is about :)

    Hopefully, this makes sense now. If, conceptually, UpdateGame() and RenderGame() take hardly any time at all, you will see the Stats window report crazy high FPS. The fact that the actual effective FPS (as measured by the FPS counter script) is so much lower is because the editor is still there doing its thing and taking time.

    Note, however, that in this case the "Stats" window is actually more correct than the effective FPS measured with a counter script -- because if run standalone, that game would indeed have high FPS counts.

    And this is another case where the weak spot shows: when you're GPU-bound.

    The thing is we can only buffer up so much stuff to the GFX layer until it will take no more and stall and give the GPU time to catch up. Luckily, it's usually quite a lot you can submit on a Windows desktop machine but it pretty quickly becomes a problem on OSX. Still, the basic issue is there on either platform.

    And when that happens -- which is at a time we can't tell -- then the application will simply sit there and wait. If that stall happens to occur sometime while doing the RenderEditor() thing -- where we have profiling turned *off* -- then that time spent waiting is lost to the "Stats" window. However, exactly during that time it may actually catch up on stuff batched by RenderGame().

    Long-story short: when you're GPU bound, "Stats" will be increasingly off the mark.

    This is difficult to work around. Unity's GFX code is explicitly making sure not to get too far ahead and give the GPU time to catch up when it needs to, but when there's too much stuff in the pipes to even reach those wait points, we're busted nonetheless.

    Sorry for the long-winded post but I hope it helps to understand the strengths and weaknesses of "Stats".

    //Edit: Oh, and just to emphasize something I simply took as a given here: the goal of "Stats" is not to measure effective FPS rates; it is to give a good estimate of how your game will run after being built.
     
    Last edited: Nov 1, 2012
    CarlosAOFL and IgnisIncendio like this.
  36. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    I understand this, and it sounds quite sound; however, there are still 2 things:
    - Could you include an actual FPS counter (none of the specific code you mentioned), and display both FPS and "expected standalone FPS"?
    - This doesn't explain a very weird behavior I noticed: as soon as I open up the profiler, my FPS goes way up / CPU time goes way down (and by way down I mean I've seen things like from 12 ms to 0.2 ms!). Closing the profiler drops the perf again.
    The issue I have with this system is that it tries to guesstimate real performance. I'd rather be told actual performance, even if it's completely off the mark, because then I know it's a worst case; your system isn't, so in some cases it may report too-good performance, as you stated. To me it sounds like a risk and a complex solution for little to no benefit (if I really wanted to know standalone FPS, I could launch standalone). So I'd still "very heavily" be in favor of dropping this completely.

    Edit: also, every time I tested, actual FPS was much closer to standalone FPS (barely slower); it's often something like 70 FPS turning into 85 in standalone while your counter announced 1500!
     
    wlwl2 likes this.
  37. manutoo

    manutoo

    Joined:
    Jul 13, 2010
    Posts:
    522
    Note: I have contacted Support pointing out that thread, that's why we got an answer from Rene Damm.
    I have been told it's better to fill a bug report in this kind of case, though, as it won't unnecessarily involve the Support staff that is not directly competent in such technical case.

    Rene Damm,
    I agree with all ronan.thibaudau said.
    And I'll complement with this :
    1. in the editor, when playing the game with the editor view tabbed off (ie: not visible), the editor overhead is minimal, thus you should display the real total FPS : a slightly too low FPS is way better than a totally wrong "guesstimate"

    2. if the editor view is visible while the game is playing, then either :
      a- turn off the Stat FPS, as it doesn't mean anything anymore
      b- use IDirect3DQuery9 with DX9 and ID3D10Device::Flush with DX10/11 before to call EndProfiling() , so the timing might become meaningful​
    3. in a general way, having a comparable too low FPS is way better than an high incomparable FPS ; ie: if the FPS counts the editor overhead, and I'm trying an optimization that lowers the GPU use, I'll see the FPS raises ; that it should be 115fps up from 105, but shows instead 85 up from 75, doesn't matter much, I'll still know it's faster ; having the current Stats FPS lowering when I actually optimized the real FPS, coz the guesstimate doesn't work correctly will just throw me off and thus make the Stats totally useless (and even dangerously misleading if I'm not aware of their exotic internal functioning, like it must be the case for 95% of Unity users)

    Anyway thanks for your input !

    PS: as a bonus, if you could explain the difference between "Main Thread" and "Renderer", since it's not stated here: http://docs.unity3d.com/Documentation/Manual/RenderingStatistics.html , that'd be great! ;)
    From your post, I'd guess "Main Thread" is actually our whole game thread (including everything: scripts, physics, rendering, etc.), but it'd be nice to have that confirmed.
     
  38. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    First off, there's been a lot of discussion about "Stats" internally too so I think one thing that can be established without question is that there's room for improvement :)

    Hmm, I wonder why you would want to profile the time Unity spends drawing and updating editor windows while running your game. And even then, you can simply put an FPS counter on your camera and get your effective FPS immediately without having to rely on "Stats".
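
    A counter like that is only a few lines of script. Here is a hypothetical sketch (the class name and smoothing factor are my own), using the legacy OnGUI API that was current for this thread:

```csharp
using UnityEngine;

// Hypothetical example: attach to any object to display the game's
// effective FPS, independent of the editor's "Stats" overlay.
public class SimpleFpsCounter : MonoBehaviour
{
    float smoothedDelta;

    void Update()
    {
        // Exponential smoothing so the readout doesn't jitter every frame.
        smoothedDelta += (Time.deltaTime - smoothedDelta) * 0.1f;
    }

    void OnGUI()
    {
        if (smoothedDelta > 0f)
            GUI.Label(new Rect(10, 10, 120, 20),
                      (1f / smoothedDelta).ToString("F1") + " FPS");
    }
}
```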

    The intention of "Stats" is explicitly to try to give feedback only about your game -- not feedback about the editor. That it's not quite doing a perfect job there is pretty obvious by now I guess :)

    I can see where this might be coming from but yes, this is clearly undesirable behavior.

    I don't think that's the right way. With this, you are modifying how your game actually renders instead of simply observing it. By enforcing extra synchronization that you won't do when you play your game normally, you will get wildly different profiling results between standalone and editor (yes, I know, I know, that's a problem we already have, but we want to make it better, right? :)

    Yes, exactly. The render thread executes all the GFX commands batched by the main thread (which does what you said, though some parts like physics and sound processing may also run in separate threads) while running in parallel to it. So, what "Stats" does is it queries the time both threads took on the frame and then simply takes the bigger number as the total frame time.
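
    If that description is accurate, the "Stats" frame time boils down to something like the following. This is a sketch of the rule as described above, not Unity's actual source; the two thread timings are assumed inputs:

```csharp
using UnityEngine;

// Sketch of the timing rule described above (not Unity's actual code):
// the main and render threads run in parallel, so the frame is only as
// fast as the slower of the two.
static class StatsTiming
{
    public static float FrameTimeMs(float mainThreadMs, float renderThreadMs)
    {
        return Mathf.Max(mainThreadMs, renderThreadMs);
    }
}
```

    For example, a 4 ms main thread and a 7 ms render thread give a 7 ms frame time, so "Stats" would show roughly 143 FPS.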

    Yes. So yeah, fact is: "Stats" needs to be more reliable on the timing side (let's not forget it gives you other and more accurate metrics too).
     
  39. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    Well, the issue is that "Stats" currently gives me much, much worse feedback about the actual standalone performance of my game than the naive (editor-inclusive) FPS approach does. And it's really not documented either. We all agree that you shouldn't count on in-editor perf, but if you are going to display a perf number, it's better not to be optimistic about it. I understand the reasoning behind the effort, and that it probably took time, but, looking at this thread, it's definitely not working. A simple, normal FPS indicator (with, if you want, a popup indicating that the game may run faster in standalone) would be much more accurate; at least for me it has proven to be, as I've said: I have cases where my FPS counter says 70-ish, standalone says 85-ish, and the editor says... 1500+! In any case, when you see something named FPS, and not "adjusted guesstimate FPS", you expect to be able to count on it, and that it's a worst case in the editor, so it's always a bad surprise to get much worse (sometimes 10X as bad) performance in the standalone.
     
  40. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    Up. Here's a quick video showcasing the issue with the profiler window; as you can see, it makes things rather... unusable (watch the FPS readout as I open and close the profiler window).
    In the video you see a scene:
    - running at 50 FPS
    - me opening the profiler window, and the scene jumping to 300 FPS
    - me moving the profiler window out of view, just to be sure it isn't speeding things up via some hack that skips rendering everything
    - moving the window back into view
    - closing it, and poof, we're back down to 50 FPS
    - vsync is disabled in all cases, so it's not the profiler disabling vsync or anything like that.

    This is on the latest Unity 4, so things aren't progressing at all.

    EDIT: use the download button in Dropbox; the quality in the embedded player is horrible, so you won't be able to see the FPS otherwise.

    https://www.dropbox.com/s/yfaldgp27dw9qhx/Unity Profiler FPS.wmv

    Note that, this bug aside, I still very much feel you're deeply wrong in your overall position on this. While it's always hard to look at a lot of work you did on a feature (the compensation code) and say "well, we'll dump it, because we were wrong and the codebase is actually better without it", I think that's the right thing to do.
     
  41. dentedpixel

    dentedpixel

    Joined:
    Jul 15, 2012
    Posts:
    683
    While building FPS Graph I ran into the same frustrating discrepancy between the FPS I was calculating and what the Unity Stats window was telling me. However, I think I have come up with a good solution for calculating the estimated frame rate. Basically I use the formula:

    1.0 / (timeAtPostRender - timeAtStartOfUpdate) = frameRate

    I also broke down the rendering time and the miscellaneous tasks in a similar manner.
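
    In MonoBehaviour terms, that measurement could be sketched as follows (hypothetical class name, not FPS Graph's actual code; it must be attached to a camera so OnPostRender fires):

```csharp
using UnityEngine;

// Sketch of the measurement described above: the time from the start of
// Update to the end of rendering (OnPostRender) within the same frame.
public class FrameSectionTimer : MonoBehaviour
{
    float timeAtStartOfUpdate;
    public float estimatedFrameRate;

    void Update()
    {
        timeAtStartOfUpdate = Time.realtimeSinceStartup;
    }

    void OnPostRender()
    {
        float timeAtPostRender = Time.realtimeSinceStartup;
        // Editor overhead falling outside this span is excluded, which is
        // the point: estimate the game's own per-frame cost.
        estimatedFrameRate = 1.0f / (timeAtPostRender - timeAtStartOfUpdate);
    }
}
```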
     
  42. GuisQuil

    GuisQuil

    Joined:
    Jul 3, 2012
    Posts:
    3
    (Hello first post, sorry if this is the wrong place to ask this.)

    I'm evaluating Unity3D 4, I created a project with just one scene added a GUIText to it, and added this script http://wiki.unity3d.com/index.php?title=FramesPerSecond to it.

    The GUIText shows 30.00 FPS while running on both iPhone and iPad; my assumption is that this is the maximum FPS I can get on these devices?
     
  43. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    By default vsync is on. No clue if turning it off works on mobile, but try that (in the quality settings you can change vsync to "Don't Sync"; do that for iPhone and see the actual FPS you're getting).
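
    The scripted equivalent would be something like this (QualitySettings.vSyncCount and Application.targetFrameRate are real Unity APIs; whether the vsync setting is honored on iOS is exactly the open question here):

```csharp
using UnityEngine;

// Sketch: disable vsync and leave the frame rate uncapped from script.
public class UncapFrameRate : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;   // equivalent of "Don't Sync"
        Application.targetFrameRate = -1; // -1 = platform default, no explicit cap
    }
}
```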
     
  44. GuisQuil

    GuisQuil

    Joined:
    Jul 3, 2012
    Posts:
    3
    Thanks, I changed VSync Count under Project Settings / Quality, but no change.

    I found this: https://docs.unity3d.com/Documentation/ScriptReference/Application-targetFrameRate.html

    After adding " Application.targetFrameRate = 60; " to the GUIText script, I now get 60 FPS on both the iPad 1 and the iPhone 4S when the scenes are empty, and two different readings when I add scenes with test geometry.

    What do iOS developers use in Unity to test app performance on the devices?
     
  45. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    If you want to test perf, don't limit it to 60; don't set a target frame rate at all, and keep vsync disabled. You should be getting hundreds of FPS on an empty scene.
     
  46. giyomu

    giyomu

    Joined:
    Oct 6, 2008
    Posts:
    1,094
    ?

    iOS devices won't refresh at more than 60 FPS.
    And if you do not explicitly set the target frame rate from Unity to 60 FPS, the app will run at 30 FPS max.
     
    Last edited: Apr 29, 2013
  47. ronan-thibaudau

    ronan-thibaudau

    Joined:
    Jun 29, 2012
    Posts:
    1,722
    As I said earlier, no clue if it works on mobile, but I'm sure it won't work if he explicitly limits it to 60, at least :)
     
  48. giyomu

    giyomu

    Joined:
    Oct 6, 2008
    Posts:
    1,094
    You can use the internal profiler debug on iOS. If, for example, you do not have the Pro version, it will give you various info like draw calls, update time, frame rate, etc.

    If you have Pro, just use the Unity profiler while you are playing your app.
     
  49. GuisQuil

    GuisQuil

    Joined:
    Jul 3, 2012
    Posts:
    3
    Thanks! If I make the jump, I don't think I'll be using Pro at first, so knowing about the internal profiler helps a lot!
     
  50. Victor-K

    Victor-K

    Joined:
    Nov 10, 2013
    Posts:
    208
    The extremely easy and intuitive Smart FPS Meter allows you to quickly count, retrieve, and display performance info: frames-per-second and millisecond timings, memory usage data, and advanced hardware info.
     
    Last edited: Oct 1, 2014