Performance difference between MS .NET and Unity's Mono

Discussion in 'Scripting' started by cmberryau, Mar 26, 2015.

  1. cmberryau


    Joined:
    Mar 6, 2013
    Posts:
    12
    I have an external assembly I am developing in VS 2013, compiled against the .NET 3.5 framework. It works great in Unity, functionality-wise. It's a fork of OsmSharp, a .NET library written in C# for handling GIS data. My work on the project has been making it compatible and usable with Unity. Here's a link:

    https://github.com/cmberryau/OsmSharp

    The solution includes a project called OsmSharp.Test.Performance which, as you can imagine, runs performance tests and spits out results. The tests currently running are data-related, e.g. comparing different data connectors' performance on the same method.

    Anyway, when running a test against SQLite or MySQL, fetching a tile in the test area can take anywhere between 300-500 ms depending on the density of the area, and of course this is heavily machine- and network-dependent. This is while running the application from VS 2013 in debug mode (hitting F5).

    When running the exact same test in Unity3D, using the exact same data source, it takes a minimum of 3000-5000 ms. It's a huge increase in time taken. Exact same code, exact same data source, exact same machine. I've spent a good day going through all possible changes and have yet to find anything that sticks out.

    I've tried running as a standalone, removing UnityVS, not using ILRepack on the external assemblies, changing the .NET compatibility level, and switching from a development build to a production build. Most things I can think of.

    Given that it is futile for me to profile the code running under MS .NET in VS 2013 (it won't exhibit the same performance characteristics as Unity3D's Mono implementation), how can I profile the code inside the assembly running in Unity3D to find the performance difference? I tried using the profiler, and it errors saying there are too many stacks in the frame and that it's discarding the frame (rather frustrating too, since it removes potentially pertinent frames).
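    In the meantime, a poor man's alternative to the profiler is to wrap the suspect calls in a `Stopwatch` and log the elapsed time. A minimal sketch (the `Measure` helper and the things you time with it are illustrative, not part of OsmSharp):

    ```csharp
    using System;
    using System.Diagnostics;

    public static class Timing
    {
        // Runs the given function, logging how long it took in milliseconds.
        // Inside Unity you would use Debug.Log instead of Console.WriteLine.
        public static T Measure<T>(string label, Func<T> func)
        {
            var sw = Stopwatch.StartNew();
            T result = func();
            sw.Stop();
            Console.WriteLine("{0}: {1} ms", label, sw.ElapsedMilliseconds);
            return result;
        }
    }
    ```

    You would then wrap progressively smaller pieces of the tile-fetch path, e.g. `Timing.Measure("fetch", () => source.GetTile(...))` (method name hypothetical), until the slow spot is isolated.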

    I've run into performance differences between Unity3D's Mono implementation and MS .NET before, but this kind of a difference is rather crippling given the potential application of the assembly.

    Further on this: when running the code externally from Unity using Unity's included mono.exe, the performance results are within reasonable limits. I've confirmed that it is in fact using Mono by outputting the result of RuntimeEnvironment.GetRuntimeDirectory(), and it's the Mono assembly folder shipped with the Unity Editor.
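    For anyone wanting to reproduce the check, printing the runtime directory is a one-liner:

    ```csharp
    using System;
    using System.Runtime.InteropServices;

    public static class RuntimeCheck
    {
        public static void Main()
        {
            // Prints the directory of the runtime currently executing this code.
            // When launched via Unity's bundled mono.exe, this points into the
            // Mono folder that ships with the Unity Editor.
            Console.WriteLine(RuntimeEnvironment.GetRuntimeDirectory());
        }
    }
    ```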
     
    Last edited: Mar 26, 2015
  2. cmberryau


    Joined:
    Mar 6, 2013
    Posts:
    12
    Found a fix for this after filling my code with Stopwatches. It turns out Unity's Mono implementation does not optimise string appending as well as the MS .NET implementation does.
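    For anyone hitting the same wall: the usual culprit is repeated `+=` on a string inside a loop, which allocates and copies a brand-new string every iteration (O(n²) copying overall). Switching to `StringBuilder` amortises the appends. A minimal before/after sketch (not the actual OsmSharp code):

    ```csharp
    using System.Text;

    public static class StringAppendDemo
    {
        // Each += allocates a new string and copies the old contents -
        // quadratic work overall, and especially slow on Unity's old Mono.
        public static string Slow(int n)
        {
            string s = "";
            for (int i = 0; i < n; i++)
                s += "x";
            return s;
        }

        // StringBuilder grows an internal buffer, so each append is
        // amortised O(1); call ToString() once at the end.
        public static string Fast(int n)
        {
            var sb = new StringBuilder(n);
            for (int i = 0; i < n; i++)
                sb.Append('x');
            return sb.ToString();
        }
    }
    ```

    Both methods produce identical output; only the allocation behaviour differs, which is exactly the kind of thing an older Mono runtime fails to optimise away.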
     
  3. lordofduct


    Joined:
    Oct 3, 2011
    Posts:
    8,531
    It won't be super accurate, but it'll be better than comparing against MS .NET.

    You can build and profile your code in MonoDevelop, or get a VS plugin to build to Mono (like this one).

    Then you just build against the version of Mono Unity uses (well... technically it uses a modified version of Mono, but it still gets you 'close enough').

    And yeah, the version of Mono Unity uses is very old. It does NOT have a lot of the optimizations that now exist in both .NET and Mono. It can get REALLY annoying.