OnRenderImage = slow RenderTexture.GrabPixels ?

Discussion in 'iOS and tvOS' started by 60days, Feb 10, 2012.

  1. 60days

    Joined: Jan 27, 2012
    Posts: 23
    I'm using Qualcomm AR (QCAR) on an iPhone 4 at the moment, and it seems most of the render time is being lost to two 'RenderTexture.GrabPixels' calls taking between 15 ms and 50 ms each (according to the profiler).

    In an OnRenderImage() attached to the main camera, I'm rendering some particles with a code-generated camera, applying bloom to them (2 buffer textures and 3-5 Graphics.Blit calls), combining them with the main camera's render (another blit), then letting QCAR do whatever it does to make the video appear in the background.

    Testing has shown that all the high-demand parts above (bloom, compositing, even rendering the particles at all) have a negligible effect on the framerate. All that's needed to create the two slow GrabPixels calls is an empty OnRenderImage(source : RenderTexture, destination : RenderTexture) {}.
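    In other words, the entire repro is just this, with a completely empty body:

        function OnRenderImage(source : RenderTexture, destination : RenderTexture) {
            // nothing in here at all - no Blit, no reads - and the profiler
            // still shows the two slow RenderTexture.GrabPixels calls on device
        }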

    In the editor the GrabPixels calls don't appear - only when the iPhone is profiled. In the editor (on my crappy Intel IGP) rendering is <1 ms vs 70-150 ms on the iPhone.

    I've Googled RenderTexture.GrabPixels but nothing comes up. Given how quickly I'm reading and writing new textures via blitting, I suspect the GPU is somehow falling back on the CPU to read/copy a RenderTexture at some point (like GetPixels?), but this is my first time using a game engine or JavaScript for anything but web stuff, so I have no real idea. Should I be declaring the source and destination textures differently? Or avoiding OnRenderImage altogether and just using code-based cameras and blitting, something like the sketch below?
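    By "code-based cameras and blitting" I mean roughly this (untested sketch on the main camera; particleCamera, particleRT and compositeMaterial are placeholder names):

        var particleCamera : Camera;      // second camera that only sees the particle layer
        var particleRT : RenderTexture;   // low-res buffer the particles render into
        var compositeMaterial : Material; // additive shader used to composite

        function Start() {
            particleRT = new RenderTexture(240, 160, 16);
            particleCamera.targetTexture = particleRT; // render particles off-screen
        }

        function OnPostRender() {
            // composite the particle buffer over the frame without ever touching
            // OnRenderImage (no idea yet whether this dodges the GrabPixels calls)
            Graphics.Blit(particleRT, null, compositeMaterial);
        }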

    Below are the profilers for the editor and iPhone. The second GrabPixels call is just off-screen on the iOS one, unfortunately:

    Last edited: Feb 10, 2012
  2. Dreamora

    Joined: Apr 5, 2008
    Posts: 26,601
    The problem is more that you have an iPhone 4 and are doing a fullscreen operation, yet the GPU in this device was never designed for the Retina resolution (it's the 3GS GPU - at the 3GS's resolution it was fine and appropriate, but only there).

    Fullscreen effects in general are a no-go on the 4th gen / iPad 1 unless the game is otherwise extremely fillrate-optimized, because every fullscreen redraw uses up one of the 4-5 screen renders you get per frame (to stay at 30 fps) - and I suspect that's exactly the problem you're seeing. You've already used up your fillrate, then you ask for the GrabPixels and in consequence have to wait until the GPU has the time and resources to do it again.
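    Rough arithmetic to put numbers on that budget (mine, from the 4-5 passes figure above):

        960 x 640 (Retina)        = 614,400 pixels per fullscreen pass
        614,400 x 30 fps          = ~18.4 Mpix/s for one pass every frame
        x 4-5 passes per frame    = ~74-92 Mpix/s total

    Every additional fullscreen blit or grab at native resolution burns one of those passes.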

    So yes, avoiding OnRenderImage and post-FX / blitting altogether is the best thing to do for the 4th gen. It's not like you'll be able to put the blit on a transparent surface in front of the camera without taking a massive hit again, directly due to the fillrate.
  3. 60days

    Joined: Jan 27, 2012
    Posts: 23
    I was hoping to run at standard resolution (non-Retina) to make up for the missing fillrate. I don't need to render anything except a single particle emitter and the video background from the AR, so overall the demand is low (other than overdraw, which is why I'm downsampling the particles and bloom even further). At the moment the whole project is at 480x320, the particles are at 240x160, and the bloom runs from 240x160 down to 120x80 - roughly the chain sketched below.
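    (Simplified sketch of that chain; blurMaterial and combineMaterial stand in for the real bloom shaders, and the real version has a couple more blits:)

        var blurMaterial : Material;
        var combineMaterial : Material;

        function OnRenderImage(source : RenderTexture, destination : RenderTexture) {
            // low-res temporary buffers for the bloom
            var half : RenderTexture = RenderTexture.GetTemporary(240, 160, 0);
            var quarter : RenderTexture = RenderTexture.GetTemporary(120, 80, 0);

            Graphics.Blit(source, half);                           // downsample
            Graphics.Blit(half, quarter, blurMaterial);            // blur at low res
            Graphics.Blit(quarter, destination, combineMaterial);  // composite back out

            RenderTexture.ReleaseTemporary(half);
            RenderTexture.ReleaseTemporary(quarter);
        }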

    Am I better off, then, just rendering the particles at full res? I didn't have major framerate problems even with thousands of overlapping particles until I started trying to be clever with downsampled particles (which I assumed would pay off in the end).

    It feels strange that the empty OnRenderImage() slows it to a crawl (8-10 fps), but adding all the full processing as well (bloom, second camera render of particles, compositing) only drops it another 1-2 fps.

    EDIT: disabling QCAR removes all GrabPixels calls, bringing Camera.Render down to 12 ms even with bloom, particles, blitting, etc. Apparently it's some interaction between QCAR and OnRenderImage()... Now I'm totally lost...
    Last edited: Feb 10, 2012
  4. J_P_

    Joined: Jan 9, 2010
    Posts: 1,027
    Just curious, have you tried using QCAR's new GetVideoTextureInfo? It's in their 1.5 beta. As I understand it, instead of rendering the camera feed natively it lets you get the camera feed as a Texture2D and map it onto a mesh surface inside Unity, which might make it easier for Unity to handle. We're using it to apply shaders to the camera feed - once the feed is an ordinary Texture2D, that part is just normal material work, roughly as below.
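    (Rough sketch of the Unity side only; feedTexture, effectMaterial and videoQuad are placeholder names, and the QCAR call that fills the texture is omitted - the 1.5 beta sample covers that part:)

        var feedTexture : Texture2D;    // the Texture2D QCAR renders the camera feed into
        var effectMaterial : Material;  // material with whatever effect shader you want
        var videoQuad : Renderer;       // mesh placed in front of the AR camera

        function Start() {
            effectMaterial.mainTexture = feedTexture;
            videoQuad.material = effectMaterial;
            // the effect shader now runs only on this quad,
            // not as a fullscreen OnRenderImage pass
        }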
  5. 60days

    Joined: Jan 27, 2012
    Posts: 23
    I looked at the sample, but unfortunately 90% of the code seemed to be built around that specific case (distorting a mesh with the texture on every Update), and I couldn't easily tell where the distortion stuff ended and the UV coordinate mapping began (this would be my first time dealing with UVs in code, and I know the plugin uses an unintuitive approach with black bars on the texture). I'll take another, closer look, but I wish there were a simple example of a blit-like approach!
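    (If I'm reading it right, the black bars are just the feed padded out to a power-of-two texture, so the UV mapping should reduce to scaling by the used fraction - an untested sketch, all names hypothetical:)

        // videoWidth/videoHeight: actual camera image size reported by the plugin
        // texWidth/texHeight:     the power-of-two Texture2D it is padded into
        function ApplyVideoUVs(mesh : Mesh, videoWidth : int, videoHeight : int,
                               texWidth : int, texHeight : int) {
            var uScale : float = (1.0 * videoWidth) / texWidth;   // used fraction in U
            var vScale : float = (1.0 * videoHeight) / texHeight; // used fraction in V
            var uvs : Vector2[] = mesh.uv;
            for (var i = 0; i < uvs.Length; i++) {
                uvs[i].x *= uScale; // squeeze UVs so the black padding is never sampled
                uvs[i].y *= vScale;
            }
            mesh.uv = uvs;
        }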