RenderTexture strangeness - Help appreciated!

Discussion in 'General Graphics' started by SunnySunshine, Jun 21, 2017.

  1. SunnySunshine

    Joined:
    May 18, 2009
    Posts:
    976
    I can't seem to figure out how RenderTextures behave with regard to linear/sRGB. It seems that while the Linear/sRGB flag alters the actual texture contents, it makes no difference when the texture is read.

    To test this, I have this very simple scene (Linear project setup):


    The "quad gradient" is drawn with a very simple custom unlit shader that simply outputs uv.x.

    Camera sRGB draws the "quad gradient" into an sRGB RenderTexture.
    Texture result:


    Camera Linear draws the "quad gradient" into a Linear RenderTexture.
    Texture result:


    (Notice the difference in color, which is to be expected.)
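
    In code, the setup is roughly equivalent to this (a minimal sketch; the resolution, format, and the srgbCamera/linearCamera references are placeholders, not my actual scene):

    Code (CSharp):
    // Sketch of the setup: two render textures that differ only in their
    // sRGB/Linear read-write flag, each assigned to one of the cameras.
    var srgbRT = new RenderTexture(256, 256, 24, RenderTextureFormat.ARGB32, RenderTextureReadWrite.sRGB);
    var linearRT = new RenderTexture(256, 256, 24, RenderTextureFormat.ARGB32, RenderTextureReadWrite.Linear);

    srgbCamera.targetTexture = srgbRT;
    linearCamera.targetTexture = linearRT;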

    The "quad texture" is shaded with a regular unlit shader, using the RenderTexture from one of the above.
    Camera default draws this Quad to screen.
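
    The hookup is just this (quadRenderer is a placeholder reference to the "quad texture" renderer, and srgbRT/linearRT are the render textures from the sketch above):

    Code (CSharp):
    // Assign one of the two render textures to the quad's material.
    quadRenderer.material.mainTexture = srgbRT; // or linearRT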

    And here's the strange part: the quad texture looks the same regardless of which RenderTexture is used:

    When using the linear RenderTexture:


    When using the sRGB RenderTexture:


    They're both the same!

    I found out that if you write the render texture into a Texture2D with the linear flag set to false, it will behave as expected:

    Code (CSharp):
    private Texture2D WriteIntoTexture(RenderTexture rt)
    {
        // linear: false tags the Texture2D as sRGB, so the GPU applies the
        // sRGB-to-linear conversion when the texture is sampled.
        var tex = new Texture2D(rt.width, rt.height, TextureFormat.ARGB32, true, false);

        // Read the render texture back into the Texture2D.
        var previous = RenderTexture.active;
        RenderTexture.active = rt;
        tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        tex.Apply();
        RenderTexture.active = previous;

        return tex;
    }
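
    Usage then looks something like this (again, quadRenderer is a placeholder reference to the "quad texture" renderer):

    Code (CSharp):
    // Copy the linear render texture into an sRGB-tagged Texture2D,
    // then use that texture on the quad instead of the RenderTexture itself.
    var tex = WriteIntoTexture(linearRT);
    quadRenderer.material.mainTexture = tex;
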
    But should this really be necessary? I'd prefer not to waste resources on doing that copy and conversion.
     
    Last edited: Jun 21, 2017