
WebGL and texture generation

Discussion in 'Shaders' started by crushy, Apr 13, 2017.

  1. crushy

    Joined: Jan 31, 2012
    Posts: 35
    I'm having a very specific issue which I'm sure has an easy solution that just isn't coming to me right now.

    I was porting a sound visualizer to WebGL. The way it worked was that a MonoBehaviour would feed a one-dimensional texture (with no fixed size) to a shader, and the shader would use this data to display certain effects on screen.
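
    Roughly, the setup looks like this. This is a heavily simplified sketch rather than the actual project code; the class name, the _SpectrumTex property and the fixed textureSize are just placeholders:
    Code (CSharp):
    using UnityEngine;

    // Heavily simplified sketch of the data path described above (not the actual project code).
    public class SpectrumTextureFeeder : MonoBehaviour
    {
        public Material visualizerMaterial;       // material using the visualizer shader
        const int textureSize = 512;              // placeholder size; the real one isn't fixed

        Texture2D tex;
        float[] samples = new float[textureSize]; // filled with audio data each frame (source omitted here)
        Color[] pixels = new Color[textureSize];

        void Start()
        {
            // "1D" texture: textureSize wide, 1 pixel high
            tex = new Texture2D(textureSize, 1, TextureFormat.RGBA32, false);
            tex.filterMode = FilterMode.Point;
            tex.wrapMode = TextureWrapMode.Clamp;
            visualizerMaterial.SetTexture("_SpectrumTex", tex); // placeholder property name
        }

        void Update()
        {
            // pack each sample into the red channel and push the texture to the GPU
            for (int i = 0; i < textureSize; i++)
                pixels[i] = new Color(samples[i], 0f, 0f, 1f);
            tex.SetPixels(pixels);
            tex.Apply();
        }
    }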

    However, this doesn't work on WebGL (despite working on every other platform, including the editor running OpenGL). For some reason the shader's texture data is always 0. I've narrowed it down to the texture format: I'm using RGBA, while this page suggests I should be using something like DXT for WebGL. That seems like the most likely cause of the issue.

    So it should be easy enough: just create a texture with DXT encoding. However, I can't call SetPixels on a DXT texture.

    Does anyone have any idea how I can work around this? Perhaps another way of feeding the data to the shader would work. It seems like a fairly obvious issue.
     
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    You do not want to use a DXT texture for this. For one, DXT doesn't support textures that are only 1 pixel high (the minimum height is 4 pixels), and WebGL doesn't support DXT textures that aren't powers of two. A "1 dimensional" RGBA texture is what you want, but you might be running into WebGL's limitations on non-square and non-power-of-two textures.

    If you want to use a non-power-of-two texture in WebGL (and OpenGL ES 2.0) it must be in an RGBA format, have no mipmaps, and have its wrap mode set to clamp.

    Texture2D myTex = new Texture2D(width, height, TextureFormat.RGBA4444, false); // (width, height) order
    myTex.wrapMode = TextureWrapMode.Clamp; // disable repeat
     
    Last edited: Apr 13, 2017
  3. crushy

    Joined: Jan 31, 2012
    Posts: 35
    I was already using those settings. Unfortunately, that's also not one of the texture formats allowed for SetPixels :(

    Code (CSharp):
    Unsupported texture format - needs to be ARGB32, RGBA32, RGB24, Alpha8 or one of float formats
    UnityEngine.Texture2D:SetPixels(Color[])
    For reference here's my texture creation code:
    Code (CSharp):
    tex = new Texture2D( textureSize, 1, TextureFormat.RGBA32, false);
    tex.filterMode = FilterMode.Point;
    tex.wrapMode = TextureWrapMode.Clamp;
    tex.anisoLevel = 0;
    Edit: I just noticed I'm using linear space for the texture and the WebGL build settings allow for Linear colour spaces. Brb, doing a build.

    Edit2: No, same result :/
     
    Last edited: Apr 13, 2017
  4. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    Sorry, RGBA32 is the correct format, not RGBA4444; that's a typo on my part.

    You might try using tex2Dlod in the shader? However, as best I understand it, what you're doing should work. :/
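
    It might also be worth checking on the C# side whether the data is already all zeros before it ever reaches the texture, and whether the texture actually holds the values after Apply. Something like this rough debugging sketch (pass in whatever sample buffer and texture you're filling):

    using UnityEngine;

    public static class VisualizerDebug
    {
        // Rough sketch: log the largest input sample and the first pixel after Apply().
        public static void CheckDataPath(float[] samples, Texture2D tex)
        {
            float maxSample = 0f;
            for (int i = 0; i < samples.Length; i++)
                maxSample = Mathf.Max(maxSample, Mathf.Abs(samples[i]));
            Debug.Log("max input sample: " + maxSample);             // 0 means the data is dead before it hits the texture
            Debug.Log("pixel 0 after Apply: " + tex.GetPixel(0, 0)); // call this after tex.Apply()
        }
    }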
     
  5. crushy

    Joined: Jan 31, 2012
    Posts: 35
    No good. I just don't understand what's different between the two platforms other than the texture encoding.
     
  6. crushy

    Joined: Jan 31, 2012
    Posts: 35
    Oh, stupid me. It's actually GetSpectrumData returning an array of all 0s on WebGL. This is so backwards that it never occurred to me the issue would be there rather than in the far more complex shader/texture interaction.
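
    For anyone else who runs into this, the read that comes back empty on WebGL is roughly the following (simplified; audioSource and the FFT window here are just whatever the visualizer happens to use):
    Code (CSharp):
    float[] samples = new float[textureSize];
    audioSource.GetSpectrumData(samples, 0, FFTWindow.BlackmanHarris);
    // On the WebGL build every element of samples stays 0, so the texture (and the shader)
    // only ever sees 0; the texture/shader path itself was fine all along.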