DX11 shaders & Mac/Linux (OpenGL)

Discussion in 'Shaders' started by MXAdd, Sep 1, 2015.

  1. MXAdd

    Joined: Apr 23, 2015
    Posts: 74
    Hi

    I have a couple of fragment shaders in my game that use bitwise operations on int/uint types.
    The game targets DX11 hardware and runs just fine on PC.
    Now I'm porting it to Mac/Linux (SteamOS), but compiling, for example, this:
    int I = ((int)(pos.x)) & 255;
    leads to this:
    GLSL shader load error (stage 1 shader 109):
    0(61) : error C7548: '&' requires "#extension GL_EXT_gpu_shader4 : enable" before use
    Now my question:
    Is the only way to fix this to write the shader in pure GLSL,
    or is there any way to specify "#extension GL_EXT_gpu_shader4 : enable" while cross-compiling from HLSL?
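    For reference, this particular bitwise AND can also be sidestepped entirely by keeping the math on the float path, which cross-compiles to GLSL without needing GL_EXT_gpu_shader4. A minimal sketch (the helper name is made up):

    // Emulates ((int)x) & 255 without integer bitwise ops, valid for x >= 0,
    // so the generated GLSL stays within pre-GL_EXT_gpu_shader4 features.
    int MaskLow8(float x)
    {
        float f = floor(x);            // integer part, still as a float
        return (int)fmod(f, 256.0);    // x mod 256 == x & 255 for non-negative x
    }

    // usage: int I = MaskLow8(pos.x);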
     
  2. Plutoman

    Joined: May 24, 2013
    Posts: 257
  3. MXAdd

    Joined: Apr 23, 2015
    Posts: 74
    GL_EXT_gpu_shader4 is NOT an OpenGL 4 extension; AFAIK it has been part of the OpenGL 2.x extension set since 2006, and all modern graphics hardware has supported integer operations for roughly 9 years now ...
     
  4. Plutoman

    Joined: May 24, 2013
    Posts: 257
  5. MXAdd

    Joined: Apr 23, 2015
    Posts: 74
    I know that, but I want to avoid it for portability and code-maintenance reasons (keeping two separate complex shaders means doing the work twice whenever something changes, and it's error prone). The Cg/HLSL compiler is fine with ints and the Cg-generated GLSL code is OK, but the driver that consumes it in the end complains about the missing extension. I'm looking for a clean way to inject it, or is that impossible?
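    One way to keep a single source, assuming Unity's built-in platform defines (SHADER_API_D3D11 and friends) are available in the pass, is to hide the difference behind a macro. A rough sketch (the macro name is made up):

    // Use the real bitwise op where integer support is guaranteed (D3D11),
    // and a float-path fallback elsewhere so the generated GLSL needs no extension.
    #if defined(SHADER_API_D3D11)
        #define MASK_255(x) (((int)(x)) & 255)
    #else
        #define MASK_255(x) ((int)fmod(floor(x), 256.0))
    #endif

    // usage: int I = MASK_255(pos.x);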
     
  6. Plutoman

    Joined: May 24, 2013
    Posts: 257
    My guess is no: if the cross-compiler has no mechanism to pass that extension through, then the only way is to add it manually. I'm presuming you've tried simply adding the directive and it doesn't end up in the generated output. At that point, the best advice I can give is to plan on supporting two complex shaders while also filing a bug report with an easily reproducible simple project + scene + shader. If it's something simple, they can usually get it fixed quickly (relatively speaking; it's not a small project, so turnaround will still take some time).

    Shaders in general have a number of quirks that make portability and code maintenance frustrating, so I do understand the pain. If you've worked with compute shaders, append buffers can be quite painful on occasion: I have to maintain a separate, almost identical consume shader just to clear the buffer so I can fill it again the next frame, since I don't have low-level access to the DX11 API to reset the buffer counters and overwrite memory, Unity provides no method to do it, and even the DX11 API doesn't have a simple way to do it.
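
    For what it's worth, a clearing kernel along those lines can be fairly small. A rough sketch (buffer, kernel, and variable names are made up):

    // Companion compute kernel: pops every element that was appended last frame,
    // which brings the buffer's hidden counter back down to zero.
    #pragma kernel ClearAppend

    ConsumeStructuredBuffer<float4> _Particles;   // same buffer, bound as a consume buffer
    int _Count;                                   // number of live elements, set from the CPU

    [numthreads(64,1,1)]
    void ClearAppend (uint3 id : SV_DispatchThreadID)
    {
        if ((int)id.x < _Count)
            _Particles.Consume();                 // discard the value; decrements the counter
    }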