Mobile ETC2 as default texture compression on Android

Discussion in '5.2 Beta' started by mh114, Aug 18, 2015.

  1. mh114

    mh114

    Joined:
    Nov 17, 2013
    Posts:
    295
    Hey, I noticed that ETC2 is to be the default texture compression for RGBA textures on Android. Which makes sense, as RGBA4444 is horrible without dithering (I still maintain that we need built-in dithering for 16-bit!).

    Anyway, since ETC2 is not very commonly supported (starting with OpenGL ES 3.0, maybe some newer GLES 2 devices too?), it is a somewhat curious choice. Does Unity decompress the ETC2 texture during load on devices that do not support it? Then again, that would kinda defeat the whole purpose of compressing the texture.. Maybe we're better off just providing several APKs with different texture overrides like before?

    Or maybe the release notes are wrong and ETC2 is the default only when targeting GLES 3.0? Has anyone tried this: what happens when you run an ETC2-compressed Unity app on a device that does not support it?
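
    For reference, a quick runtime check along these lines should tell whether a given device actually has ETC2 (rough, untested sketch; the enum name is taken from the scripting reference):

    using UnityEngine;

    // Logs whether the GPU reports ETC2 support and which graphics API is in use.
    public class Etc2Check : MonoBehaviour
    {
        void Start()
        {
            bool etc2 = SystemInfo.SupportsTextureFormat(TextureFormat.ETC2_RGBA8);
            Debug.Log("ETC2_RGBA8 supported: " + etc2 +
                      ", graphics API: " + SystemInfo.graphicsDeviceVersion);
        }
    }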
     
    haywirephoenix and MrEsquire like this.
  2. Astro75

    Astro75

    Joined:
    Dec 18, 2014
    Posts:
    48
    We use DXT5 on our sprites to save space in the APK. When the device does not support your texture type, Unity uses software decompression on load and your textures become 32-bit RGBA.
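
    Roughly what we do on the importer side, if it helps (sketch from memory, not our exact script; the 2048 max size is just an example value):

    using UnityEditor;
    using UnityEngine;

    // Editor helper: force DXT5 as the Android platform override on the selected textures.
    public static class ForceDxt5OnSelection
    {
        [MenuItem("Tools/Force Android DXT5 On Selected Textures")]
        static void Apply()
        {
            foreach (Object obj in Selection.objects)
            {
                string path = AssetDatabase.GetAssetPath(obj);
                var importer = AssetImporter.GetAtPath(path) as TextureImporter;
                if (importer == null)
                    continue;

                importer.SetPlatformTextureSettings("Android", 2048, TextureImporterFormat.DXT5);
                AssetDatabase.ImportAsset(path, ImportAssetOptions.ForceUpdate);
            }
        }
    }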
     
  3. mh114

    mh114

    Joined:
    Nov 17, 2013
    Posts:
    295
    Yep, I actually knew that about DXT, I'm just curious if the same applies to ETC2.. Guess I'll try it myself unless somebody already has and reports back here. :)

    I'm still leaning towards several different APKs, although it seems like a hassle. Still, the memory hit is going to be quite substantial when all the supposedly compressed textures bloat back to 32-bit while keeping the compression artifacts, so you're effectively getting the worst of both worlds!
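
    To put rough numbers on it (ignoring mipmaps): a 1024x1024 RGBA texture is about 1 MB as ETC2 (8 bits per pixel), 2 MB as RGBA4444 and 4 MB as uncompressed RGBA8888, so every ETC2 texture that gets decompressed on an older device ends up at four times its compressed size, artifacts included.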
     
  4. Astro75

    Astro75

    Joined:
    Dec 18, 2014
    Posts:
    48
    We were using 32-bit textures until we hit the 50 MB Play Store limit and our downloads dropped. So we are still using the same amount of RAM as before. And DXT5 looks really good. Other compression methods have ugly artifacts on semi-transparent textures.
     
    mh114 likes this.
  5. mh114

    mh114

    Joined:
    Nov 17, 2013
    Posts:
    295
    Alright, that sounds like a reasonable strategy, didn't think of that! Good option to keep in mind, thanks! :)
     
  6. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Right now we build separate asset bundle builds for each of the texture compression types Android supports, check the device maker/model at startup, and choose the asset bundle manifest for the compression type that the GPU supports. As you can imagine, this makes our build times really long (we have several thousand bundles, plus we do variants for different size options).
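
    The selection step at startup is basically this shape (simplified sketch, not the actual production code; we key the choice off maker/model tables, but a plain capability check would be the simpler version):

    using UnityEngine;

    // Picks a bundle variant suffix based on which texture formats the GPU reports.
    public static class TextureVariantPicker
    {
        public static string PickVariant()
        {
            if (SystemInfo.SupportsTextureFormat(TextureFormat.DXT5))
                return "dxt";
            if (SystemInfo.SupportsTextureFormat(TextureFormat.PVRTC_RGBA4))
                return "pvr";
            if (SystemInfo.SupportsTextureFormat(TextureFormat.ATC_RGBA8))
                return "atc";
            return "etc"; // ETC1 (plus separate alpha) as the universal fallback
        }
    }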

    We currently support DXT, ATC, PVR, and ETC as our possible formats. Originally we also built for ETC2 and ASTC, but the ETC2 build kept crashing on large textures (4k) and the ASTC build took approximately 13 hours to build the first time. ETC2 was slower than ASTC as well. So we have disabled them for now.

    If ETC2 is going to be the new default, please, please fix the import speed. It's unbelievably slow. (I will still have to provide ETC1 formats for low-end devices as well, so, ugh..)

    The combination of multiple compression formats, duplicate copies of assets for bundle variants, and slow texture compression times is absolutely killing us right now.
     
    yanivng likes this.
  7. mh114

    mh114

    Joined:
    Nov 17, 2013
    Posts:
    295
    Sounds quite painful, @jbooth .. I don't want to bloat the APK with all the textures compressed in various ways (nor do I want to download the correct bundle from a server), so I'll probably go with separate APKs for different formats.. Which is not very convenient, either.

    This is much easier on the Apple side, as all the devices support PVRTC (of course ASTC is probably better, but PVR is good enough for my purposes). Too bad that there's no common compressed format with alpha on Android until GLES 3.0...
     
  8. mh114

    mh114

    Joined:
    Nov 17, 2013
    Posts:
    295
    It seems that Unity indeed decompresses unsupported textures, and that applies to ETC2 as well. At least it works and doesn't refuse to load the textures, so that's good.

    @Astro75's DXT5 trick may be a good alternative for 32-bit textures, to keep the APK size down. Hopefully 5.3 brings LZ4-compressed assets so the APK size should shrink for everything (although lossy compression like crunched DXT5 ought to shave off more bytes still).

    Yet I'm still undecided what to do; ETC2 support keeps getting better as more and more devices support GLES 3.0, but in the meantime it may still be beneficial to provide multiple APKs and let Google Play do the filtering..
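
    If I do go that route, scripting the per-format builds should be something like this (untested sketch; the scene path and output paths are placeholders):

    using UnityEditor;

    // Builds one APK per Android texture compression subtarget.
    public static class SplitApkBuilds
    {
        static readonly string[] levels = { "Assets/Scenes/Main.unity" }; // placeholder scene

        [MenuItem("Build/Android ETC + ETC2 APKs")]
        static void BuildAll()
        {
            Build(MobileTextureSubtarget.ETC,  "Builds/game_etc1.apk");
            Build(MobileTextureSubtarget.ETC2, "Builds/game_etc2.apk");
        }

        static void Build(MobileTextureSubtarget format, string path)
        {
            EditorUserBuildSettings.androidBuildSubtarget = format;
            BuildPipeline.BuildPlayer(levels, path, BuildTarget.Android, BuildOptions.None);
        }
    }

    Each APK then needs its own version code, and as far as I know Unity writes the matching <supports-gl-texture> tag into the manifest based on the subtarget, which is what Google Play uses for the filtering.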
     
    Last edited: Aug 20, 2015
  9. Twistplay

    Twistplay

    Joined:
    Dec 6, 2012
    Posts:
    36
    Having ETC2 as Unity's default does create some challenges. Whilst it does reduce the APK size, it has the potential to increase memory usage on devices that don't support GLES 3.0/ETC2 (and thousands of devices don't), causing out-of-memory crashes.

    Previously alpha textures would default to 16-bit on Android, which would work universally (albeit with the usual banding artefacts due to lack of dithering).

    Now, if it finds an ETC2 texture at runtime on a device that doesn't support it, presumably it will unpack it to 32-bit RGBA, i.e. twice the memory consumption of before.

    You can choose the older ETC as the override in the build settings, but unfortunately this then overrides everything, including textures you've specifically asked to be DXT5, for example.
     
  10. florianpenzkofer

    florianpenzkofer

    Unity Technologies

    Joined:
    Sep 2, 2014
    Posts:
    479
    The change of the default to ETC2 was made because RGBA4444 textures just don't give acceptable quality in many cases.
    Right now there is no global override to get the old behavior, but you can still change the format per texture.
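    For example, an AssetPostprocessor can switch a whole folder back to the old 16-bit behavior (minimal sketch; the folder path is just a placeholder):

    using UnityEditor;

    // Forces the automatic 16-bit import format for textures under a given folder.
    public class Force16BitPostprocessor : AssetPostprocessor
    {
        void OnPreprocessTexture()
        {
            if (!assetPath.StartsWith("Assets/Textures/UI"))
                return;

            var importer = (TextureImporter)assetImporter;
            importer.textureFormat = TextureImporterFormat.Automatic16bit;
        }
    }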

    On devices that don't support ETC2 (typical ES2 GPUs like Mali-400, Adreno 2xx, SGX, Tegra 3/4) we decompress to RGBA8888. The decompression code was optimized, so you shouldn't see longer load times. But you may run into rendering performance or out-of-memory problems.
    We do not decompress the texture if you force the GraphicsLevel to ES2 on a device that has an ES3 driver.
     
    x0r and JoRouss like this.
  11. Twistplay

    Twistplay

    Joined:
    Dec 6, 2012
    Posts:
    36
    Agreed, the quality of 16-bit is often sucky, but for applications where the art style means this is acceptable, people should be aware of the potential out-of-memory and performance implications, particularly as it is not trivial to get the original behaviour back.

    I'm curious about the decompression optimisations: does it keep a local copy in memory in the compressed format, with the RGBA8888 version existing only in OpenGL's memory space? I ask because this is what profiling seems to suggest - the memory profiler reports the original smaller size of the texture, not the larger size, i.e. it's under-reporting the true memory impact (logged as case 725786).

    Also worth noting that I don't think it gives the "... texture format is not supported, decompressing texture" warning for ETC2 textures when ETC2 is not supported, like it does with DXT/ATC/PVR.
     
  12. mh114

    mh114

    Joined:
    Nov 17, 2013
    Posts:
    295
    I've been saying this for ages, but just adding some simple Floyd-Steinberg or similar dithering would improve RGBA4444 tremendously. Granted, it's not very useful for standalones, but on mobile devices (especially older ones) dithered 16-bit textures are a very good option for full RGBA (provided you dither the alpha channel too). It offers a very nice quality boost over the horrible banding that often results from going 16-bit.

    I made my own asset processor that dithered the 16-bit stuff, but that approach had some problems, so it would be awesome if Unity had 16-bit dithering built in. Consider it at least; it would be quite a low-hanging fruit to pick. :)
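
    The processor I have is roughly this shape (simplified, not my exact code; the folder name is just an example): import at RGBA32, Floyd-Steinberg dither down to 4 bits per channel (alpha included), then convert to RGBA4444 so the banding turns into noise.

    using UnityEditor;
    using UnityEngine;

    public class Dither4444Processor : AssetPostprocessor
    {
        void OnPreprocessTexture()
        {
            if (!assetPath.Contains("/Dither16bit/"))
                return;
            // Import at full precision so there is something to dither from.
            ((TextureImporter)assetImporter).textureFormat = TextureImporterFormat.RGBA32;
        }

        void OnPostprocessTexture(Texture2D texture)
        {
            if (!assetPath.Contains("/Dither16bit/"))
                return;

            int w = texture.width, h = texture.height;
            Color[] px = texture.GetPixels();

            for (int y = 0; y < h; y++)
            {
                for (int x = 0; x < w; x++)
                {
                    int i = y * w + x;
                    Color oldC = px[i];
                    // Quantize each channel to 4 bits (16 levels).
                    Color newC = new Color(
                        Mathf.Round(oldC.r * 15f) / 15f,
                        Mathf.Round(oldC.g * 15f) / 15f,
                        Mathf.Round(oldC.b * 15f) / 15f,
                        Mathf.Round(oldC.a * 15f) / 15f);
                    px[i] = newC;
                    Color err = oldC - newC;

                    // Diffuse the quantization error to neighbouring pixels (Floyd-Steinberg weights).
                    if (x + 1 < w)              px[i + 1]     += err * (7f / 16f);
                    if (y + 1 < h && x > 0)     px[i + w - 1] += err * (3f / 16f);
                    if (y + 1 < h)              px[i + w]     += err * (5f / 16f);
                    if (y + 1 < h && x + 1 < w) px[i + w + 1] += err * (1f / 16f);
                }
            }

            texture.SetPixels(px);
            EditorUtility.CompressTexture(texture, TextureFormat.RGBA4444, TextureCompressionQuality.Best);
        }
    }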
     
    ds44 and iivo_k like this.