Hello everybody, first post on this forum
I'm really sorry, I already asked this on Unity Answers (http://answers.unity3d.com/questions...ncompress.html), but without any conclusive result, so I'm trying my luck here... Here's the thing:
I'm struggling to get proper texture quality in a 2D game. A picture's worth a thousand words: http://nerik.me/texture.png
It looks perfect on the desktop, but when running on an iPad the gradients look very bad, as if the texture were 16-bit instead of 24-bit. I've tried Automatic Truecolor, 32 bit and 24 bit, always uncompressed, with Point filtering and Clamp wrapping. This is applied to a 1024x768 quad, using the simple unlit shader found at Owlchemy Labs (http://owlchemylabs.com/content). The texture is on a code-generated quad. (Also note that I often use textures generated by PackTextures, though not in this example.)
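For reference, my setup looks roughly like this (a minimal sketch, not my exact code; I'm substituting Unity's built-in Unlit/Texture shader for the Owlchemy one, and the texture field is just a placeholder):

```csharp
using UnityEngine;

// Sketch of the setup: an uncompressed truecolor texture on a
// code-generated 1024x768 quad with an unlit shader.
public class GradientQuad : MonoBehaviour
{
    public Texture2D tex; // imported as Truecolor (32 bit), no compression

    void Start()
    {
        // Match the import settings described above
        tex.filterMode = FilterMode.Point;
        tex.wrapMode = TextureWrapMode.Clamp;

        // Build the quad in code
        Mesh mesh = new Mesh();
        mesh.vertices = new Vector3[] {
            new Vector3(0, 0, 0),    new Vector3(1024, 0, 0),
            new Vector3(1024, 768, 0), new Vector3(0, 768, 0)
        };
        mesh.uv = new Vector2[] {
            new Vector2(0, 0), new Vector2(1, 0),
            new Vector2(1, 1), new Vector2(0, 1)
        };
        mesh.triangles = new int[] { 0, 2, 1, 0, 3, 2 };

        MeshFilter mf = gameObject.AddComponent<MeshFilter>();
        mf.mesh = mesh;

        // Unlit/Texture stands in here for the Owlchemy unlit shader
        MeshRenderer mr = gameObject.AddComponent<MeshRenderer>();
        mr.material = new Material(Shader.Find("Unlit/Texture"));
        mr.material.mainTexture = tex;
    }
}
```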
As suggested on Unity Answers, I also tried a GUITexture, but it doesn't change anything...
Is there really no way to display a proper 32-bit gradient with Unity iOS? Or am I doing something wrong? I doubt this is a hardware limitation...
Thanks for your help
Try setting the 32-bit display buffer?
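In case it helps, it's a checkbox in Player Settings, and as far as I know you can also flip it from an editor script (sketch; assumes your Unity version exposes this PlayerSettings flag):

```csharp
// Editor-only sketch: enable the 32-bit display buffer from script.
// Equivalent to ticking "Use 32-bit Display Buffer" in Player Settings.
using UnityEditor;

public static class DisplayBufferSetting
{
    [MenuItem("Tools/Use 32-bit Display Buffer")]
    static void Enable()
    {
        PlayerSettings.use32BitDisplayBuffer = true;
    }
}
```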
Yeah, that was it! Thanks a lot man!
The manual states that "it has performance implications". Do you know precisely what those are?
My app now crashes at startup, probably because I create a lot of objects using 6-7 2048x2048 atlases. So I'm now trying to create them only when they're needed (it's a huge parallax 2D landscape), but I'm kind of blind here...
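What I'm trying looks roughly like this (a sketch only; atlas names and the load/unload decisions are placeholders for my landscape-section logic):

```csharp
using UnityEngine;
using System.Collections.Generic;

// Sketch of on-demand atlas loading for the parallax landscape.
// Each 2048x2048 uncompressed 32-bit atlas is 2048*2048*4 bytes = 16 MB,
// so keeping all 6-7 resident means ~100 MB of texture data alone.
public class AtlasStreamer : MonoBehaviour
{
    Dictionary<string, Texture2D> loaded = new Dictionary<string, Texture2D>();

    // Load an atlas from a Resources folder the first time it's needed
    public Texture2D GetAtlas(string name)
    {
        Texture2D tex;
        if (!loaded.TryGetValue(name, out tex))
        {
            tex = (Texture2D)Resources.Load(name, typeof(Texture2D));
            loaded[name] = tex;
        }
        return tex;
    }

    // Drop an atlas once its landscape section scrolls out of range
    public void ReleaseAtlas(string name)
    {
        Texture2D tex;
        if (loaded.TryGetValue(name, out tex))
        {
            loaded.Remove(name);
            Resources.UnloadAsset(tex); // frees the texture memory
        }
    }
}
```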
32 bit probably pushed you over your app's memory budget. Try trimming down your memory use. To see how much memory you're using on the device, hold the mouse down over the play button in Xcode and choose Analyse, then choose the top-right box that Xcode presents to you.
Then, while the app is running on the device, you'll see a pie chart of memory, and you'll be able to tell whether it goes over 128 MB or so (if it does, you're finished and will get springboarded at some point).
The performance issue is that it takes a bit more processing to display 32 bits instead of 16. I haven't seen a large hit from this.
I haven't noticed any performance difference between 32 bit and 16 bit. If it exists, it's small.
The performance delta only affects pre-3GS devices. The 3GS and later run natively at 32 bit, but the old MBX GPU in the first two generations is a "16-bit oriented" GPU, so to speak; 32 bit has to be emulated through two 16-bit passes.
Where it makes a difference is when you have many texture switches, since 32-bit textures naturally push through twice as much data.