I mean, aside from the obvious (rendering or not rendering them), how does this setting affect games and emulation speed?
Also, is it better to leave this on or off?
I've always wondered about this myself; can someone shed some light on the subject?
First, it's not so obvious. 8-bit textures are always rendered.
The difference is that, by default, GSdx converts 8-bit textures to 32-bit textures internally, which bloats texture memory usage when they're uploaded to your video card's memory. It's faster for the video card to render from 32-bit textures, though, because sampling an 8-bit texture requires an inline shader decoder.
So if a game uses a lot of 8-bit textures, the converted copies can flood video memory and make it run very slowly; such games run much faster with 8-bit textures allowed, because the extra shader work costs less than running out of video memory. But if a game only uses a few 8-bit textures, allowing them can slow things down because of the extra shader work.
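To put rough numbers on the VRAM difference, here is a quick back-of-the-envelope comparison for a single 256x256 texture (the sizes are illustrative; GSdx's actual allocation details may differ):

```python
# VRAM cost of one 256x256 texture, kept 8-bit vs. expanded to 32-bit.
width, height = 256, 256
palette_entries = 256  # an 8-bit index selects one of 256 palette colors

# Kept as 8-bit indices: one byte per texel plus a 256-entry RGBA palette.
indexed_bytes = width * height * 1 + palette_entries * 4

# Expanded to 32-bit RGBA: four bytes per texel, no palette needed.
rgba_bytes = width * height * 4

print(indexed_bytes)  # 66560  (~65 KiB)
print(rgba_bytes)     # 262144 (256 KiB, roughly 4x larger)
```

Multiply that near-4x factor across hundreds of textures and it's easy to see how the expanded path can exhaust video memory in palette-heavy games.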
So, basically, 'Allow 8-bit textures' unchecked = conversion from 8-bit to 32-bit textures at the plug-in level, which requires a lot of VRAM to store the converted 32-bit textures and some processor time for the conversion.
Checked = conversion to 32-bit at the hardware level, which requires less VRAM (the texture stored in VRAM stays 8-bit) and less processor time. However, it needs a fast GPU core, since the GPU does the conversion as it samples the texture.
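For illustration, the plug-in-level conversion boils down to a palette lookup done once per texel at upload time. A toy Python sketch (my own names and data, not actual GSdx code):

```python
def expand_to_rgba(indices, palette):
    """Expand an 8-bit indexed texture to 32-bit RGBA by replacing
    each index with its palette entry, once, at upload time."""
    return [palette[i] for i in indices]

# A 4-entry palette of (R, G, B, A) colors and a tiny 2x2 texture.
palette = [(0, 0, 0, 255), (255, 0, 0, 255),
           (0, 255, 0, 255), (0, 0, 255, 255)]
indices = [0, 1,
           2, 3]

rgba = expand_to_rgba(indices, palette)
print(rgba[1])  # (255, 0, 0, 255): index 1 became a full 4-byte color
```

After this step the video card never sees the palette; it just samples ordinary 32-bit texels, which is why rendering is cheap but the texture in VRAM is four times larger.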
If you have a game that shows ~99% CPU in GSdx's title bar, try toggling the option on and off and see which gives better speed in that game/scene; it behaves differently across games and systems :P
For me, 8-bit textures create random graphical glitches and don't help any of my games, so I always turn the option off.
(10-06-2009, 01:46 AM)dr_thrax Wrote: So, basically, 'Allow 8-bit textures' unchecked = conversion from 8-bit to 32-bit textures at the plug-in level, which requires a lot of VRAM to store the converted 32-bit textures and some processor time for the conversion.
Checked = conversion to 32-bit at the hardware level, which requires less VRAM (the texture stored in VRAM stays 8-bit) and less processor time. However, it needs a fast GPU core, since the GPU does the conversion as it samples the texture.
Actually, what I got from the post was this:
Allow 8-bit Textures checked: 8-bit textures are processed natively, using less VRAM but adding more work for the inline shader.
-Good for games that use many 8-bit textures, to prevent running out of video memory.
Unchecked: 8-bit textures are converted to 32-bit textures. These are easier for the video card to render, but take up more VRAM.
-Good for games with few 8-bit textures.
Feel free to correct me if I'm wrong here.
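The 'checked' path can be sketched the same way: the indices stay 8-bit in VRAM and the palette lookup happens every time a texel is sampled, which is the inline-shader-decoder cost mentioned above. A toy Python analogue (not real shader code; names are mine):

```python
def sample_indexed(indices, palette, x, y, width):
    """Per-sample palette lookup, done by the shader at draw time
    instead of once at upload time."""
    return palette[indices[y * width + x]]

palette = [(0, 0, 0, 255), (255, 255, 255, 255)]
indices = [0, 1,
           1, 0]  # 2x2 black/white checkerboard, one byte per texel

print(sample_indexed(indices, palette, 1, 0, 2))  # (255, 255, 255, 255)
```

The lookup is cheap per sample, but it happens for every sampled texel on every frame, which is why this path wants a fast GPU core.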
... and, as far as I know, the only games that really benefit from enabling 8-bit textures are Xenosaga 2 and maybe Odin Sphere (but I might be wrong about the latter; it could have been something else that made Odin Sphere so slow).
I don't have Xenosaga II, but yeah, it helps in Odin Sphere.
Fight Night Round 3 and Grand Theft Auto: Vice City Stories also improved for me with 8-bit textures enabled.
I just get massive slowdown with 8-bit textures checked -_-;