GSdx Memory Coherency
#1
The PS2 has a unified memory system, meaning the CPU and GPU are generally coherent, so a game can silently update a texture at any time. How is this dealt with in PCSX2 in general? And exactly how does this work on the PS2 itself? I suspect the hardware imposes some coherency restrictions with respect to caches.

I'm bringing this up because I was looking at a long-standing bug (the Jak and Daxter eye-rendering bug), which seems very likely to be caused by dynamic texture updates not making it to the GPU.

Are we sure the coherency system is working properly in all cases?

Does anyone know the code path used for texture updates? I'd like to take a look at it.

#2
It is not working, as you correctly guessed. GSdx has cache invalidation for both the emulation side and the actual renderer side, but it is insufficient and hacky, and the way it is implemented means a real fix would require rewriting the entire plugin.
People have tried to fix this for years, but the best we could do was more hacks :(

If you want to check it out, try:
In GSRendererHW.cpp > InvalidateVideoMem and InvalidateLocalMem.

Good luck!
#3
It looks fine in software mode IIRC
#4
Indeed. This is a typical hardware-renderer issue: the dreaded "texture cache" topic as a whole :P
#5
I suspect this needs to be fixed on the emulator side. Can textures be invalidated from there?

A big hack that would fix this would be to force-invalidate every texture all the time. That would be slow, though perhaps acceptable given that PCIe bandwidth is in the GB/s range these days. If I were to add such a hack, where are the hooks for wiring it into the GUI and so forth?

The proper fix would be for the emulator to detect writes to a surface from the CPU side and then invalidate whatever texture is in the cache for that address. This raises two questions: is invalidation exposed from outside the plugin (if not, it needs to be), and is there a good way to detect such writes? Hopefully the PS2 requires some special instruction to flush the surface from the CPU caches out to main memory, but if not...

*EDIT*

Here's a thought: allocate the system RAM with VirtualAlloc, then use GetWriteWatch to see when textures are modified. When a texture is used by the GPU, check the modified-page array that GetWriteWatch produces and update the texture as needed. Then call ResetWriteWatch so it's ready for the next time the memory is written.

This would only work on Windows, but I think there are equivalent APIs for other OSes.

Where in the code is the PS2 memory space allocated? Also, are there compile flags for the target OS? I'd need an #ifdef _WIN32 or something.
#6
There is already some code in GSdx to hack/disable the texture cache. It's painfully slow, but it helps with finding cache issues.
#7
I can't for the life of me figure out where the main memory buffer is allocated. Any hints?
#8
(05-26-2014, 07:06 PM)refraction Wrote: It looks fine in software mode IIRC

To me, this alone already says that things do work properly.
#9
(05-28-2014, 07:57 AM)KrossX Wrote: To me, this alone already says that things do work properly.

No — software mode is coherent automatically, since it can use the textures directly out of memory. The hardware renderer has to stage them into a texture cache, which needs to be notified of any texture updates somehow.
#10
If it can work fine in software mode, then the memory model and/or the interface with the plugin works fine, and it's possible for the plugin to work properly. So it wouldn't be a problem of memory management but rather... just GSdx's buggy texture cache. It could also be unimplemented features, like mipmapping.



