GSdx Memory Coherency
Ugh. The basic situation is that the hardware renderer's "texture cache" (I hate calling it that, because it's not an accurate name for what it actually does) is a speedhack which INTENTIONALLY doesn't invalidate everything it should. In reality, textures are re-uploaded to the GS all the time, framebuffers and depth buffers are downloaded and re-uploaded as necessary in some games, and a naive implementation is very, very slow. So instead it makes assumptions when a game accesses the same GS local memory address again.
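To illustrate the idea (names and structure are mine, not GSdx's actual code): a speculative cache keyed on the GS base address reuses a previously converted surface on every repeat access, and only a subset of writes ever triggers invalidation. The slow-but-correct alternative would re-check the memory contents on every lookup.

```cpp
#include <cassert>
#include <cstdint>
#include <unordered_map>

// Hypothetical cached surface; illustrative only.
struct CachedTexture {
    uint32_t base_addr;   // GS local memory base address
};

class SpeculativeCache {
    std::unordered_map<uint32_t, CachedTexture> entries_;
public:
    int uploads = 0;      // counts the slow convert+upload path

    // On a texture fetch, assume data at the same base address is unchanged
    // unless something explicitly invalidated it.
    CachedTexture& lookup(uint32_t base_addr) {
        auto it = entries_.find(base_addr);
        if (it != entries_.end())
            return it->second;  // speedhack: reuse without re-checking memory
        ++uploads;              // miss: convert and upload (the expensive part)
        return entries_.emplace(base_addr, CachedTexture{base_addr}).first->second;
    }

    // Called for writes the cache chooses to track; a fully correct cache
    // would invalidate on *every* overlapping write, which is what the
    // speedhack intentionally skips.
    void invalidate(uint32_t base_addr) { entries_.erase(base_addr); }
};
```

The bugs discussed below are what happens when a write the cache ignored actually did change the texels.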

A proper solution would be based on the uploaded data but that's a complicated topic that I don't trust myself to explain my ideas on right now.

Sponsored links

Let me jump in too.

(05-28-2014, 03:31 PM)pseudonym Wrote: So instead it makes assumptions when the game accesses the same GS local memory address again.

I can't really read the GSdx code, but that's what I thought I'd read somewhere too. Also some hashing of some kind, to check the content? It is basically a cache, avoiding PCI latency, uploads and conversion.

That Jak eye bug is something I can't really explain. How can that texture disappear? Is the assumption faulty? I mean, you can have one texture use a single bit pattern with two palettes: I could mask out the iris of that texture with a white palette, or, the other way around, do a "subrect" palette update for the iris rendering. Would the plugin notice that change? If there is some conversion for the cache, is it perhaps wrong in that way? There should be two cached items then.

Just a wild idea. I don't hesitate to post.

Edit: hassle - given it notices the palette update, this should render correctly with 8-bit textures enabled, perhaps with 4-bit textures too. Or could the partial palette update slip through? How are palettes cached, plugged in and set for rendering? Well... I can't read that code.
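As an aside on the palette speculation above, here is a sketch (illustrative names, not GSdx's actual scheme) of how keying the cache on both the texel data and the CLUT contents would give the "two cached items" behavior: the same bit pattern drawn with two palettes hashes to two distinct keys, so a partial palette update that changes the CLUT bytes changes the key too.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// FNV-1a over a byte vector; a stand-in for whatever hash the real cache uses.
static uint64_t hash_bytes(const std::vector<uint8_t>& v) {
    uint64_t h = 1469598103934665603ull;
    for (uint8_t b : v) { h ^= b; h *= 1099511628211ull; }
    return h;
}

// Combine the texel-data hash with the palette (CLUT) hash, so identical
// texels with different palettes produce different cache entries.
static uint64_t cache_key(const std::vector<uint8_t>& texels,
                          const std::vector<uint8_t>& clut) {
    return hash_bytes(texels) ^ (hash_bytes(clut) * 0x9E3779B97F4A7C15ull);
}
```

If the real cache keyed only on the texel data, the stale entry built with the old palette would keep being reused after a CLUT update, which matches the kind of bug being described.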
I don't think the eyes have disappeared, they're just stuck too high up in the skull.
Ehh... no. I just shouldn't guess wildly based on Google Images. Trash mashed together.

Bits of the thread: that Jak 1 bug. Is it a value conversion issue? The VU is basically calculating the addresses, effectively a relative memory offset for the eyes, and that offset is possibly wrong.

The other bug doesn't make sense either: one eye is rendered, the rest is black. It's basically using the same texture, but possibly not the same memory location? Well... is that the cache making faulty assumptions?
(05-28-2014, 02:01 AM)refraction Wrote: There is already some code in GSDX to hack/disable the texture cache. It's extremely painfully slow, but it helps in finding cache issues.

Where do I activate said hack?
Well, I can confirm that it's a texture cache issue - disabling the cache fixed it, although it made everything very slow. Now, if only I could get GetWriteWatch to work properly...
Sorry for the late response, guess you found it.
Ah, I think I figured out why it wasn't working - I was using the offset rather than a real pointer. Oops.

So, the next question - how do I get the actual pointer out of the offset? Also, how does the virtual memory system affect this? I'm assuming there's no reason a texture can't span multiple pages, but that means I have to worry about checking every page. Finally, how do I get the actual texture size in bytes?
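For what it's worth, here's how I'd sketch the arithmetic behind those questions, assuming GS local memory is one flat host allocation at `base` (all names here are illustrative). Write-watch style APIs want real host addresses, so the offset has to be rebased first; a texture spanning `[offset, offset + size)` can straddle page boundaries, so every page in that range needs checking; and the byte-size formula below is only a lower bound, since real GS formats are swizzled into blocks rather than stored linearly.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// An offset into GS local memory is not a pointer; rebase it onto the
// host allocation before handing it to any OS-level API.
static uint8_t* offset_to_ptr(uint8_t* base, size_t offset) {
    return base + offset;
}

// Every page overlapped by [offset, offset + size_bytes) must be checked,
// not just the page containing the start offset.
struct PageSpan { size_t first_page; size_t page_count; };

static PageSpan pages_touched(size_t offset, size_t size_bytes, size_t page_size) {
    size_t first = offset / page_size;
    size_t last  = (offset + size_bytes - 1) / page_size;
    return { first, last - first + 1 };
}

// Rough size for a linear layout: width * height * bits-per-pixel, rounded
// up to whole bytes. Swizzled GS formats can occupy more than this.
static size_t tex_size_bytes(size_t width, size_t height, size_t bits_per_pixel) {
    return (width * height * bits_per_pixel + 7) / 8;
}
```

For example, a 200-byte texture starting at offset 4000 with 4096-byte pages crosses from page 0 into page 1, so both pages need a dirty check.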
Doesn't Gabest usually maintain GSdx? I really hope a good D3D coder comes along to maintain it.
I really can't figure out the memory system - too many layers. What path does a texture take from system memory to eventually a surface in the texture cache? I need to be able to catch it at the top.
