FFX: Slowdown while displaying particle effects - why?
#11
No, the Demi spell brings my GPU usage to 99%, which is normally ~30%.
#12
This is about texture sizes and the level of detail that has to be rendered when particle effects are drawn over them, which ends up 10x or more what the PS2 ever pushed. We're not talking about native res; we're rendering things at 4x size, and what that actually means is a quadratically larger pixel count.

A 320x240 image, for example, is 76,800 pixels.
A 3200x2400 image is 7,680,000 pixels.

That's 10x scaling per dimension, but the end effect is that the total number of pixels is 100x more. Then apply an effect to each of those pixels on top and you might actually hit the bandwidth limit of a x4-speed PCIe interface.

Or at least that's how I read the thread; someone can correct me if I'm wrong.
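A quick sanity check of the pixel arithmetic above, as a minimal Python sketch:

```python
# Back-of-envelope pixel math: scaling each dimension by k
# multiplies the total pixel count by k^2 (quadratic growth).

def pixels(width, height):
    return width * height

base = pixels(320, 240)        # 76,800 pixels
scaled = pixels(3200, 2400)    # 7,680,000 pixels

print(base, scaled, scaled // base)  # ratio is 100, i.e. 10^2
```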

Edit: There's a lot of processing going on there. I made a quick YouTube video to show the GPU spiking while Demi is cast. Sorry for the constant stutter in the recording; running high res plus high-res screen capture pretty much maxes out my system.


This is uploading now and should be available within an hour of this edit.
#13
(09-05-2011, 06:43 PM)skythra Wrote: This is about texture sizes and the level of detail that has to be rendered when particle effects are drawn over them, which ends up 10x or more what the PS2 ever pushed. We're not talking about native res; we're rendering things at 4x size, and what that actually means is a quadratically larger pixel count.

A 320x240 image, for example, is 76,800 pixels.
A 3200x2400 image is 7,680,000 pixels.

That's 10x scaling per dimension, but the end effect is that the total number of pixels is 100x more. Then apply an effect to each of those pixels on top and you might actually hit the bandwidth limit of a x4-speed PCIe interface.

Or at least that's how I read the thread; someone can correct me if I'm wrong.


John Carmack can render multiple 128000x128000 (16,384,000,000-pixel) textures on today's graphics cards (http://en.wikipedia.org/wiki/MegaTexture). I'm pretty sure modern GPUs can handle the PS2 at 10x; the only reason the GPU can't keep up is that the CPU is too weak to emulate the PS2, causing huge bottlenecks.

#14
Hey memorycard, wait about an hour and you'll see a neat video in the above post.

It shows the GPU spiking every demi cast.

It's not the CPU at all. Although my FFX doesn't slow down from Demi, it's pretty clear that Demi is a big problem for the GPU, especially for anyone with less than a GTX 470.

Now you can render a texture, but what happens when you filter it? Apply something to it?

Edit: The technique proposed in that article isn't implemented in either PCSX2 or the original PS2, so I'm not sure how it's relevant. He uses distance-based level of detail to render those giant textures; not every texture is rendered at its maximum detail ALL the time.
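For illustration, here is a minimal sketch of how distance-based level of detail typically works: pick a mip level from the viewer's distance, so far-away surfaces sample a much smaller copy of the texture. The function names and thresholds are hypothetical, not PCSX2's or id's actual code.

```python
import math

# Hypothetical distance-based LOD: farther surfaces use a coarser
# mip level instead of the full-resolution texture. (Illustrative
# only; neither PCSX2 nor the PS2 does this, which is the point.)

def mip_level(distance, base_distance=1.0, max_level=10):
    if distance <= base_distance:
        return 0                       # full detail up close
    level = int(math.log2(distance / base_distance))
    return min(level, max_level)

def mip_size(full_size, level):
    return max(1, full_size >> level)  # each mip level halves the dimension

print(mip_level(1.0), mip_size(4096, mip_level(1.0)))  # 0 4096
print(mip_level(8.0), mip_size(4096, mip_level(8.0)))  # 3 512
```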

Edit 2:
http://i248.photobucket.com/albums/gg195...a/demi.png
Because I didn't bother showing CPU usage in the YouTube video, that just demonstrates how Demi tracks with GPU usage. In this screenshot you can see the "blips" in GPU load from Demi, but you don't see CPU cores 3 and 4 "blip". They stay varying around the point they always were, with and without Demi.

I've reported my post below; I must have clicked Reply instead of Edit.
#15
It's definitely not the CPU at all, MC.

I did a quick check of how my GTX 460 SE handles Demi casts. As you'd expect, there seem to be a lot of variables at play, so 100% accuracy is near impossible. In this example, the different camera angles were clearly a factor.

- @ 4x Native scaling -
Using Demi would either drop very briefly to ~56fps, or never go below ~80fps. Overclocking my SE to the standard GTX 460 clocks does the trick. Used a mere ~270MB of VRAM in total.

- @ 5x Native scaling -
Using Demi would drop as low as ~31fps, or keep as much as ~42fps. With my max overclock (850/1850), it would drop as low as ~44fps, or keep ~52fps. Used ~370MB of VRAM.

Any GPU with a fraction of that bandwidth would certainly see its limits here. See if native res gives you full speed, and work your way up from there.
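The relative cost of the scaling factors above can be sketched roughly: fill cost grows with the square of the internal scale. This is a simplification (it ignores bandwidth, caches, and shader cost), but it shows why the jump from 4x to 5x is noticeable:

```python
# Rough fill-cost comparison between internal scaling factors.
# Pixel count, and thus fill cost, grows with the square of the
# scale factor, so 5x native touches (5/4)^2 = 1.5625x as many
# pixels as 4x native.

def relative_cost(scale_a, scale_b):
    return (scale_b / scale_a) ** 2

print(relative_cost(1, 4))  # 16.0  - 4x native vs native
print(relative_cost(1, 5))  # 25.0  - 5x native vs native
print(relative_cost(4, 5))  # 1.5625 - 5x vs 4x
```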


EDIT: I just had to come back and add this next part.

I was testing out the same exact scene on my laptop (C2D, GT 230M), and had an interesting experience.

- @ Native res -
Using Demi would either drop to ~54fps, or (with some camera angles) never go below ~70fps. I'm sure a bit of overclocking would take care of this, but then...

- @ Custom 720x720 res -
Using Demi would stay at ~65fps or more. You may be confused right now, because 720x720 is more pixels than native. It does look a bit better, and naturally you'd expect it to run slower, not faster.

I've gone over this several times to see if I was mistaken, or just seeing some fluke, but it comes out the same over and over for me. It seems somehow specific to this scene, because 720x720 usually comes with a performance loss over native res in other GPU-limited games. If this helps anyone else's FPS (using a custom 720x720 rather than native), or makes sense to anyone, please do reply.
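A back-of-envelope comparison of the pixel counts involved, assuming FFX's native output mode is roughly 640x448 (an assumption; the exact PS2 video mode can vary):

```python
# Pixel-count check for the 720x720 oddity. Assuming a native
# mode of about 640x448 (an assumption), a custom 720x720 target
# is only ~1.8x the pixels -- far below 4x or 5x native scaling --
# so a speedup here would point at something other than raw fill
# rate, e.g. a resolution-dependent code path.

native = 640 * 448        # 286,720 pixels (assumed native mode)
custom = 720 * 720        # 518,400 pixels

print(custom / native)    # ~1.81
```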