Sly 2 Band of Thieves no shadows
#11
Yes, the PS2 supports bitmask blending on the GPU side. Modern GPU blending units are just useless for this.

To emulate the shadows properly you end up drawing the image triangle by triangle, which is slow.
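To illustrate (just a sketch of the idea, not actual GSdx code): when the blend has to read the destination and apply a bit mask in the fragment shader, overlapping primitives must see each other's writes, so the renderer ends up issuing one draw per primitive with a barrier in between.

Code:
// Rough sketch of the "triangle by triangle" path, assuming OpenGL 4.5's
// glTextureBarrier (GL_ARB_texture_barrier). Not PCSX2's actual code;
// names like prim_count are illustrative.
#include <GL/glew.h>

void draw_with_in_shader_blending(GLuint vao, int prim_count)
{
    glBindVertexArray(vao);
    for (int prim = 0; prim < prim_count; ++prim) {
        // Make the previous primitive's framebuffer writes visible to
        // texture reads of the same attachment.
        glTextureBarrier();
        // One triangle per draw: the fragment shader samples the current
        // framebuffer, applies the blend formula and the bit mask, and
        // writes the result. Hundreds of tiny draws instead of one big one.
        glDrawArrays(GL_TRIANGLES, prim * 3, 3);
    }
}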
Reply


#12
(05-02-2017, 07:52 PM)Shadorino Wrote: I don't know anything about this, but I think the devs should focus on emulating the shadows on D3D11 instead of increasing speed on OGL. OGL's high resolution rendering is off, and the UI is screwed up as well; it looks like Sly is winking.

Did you try matching the de-interlace modes? I noticed those were set differently. Also, can you turn down the blending accuracy and still get correct shadows? That will save a lot of CPU.

And the reason your GPU is getting maxed out is that you're set to 8x resolution, which is overkill. Go with 3-4x + FXAA; that's usually the sweet spot.
#StopRNG
Reply
#13
(05-03-2017, 06:23 PM)gregory Wrote: Yes, the PS2 supports bitmask blending on the GPU side. Modern GPU blending units are just useless for this.

To emulate the shadows properly you end up drawing the image triangle by triangle, which is slow.

That's honestly something new to me, and I'm not a beginner hardware-wise. If I remember correctly, the GS was very basic yet flexible, even compared to a GeForce 256 DDR, which came out a while before the PS2.
SH lighting could be, and was, done on the VUs, and that was quite special, although a GeForce could do other effects better.

It sounds strange that 17 years later modern GPUs haven't really caught up, even if only in one department.
Reply
#14
To be fair, you can do something with recent GPUs (Maxwell and later; too bad I have a Kepler GPU). At least it would be possible to emulate it better. It would incur a penalty on the GPU, but it wouldn't have any impact on the CPU. Currently the CPU impact is huge.
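For the curious, the "Maxwell and later" part presumably refers to features like fragment shader interlock (GL_ARB_fragment_shader_interlock), which lets overlapping fragments be ordered per pixel so the blend can stay inside one draw call. A hedged sketch of the idea, not GSdx's actual shader:

Code:
// Illustrative GLSL (stored as a C++ string): with interlock, the
// read-modify-write of the framebuffer is serialized per pixel, so the CPU
// submits a single draw and the whole cost lands on the GPU.
static const char* frag_src = R"(
#version 450
#extension GL_ARB_fragment_shader_interlock : require
layout(pixel_interlock_ordered) in;
layout(binding = 0, rgba8) coherent uniform image2D fb;
in vec4 v_color;

void main()
{
    ivec2 p = ivec2(gl_FragCoord.xy);
    beginInvocationInterlockARB();
    vec4 dst = imageLoad(fb, p);
    // Placeholder blend; a real GS blend would also apply the write mask.
    imageStore(fb, p, mix(dst, v_color, v_color.a));
    endInvocationInterlockARB();
}
)";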

But it isn't strange. Bitmasking is useless for modern rendering. SH is a good example, as even the PS3 HD version didn't manage to implement some effects of the PS2 version... Besides, the blending unit is still hardwired on the GPU; nothing has changed since day 1, no flexibility at all.

And it is the same story for all the fixed units: texture filtering, for example, is also emulated in shaders, as is pixel alpha testing.
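To make the "emulated in shaders" point concrete, here is a tiny CPU-side model (my own illustration, simplified) of two steps the fragment shader has to re-create because modern fixed hardware doesn't offer them in this form:

Code:
#include <cstdint>

// Alpha test: reject the pixel before it ever reaches blending.
// ">= threshold" is just one of several GS test modes.
bool alpha_test(uint8_t src_alpha, uint8_t threshold)
{
    return src_alpha >= threshold;
}

// Bitmask write: bits set in 'fbmsk' keep the old framebuffer value, bits
// cleared in it take the new one. Modern fixed-function blending has
// per-channel write masks, but nothing at bit granularity.
uint32_t masked_write(uint32_t dst, uint32_t src, uint32_t fbmsk)
{
    return (src & ~fbmsk) | (dst & fbmsk);
}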
Reply
#15
I have a Kepler and a Maxwell (and a room full of other GPUs/hardware), but I can't code Tongue
There's probably a reason manufacturers such as Nvidia and AMD went/are going this way; I mean, 17 years of advancement is a lifetime in PC hardware.
The PS2 wasn't anything special to be honest (in my eyes), just a Super PS1... a P3 1 GHz/GF2 GTS combo was the better performer overall than the PS2 at launch.
But I know the PS2's architecture is very different and NOT easy to code for; like with the C64, they got quite a lot out of it thanks to creative coders.
Reply
#16
Quote:I mean, 17 years of advancement is a lifetime in PC hardware
It didn't change that much. I mean, the rendering pipeline is basically the same. However, the fast parts of the pipeline were replaced by slow processors (aka shaders). That was a huge change because it gives us the ability to tune some steps of the graphics pipeline. But the fixed units are still mostly the same. The blending unit supports more formats and a couple of extra operations; nothing too fancy for 20 years of evolution. Performance on those fixed units is critical, so you can't bloat them. Bit masking is useless for modern rendering, so they removed it.
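For comparison (written from memory, so treat the details as approximate): a modern fixed-function blender still only lets you pick an equation and factors from a fixed menu, while the GS computes Cv = ((A - B) * C >> 7) + D with the operands chosen per draw, plus the bit mask on top.

Code:
// Illustrative contrast, not emulator code.
#include <GL/glew.h>

// Fixed-function blending today: choose an equation and two factors from a
// fixed list. Richer than in 2000, but still a fixed list.
void setup_fixed_function_blend()
{
    glEnable(GL_BLEND);
    glBlendEquation(GL_FUNC_ADD);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}

// GS blend per channel (approximate): A, B, D are picked from
// {source, destination, 0} and C from {source alpha, destination alpha,
// fixed constant}; 0x80 means 1.0, so the multiplier can exceed 1.0, which
// fixed-function factors cannot express. Hence the shader fallback.
int gs_blend_channel(int A, int B, int C, int D)
{
    return (((A - B) * C) >> 7) + D;
}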

Quote:The PS2 wasn't anything special to be honest (in my eyes), just a Super PS1... a P3 1 GHz/GF2 GTS combo was the better performer overall
You need to compare cost vs power vs performance.
Reply
#17
(05-05-2017, 12:10 PM)gregory Wrote: It didn't change that much. I mean, the rendering pipeline is basically the same. However, the fast parts of the pipeline were replaced by slow processors (aka shaders). That was a huge change because it gives us the ability to tune some steps of the graphics pipeline. But the fixed units are still mostly the same. The blending unit supports more formats and a couple of extra operations; nothing too fancy for 20 years of evolution. Performance on those fixed units is critical, so you can't bloat them. Bit masking is useless for modern rendering, so they removed it.

You need to compare cost vs power vs performance.

Okay, yes, the P3/GF2 combo cost much more than a PS2, more than most could or were willing to afford at the time.
Reply
#18
(05-02-2017, 07:58 PM)CK1 Wrote: You're going to get artifacts/a chromatic aberration effect at higher resolutions; you can't avoid that. The UI is fine for me in OpenGL, btw.

I don't get chromatic aberration, even at 8x IR, as long as the "Sprite/Round Sprite" hack is enabled.
The "Unscale Point and Line" hack has to be enabled in order to get the cel-shading on the characters (unfortunately that is also the hack that screws up the UI on OGL). The PS3 version removed this cel-shading, as long as others effects present on the PS2 version.
However, on PCSX2 if the cel-shading is not emulated you can see a gap on the character's edges where the black cel-shading lines are supposed to be. Problem is, OGL doesn’t render this cel-shading well at high resolution, as shown in my previous screenshots.

(05-03-2017, 07:17 PM)dogen Wrote: Did you try matching the de-interlace modes? I noticed those were set differently. Also, can you turn down the blending accuracy and still get correct shadows? That will save a lot of CPU.

And the reason your GPU is getting maxed out is that you're set to 8x resolution, which is overkill. Go with 3-4x + FXAA; that's usually the sweet spot.

My monitor is 2560x1440, so my goal is 5x IR. I cranked it up to 8x because there was no performance impact on D3D11, and it gave a little bit of anti-aliasing without the need for MSAA, which completely killed performance.
Interlacing is set to Bob tff for D3D because it gave the best image quality. On OGL it is set to Auto (which I'm assuming is Blend tff/bff), because Bob tff made the whole screen bounce up and down.
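For anyone wondering why Bob bounces while Blend doesn't, here is a simplified model of the two modes (my own sketch, not PCSX2's shaders):

Code:
#include <cstddef>
#include <cstdint>
#include <vector>

// Bob: show one field per frame, line-doubled. Top and bottom fields sit
// half a line apart, so the picture shifts up/down every frame (the bounce).
std::vector<uint32_t> bob(const std::vector<uint32_t>& field, int width, int field_lines)
{
    std::vector<uint32_t> frame(static_cast<size_t>(width) * field_lines * 2);
    for (int y = 0; y < field_lines; ++y)
        for (int x = 0; x < width; ++x) {
            uint32_t p = field[y * width + x];
            frame[(2 * y) * width + x]     = p;  // duplicate each field line
            frame[(2 * y + 1) * width + x] = p;
        }
    return frame;
}

// Blend: average the two fields (approximate per-channel average). No
// bounce, but fast motion leaves slight ghosting instead.
std::vector<uint32_t> blend(const std::vector<uint32_t>& top, const std::vector<uint32_t>& bottom)
{
    std::vector<uint32_t> out(top.size());
    for (size_t i = 0; i < top.size(); ++i)
        out[i] = ((top[i] >> 1) & 0x7F7F7F7Fu) + ((bottom[i] >> 1) & 0x7F7F7F7Fu);
    return out;
}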
Reply



