Why is 1.5.0 about 33% slower than 1.4.0?
#1
With PCSX2 1.4.0 I can get 60+ FPS on God of War at 2x Native Resolution with the OpenGL HW Renderer.

With PCSX2 1.5.0 (latest from git on the PPA) I get about 40 FPS instead.

It has been a long time between 1.4.0 and the latest in the 1.5.0 line, so it's difficult to tell what could have changed between these versions to cause such a performance hit. I have been switching back and forth to try and see if there is any configuration difference, but that does not appear to be the cause.

Does anyone know what could have caused this? Is 1.5.0 just more accurate, and thus quite a bit slower than 1.4.0, or is there something else going on?


#2
In order to help you further with your problem, please make sure the following are all provided.
  • Your hardware specs - CPU, Graphics Card, Memory, Operating System.
  • The version of PCSX2 you are using.
  • Any non default settings you are using.
  • What games you are trying to play and if you are playing them from ISO or DVD.

Thank You.
#3
OS: I have tried this on both Ubuntu Linux 17.10 and Windows 10 with the same results.
Hardware: CPU is Intel Core i7-7567U (~2270 STR), Graphics is Intel Iris Plus 650 (~1500 G3D), 16 GB RAM
Version: 1.4.0 and 1.5.0 (latest dev build)
Non-Default settings: 2x Native Resolution
Games: God of War from an ISO

As stated above, this happens in both Windows and Linux when using the OpenGL HW Renderer. However, when using the Direct3D 11 HW Renderer on Windows, I can get (unlocked) frame rates of 120+ FPS in the same scene, still at 2x Native.
#4
What about Blending Accuracy in OpenGL? Reducing it will probably improve your FPS. As a side note, OpenGL has higher accuracy than DX11 because it emulates effects that DX11 does not. This may translate into reduced performance in some scenarios.
#5
(04-07-2018, 10:37 PM)CK1 Wrote: What about Blending Accuracy in OpenGL? Reducing it will probably improve your FPS. As a side note, OpenGL has higher accuracy than DX11 because it emulates effects that DX11 does not. This may translate into reduced performance in some scenarios.

Changing the Blending Accuracy had very little performance impact. I tried every combination of settings I could think of: I adjusted everything that was not already off or at its lowest setting, and in the best case I was able to net another ~5 FPS. That's still a far cry from what it was in 1.4.0 (or what it is in 1.5.0 with DX11).
#6
In 1.5.0, depth emulation is enabled by default; to disable it, you have to enable an option under the GSdx HW hacks.
In 1.4.0, to have depth emulation, you had to enable an option.

Depth emulation costs performance and, as far as I know, it works only in OGL mode.
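For reference, this toggle lives in GSdx's ini file as a user hack on 1.5.0 dev builds. A sketch of the relevant lines follows; the key names are from memory and may differ between builds, so treat them as assumptions rather than exact settings:

```ini
; GSdx.ini — hypothetical sketch; key names may vary between 1.5.0 dev builds
UserHacks = 1                       ; HW hacks must be enabled for the toggle below to apply
UserHacks_DisableDepthSupport = 1   ; 1 = disable depth emulation (1.5.0 defaults to depth enabled)
```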
#7
(04-08-2018, 07:31 AM)vsub Wrote: In 1.5.0, depth emulation is enabled by default; to disable it, you have to enable an option under the GSdx HW hacks.
In 1.4.0, to have depth emulation, you had to enable an option.

Depth emulation costs performance and, as far as I know, it works only in OGL mode.

I already tried playing with depth emulation in 1.5.0 (at least on Linux). It does not have much of an impact on frame rate, but it does introduce some pretty serious graphical glitches when it's off. These glitches do not appear in 1.4.0, and I have not changed any settings (except resolution) from the defaults.



