multi-threaded GSdx
#11
That's very strange, I don't see the link between the threads and the renderer! It could explain the Linux case too.

Hmm, multi-threading seems to work (with a standalone GSdx).
Extra threads: 0
Quote:The 12 frames of the scene were rendered in 634ms
A mean of 52.833332ms per frame
The 12 frames of the scene were rendered in 630ms
A mean of 52.500000ms per frame
The 12 frames of the scene were rendered in 612ms
A mean of 51.000000ms per frame
The 12 frames of the scene were rendered in 627ms
A mean of 52.250000ms per frame
The 12 frames of the scene were rendered in 640ms
A mean of 53.333332ms per frame
The 12 frames of the scene were rendered in 614ms
A mean of 51.166668ms per frame
The 12 frames of the scene were rendered in 610ms
A mean of 50.833332ms per frame
The 12 frames of the scene were rendered in 622ms
A mean of 51.833332ms per frame
The 12 frames of the scene were rendered in 616ms
A mean of 51.333332ms per frame
The 12 frames of the scene were rendered in 626ms
A mean of 52.166668ms per frame

Mean: 621.888916ms
Standard deviation: 9.266959ms
Mean per frame: 51.824078ms (19.296051fps)
Standard deviation per frame: 0.772247ms

Extra threads: 3
Quote:The 12 frames of the scene were rendered in 304ms
A mean of 25.333334ms per frame
The 12 frames of the scene were rendered in 291ms
A mean of 24.250000ms per frame
The 12 frames of the scene were rendered in 290ms
A mean of 24.166666ms per frame
The 12 frames of the scene were rendered in 278ms
A mean of 23.166666ms per frame
The 12 frames of the scene were rendered in 279ms
A mean of 23.250000ms per frame
The 12 frames of the scene were rendered in 278ms
A mean of 23.166666ms per frame
The 12 frames of the scene were rendered in 276ms
A mean of 23.000000ms per frame
The 12 frames of the scene were rendered in 274ms
A mean of 22.833334ms per frame
The 12 frames of the scene were rendered in 296ms
A mean of 24.666666ms per frame
The 12 frames of the scene were rendered in 283ms
A mean of 23.583334ms per frame

Mean: 282.777771ms
Standard deviation: 7.284348ms
Mean per frame: 23.564814ms (42.436150fps)
Standard deviation per frame: 0.607029ms

As you can see, I got a 2x boost in fps.
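For reference, a minimal sketch (not the replayer's actual code) of how such summary numbers are derived from the per-run times, assuming per-frame time = run time / 12 and fps = 1000 / per-frame time; the replayer may average over more runs than the ten quoted above, so the output will not match the figures exactly:

Code:
// Aggregates a list of run times (ms for 12 frames each) into mean,
// standard deviation, per-frame time and fps.
#include <cmath>
#include <cstdio>
#include <vector>

int main()
{
    const std::vector<double> runs_ms = {634, 630, 612, 627, 640,
                                         614, 610, 622, 616, 626};
    const int frames = 12;

    double sum = 0.0;
    for (double r : runs_ms)
        sum += r;
    const double mean = sum / runs_ms.size();

    double var = 0.0;
    for (double r : runs_ms)
        var += (r - mean) * (r - mean);
    const double stddev = std::sqrt(var / runs_ms.size());

    const double per_frame = mean / frames;      // ms per frame
    const double fps       = 1000.0 / per_frame; // frames per second

    std::printf("Mean: %fms\n", mean);
    std::printf("Standard deviation: %fms\n", stddev);
    std::printf("Mean per frame: %fms (%ffps)\n", per_frame, fps);
    return 0;
}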
#12
(03-22-2014, 08:23 PM)Cole Wrote: Here are a few pictures of Biohazard 4, which I have played in OpenGL and it runs great.

In hardware mode we have the known errors, but in software mode it runs like on the original hardware.

You're making really great progress.

Could you remove the limiter and then gradually increase the number of threads? There is a plugin, I don't know the name of it, that gives the minimum frame rate. If you could run that as well, that would be useful.

[edit] @Gregory: Any suggestions on how to get that sort of boost? A guide?
i7 4930k @4.3, 4x4 GB RAM @2133 (15-15-15-27, quad channel), EVGA 570 @stock, Arch 64b.
#13
(03-23-2014, 12:56 AM)nstgc Wrote: Could you remove the limiter and then gradually increase the number of threads? There is a plugin, I don't know the name of it, that gives the minimum frame rate. If you could run that as well, that would be useful.

I will test it today.
MB: Gigabyte GA-X79-UP4 - Intel® X79 Express Chipset
CPU: Intel® Core™ i7-4960X CPU @ 3.60 GHz / TBF2.0 @ 4.00 GHz
RAM: 4x 8192 - 32 GB Corsair Vengeance® Pro Series DDR3-2133 PC3-17066 CL11
GFX: Gigabyte® Nvidia® GeForce™ GTX TITAN X - 12 GB GDDR5 (GV-NTITANXXTREME-12GD-B)
OS - MS - Windows 10 - Professional
MS - XBox Series X - With Activated Developers Mode




#14
(03-23-2014, 12:56 AM)nstgc Wrote: Could you remove the limiter and then gradually increase the number of threads? There is a plugin, I don't know the name of it, that gives the minimum frame rate. If you could run that as well, that would be useful.

I have a small problem: under OpenGL, the frame limiter doesn't work in software mode.
#15
What do you mean?
#16
It does not work: I press F4 and it stays at 59.95 fps under the OpenGL software renderer.
#17
Force your driver to disable VSync (and try the PCSX2 menu too; I'm not sure the shortcut works).
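For anyone who wants to do it programmatically rather than in the driver control panel, a minimal Windows/WGL sketch (assuming an OpenGL context is current on the calling thread and the driver exposes WGL_EXT_swap_control; this is not PCSX2/GSdx code):

Code:
#include <windows.h>

// Entry point provided by the WGL_EXT_swap_control extension.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool disable_vsync()
{
    // Extension functions must be resolved at run time.
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        reinterpret_cast<PFNWGLSWAPINTERVALEXTPROC>(
            wglGetProcAddress("wglSwapIntervalEXT"));
    if (!wglSwapIntervalEXT)
        return false;                     // extension not supported

    return wglSwapIntervalEXT(0) == TRUE; // interval 0 = VSync off
}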
#18
(03-25-2014, 08:01 PM)gregory Wrote: Force your driver to disable VSync (and try the PCSX2 menu too; I'm not sure the shortcut works).

It works, thank you for the information.
However, I can't provide the test results until tomorrow.
#19
OK, I have now made a small benchmark.

I tested with the SSE4.1 GSdx plugin r5932 in both OpenGL hardware and software mode.

[Image: 8tdm5hja.png]
#20
The extra threads are only used by the SW renderer, so basically there is no difference. Do you see more threads in a system-monitoring tool like top?
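On Linux, one quick way to check is to count the entries under /proc/<pid>/task, which is the same per-thread view that top -H shows; a minimal sketch (not part of PCSX2), passing the emulator's pid on the command line:

Code:
#include <cstdlib>
#include <filesystem>
#include <iostream>
#include <string>
#include <unistd.h>

// Each thread of a process gets its own directory under /proc/<pid>/task.
static int count_threads(pid_t pid)
{
    namespace fs = std::filesystem;
    int threads = 0;
    for (const auto& entry :
         fs::directory_iterator("/proc/" + std::to_string(pid) + "/task"))
    {
        (void)entry;
        ++threads;
    }
    return threads;
}

int main(int argc, char** argv)
{
    // Default to this process if no pid is given.
    const pid_t pid = (argc > 1) ? std::atoi(argv[1]) : getpid();
    std::cout << "pid " << pid << " has " << count_threads(pid)
              << " threads\n";
    return 0;
}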