..:: PCSX2 Forums ::..

Full Version: What are the differences between the GSdx plugins?
I'm curious about the differences between the current GSdx 5334 (MSVC 16.00) plugins. By default AVX is selected, but what about SSE2/SSE41/SSSE3?

Can anyone explain them to me or point me to a post that covers this?
Basically, the difference is in the (possible) use of the new instructions each expansion brings. You can look at it as AVX having all the instructions from the previous sets plus some new ones (though reportedly this will not change things for hardware mode).

Having the plugin for a newer set means you need a CPU able to "understand" those new instructions, so you can't use the AVX plugin on a CPU without support for it. The same goes for SSE4.x, which lacks support on all AMD CPUs before Bulldozer; in that case the use of SSE2 is forced.

There is some gain in the newer versions, but not something that would make a weak machine magically reach playable speed.

Bottom line: go for the most recent plugin your CPU can manage and you are safe.
Great, thanks for the replies and information. Seems like with an Intel i7 I should just stick with AVX.
(05-06-2013, 03:47 PM)nozomi Wrote: [ -> ]Great, thanks for the replies and information. Seems like with an Intel i7 I should just stick with AVX.

Yep, although a plugin is a bit different from a game. In a game that offers both DirectX 9 and DirectX 10 (not quite the same thing in this case, although new DX versions will eventually incorporate the new instructions from newer CPUs), the newer version tends to be a lot heavier, because the game is probably using features that were impossible or too hard to implement with the older API. That's not the case for PCSX2, where what dictates the load is the game being played, so DirectX 10 may have some nice extras and still perform better than DX9 in most cases. Even so, for a game or two, DX9 may be the better choice. As you can see, it's totally different from dealing with only one game: the PCSX2 plugin needs to work well with all the PS2 games out there, no matter how different they are from one another.

The plugin is meant to handle many different games and cope with whatever each game tries to do. PCSX2 has the luxury of providing different plugin builds that make use of new instructions that can do in one step, or fewer steps, what would otherwise require a longer sequence of code. That is where the possible gain comes from, and it depends more on the game being played than on the plugin itself.
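
To give a rough idea of the "fewer steps" part, here is a small illustration in C (not actual GSdx code): a byte-wise select that needs three SSE2 instructions can be done with a single SSE4.1 instruction. Compile with something like gcc -msse4.1.

/* Illustration only -- not actual GSdx code.
   Byte-wise select: take bytes from 'b' where the mask byte is 0xFF, else from 'a'. */
#include <stdio.h>
#include <string.h>
#include <emmintrin.h>   /* SSE2   */
#include <smmintrin.h>   /* SSE4.1 */

/* SSE2 path: three instructions -- (mask & b) | (~mask & a) */
static __m128i blend_sse2(__m128i a, __m128i b, __m128i mask)
{
    return _mm_or_si128(_mm_and_si128(mask, b), _mm_andnot_si128(mask, a));
}

/* SSE4.1 path: the same job in a single instruction */
static __m128i blend_sse41(__m128i a, __m128i b, __m128i mask)
{
    return _mm_blendv_epi8(a, b, mask);
}

int main(void)
{
    __m128i a    = _mm_set1_epi8(1);
    __m128i b    = _mm_set1_epi8(2);
    __m128i mask = _mm_set_epi8(-1, 0, -1, 0, -1, 0, -1, 0, -1, 0, -1, 0, -1, 0, -1, 0);
    unsigned char r1[16], r2[16];
    _mm_storeu_si128((__m128i *)r1, blend_sse2(a, b, mask));
    _mm_storeu_si128((__m128i *)r2, blend_sse41(a, b, mask));
    printf("both paths agree: %s\n", memcmp(r1, r2, 16) == 0 ? "yes" : "no");
    return 0;
}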

PC games don't do it, and for a reason (most of the time): they are meant to sell the greatest possible number of copies, so they avoid instruction sets that many CPUs would not have while still being powerful enough to run the game without them. Eventually new games will bring tougher minimum requirements, but for now that's how it is.

For example, if they made a version of the engine that used AVX (or even SSE4), they would be ruling out all AMD CPUs up to (and including) the Phenom II, which they won't find an attractive idea.
I think it should be possible to detect at run-time whether the hardware supports the instruction set and decide whether to use it.
(05-07-2013, 01:09 PM)xemnas99 Wrote: [ -> ]I think it should be possible to detect at run-time whether the hardware supports the instruction set and decide whether to use it.

It does; the result is actually printed to the console, but the decision of which plugin to use is left to the user (still, the most advanced one the CPU allows is already selected by default). Of course, trying to run the AVX or SSE4.1 build on a CPU that can't do it will be denied.
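
For the curious, here is a minimal sketch of that kind of run-time detection in C, using the GCC/Clang builtin that wraps the CPUID checks (this is just an illustration, not PCSX2's actual code):

#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();   /* populate the feature flags (GCC/Clang builtin) */
    printf("SSE2  : %s\n", __builtin_cpu_supports("sse2")   ? "yes" : "no");
    printf("SSSE3 : %s\n", __builtin_cpu_supports("ssse3")  ? "yes" : "no");
    printf("SSE4.1: %s\n", __builtin_cpu_supports("sse4.1") ? "yes" : "no");
    printf("AVX   : %s\n", __builtin_cpu_supports("avx")    ? "yes" : "no");
    return 0;
}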
(05-07-2013, 07:54 PM)nosisab Ken Keleh Wrote: [ -> ]It does; the result is actually printed to the console, but the decision of which plugin to use is left to the user (still, the most advanced one the CPU allows is already selected by default). Of course, trying to run the AVX or SSE4.1 build on a CPU that can't do it will be denied.

Yes. I mean PC games could do it the same way. They don't need to stick to SSE2 or below; they could detect the hardware at run-time, such as when launching the game, and select the plugin/mode/whatever accordingly.
(05-07-2013, 10:01 PM)xemnas99 Wrote: [ -> ]Yes. I mean PC games could do it the same way. They don't need to stick to SSE2 or below; they could detect the hardware at run-time, such as when launching the game, and select the plugin/mode/whatever accordingly.

Those are instructions the CPU can execute, so yes, if the application ships modules compiled for specific architectures this can be done, but that's not how it happens most of the time.
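
As a purely hypothetical sketch in C of what that dispatch could look like: detect the CPU once at start-up and then call through a function pointer into the best available path (the function names here are invented for illustration; this is not how any particular game or GSdx does it, and it again relies on the GCC/Clang builtins):

#include <stdio.h>

/* Two versions of the same routine; in a real build each would be compiled
   with its own architecture flags. Here they just announce themselves. */
static void render_sse2(void) { puts("using the SSE2 path"); }
static void render_avx(void)  { puts("using the AVX path");  }

typedef void (*render_fn)(void);

static render_fn pick_renderer(void)
{
    __builtin_cpu_init();
    if (__builtin_cpu_supports("avx"))
        return render_avx;   /* the CPU understands AVX, take the newer module */
    return render_sse2;      /* safe fallback every x86-64 CPU can run */
}

int main(void)
{
    render_fn render = pick_renderer();
    render();
    return 0;
}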

A lot more common is having one or more engine versions, one based on DX9, another based on DX10 and so on, but I have yet to see different modules for those instruction sets... and for a reason.

There is no sense in coding extra modules unless the new instructions bring a real advantage, meaning not just a minor performance gain but things that would be hard or not even feasible to implement with the older set. If for no other reason, because dealing with the testing and debugging of different module versions that do the same thing is something the devs won't do.

Then what remains is defining the architecture at compile time so the most advanced set is used... but that's exactly what the minimum hardware requirement is for, and that minimum requirement is what defines the set the application is compiled against.
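
A small illustration of that compile-time side in C: the instruction set is fixed by the compiler flags (-msse4.1, -mavx, /arch:AVX and so on), and predefined macros let the source see which minimum was chosen (macro names as defined by GCC/Clang):

#include <stdio.h>

/* The set is decided when the binary is built, not when it runs --
   this is the "minimum hardware requirement" in practice. */
#if defined(__AVX__)
#  define BUILD_ISA "AVX"
#elif defined(__SSE4_1__)
#  define BUILD_ISA "SSE4.1"
#elif defined(__SSSE3__)
#  define BUILD_ISA "SSSE3"
#else
#  define BUILD_ISA "SSE2"
#endif

int main(void)
{
    printf("compiled for %s\n", BUILD_ISA);
    return 0;
}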

Notice how this approach is completely different from the plugin's (GSdx in this case). While a game's development focuses on getting the best out of one unique, specific application, the plugin is designed to deal with code it does not know a priori, coming from many different games. It has to deal with whatever comes from the game, or more precisely with whatever comes from the emulator's core, which itself already has to handle game code that isn't known in advance.

The emulator works very differently from a game: it dynamically recompiles chunks of code (one of the most common techniques nowadays), so it could make use of different instruction sets for that generic translation... but ask the core developers whether they are willing to do it.

That is done in GSdx, first to decide which DirectX version to use, and then to decide which instruction set to use. But notice how merely compiling with AVX support did nothing to help hardware mode... that's because the plugin simply does not use AVX-exclusive instructions in that mode; at the moment it uses them only in software mode.

Hopefully this helps you understand that it's not enough to compile with a flag: to get the best out of a new instruction set, the program, plugin or whatever needs to actually use it. There may be some natural gain at compile time independent of the high-level source code, but it's not enough of a gain to make game developers go looking for that kind of trouble.

PS: Besides, it's not like game developers are desperately trying to squeeze out the last drop of performance; the emulator's devs are, which is the reason for all those different GSdx versions, and even then the gain is not that big.

PPS: If you meant the CPU will choose the instructions at run time, think again: it will just execute the machine code as it comes. At most it will do branch prediction so it can better distribute and use the incoming code, but nothing beyond that. The last thing we need is the CPU making logical interpretation mistakes.