GSdx future discussion
#41
(03-06-2016, 11:06 PM)gregory Wrote: First topic: Drop old renderer

I would like to create a legacy GSdx plugin, more or less the current state of GSdx. It would remain available for DX9 (very old GPUs on Windows & Mac OS X) and GL3.3 (for somewhat less old GPUs on Linux).

The future GSdx will simply drop DX9 and limited GL3.3 drivers (various GL4 features work on GL3-class hardware, even GL2-class). This means that on Windows, DX10/GL3 GPUs will only have access to the DX10/11 renderer. On Linux, DX10/GL3 GPUs will be supported through the free drivers.

In order to keep the number of builds manageable, I propose to keep only SSE2 for it (if you have a recent CPU, it's time to upgrade the GPU).

The next step, in the more distant future, will be to introduce pure HW GL4 features (i.e. features that require a DX11/GL4-class GPU).

I don't think there will be any issues with this. Few people still have hardware that old, and they generally do not expect support for it. If anything, it is surprising PCSX2 has retained DX9 support for so long.


gregory Wrote: Second topic: Support integral texture coordinates
The hardware GPU filtering unit uses float (or normalized integer) values as texture coordinates. It is fast, but it suffers from a couple of limitations. The idea is to use pure integers (i.e. not normalized). This would help to
* Fix texture sampling in some games such as Lupin 3rd
* Reduce memory usage (allows using a subset of a texture instead of a full 1024x1024 texture)
* Reduce texture cache downscaling
* Future: bypass the texture cache in some special draws (you know the top-left corner bug)
* Make sampler management much easier (i.e. no more samplers, actually)

But there is a catch: the HW unit only supports floats, so integer filtering would be done in SW (read: in the shader).
* Bad for performance. Increases memory bandwidth requirements (in the case of linear filtering) and ALU load (which might be hidden by the memory accesses)
* No more anisotropic filtering
* No more custom resolution (well, maybe it can be partially emulated with a float factor)
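To make the trade-off above concrete, here is a rough sketch of what "filtering done in SW" means, in Python standing in for shader code. Everything here (the 2x2 texture, the helper names) is illustrative, not actual GSdx code: with unnormalized integer coordinates the shader must fetch the four neighbouring texels itself and blend them, instead of asking the hardware sampler for one filtered value, which is where the extra memory bandwidth and ALU work come from.

```python
import math

# Sketch of bilinear filtering done "in SW" (i.e. in the shader) with
# unnormalized integer texel coordinates. Illustrative only, not GSdx code.

def texel_fetch(tex, x, y):
    """Integer texel fetch with clamp-to-edge, like GLSL texelFetch plus a manual clamp."""
    h, w = len(tex), len(tex[0])
    return tex[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

def bilinear_sw(tex, u, v):
    """Four memory reads + an ALU blend per sample; u, v are texel coordinates (not 0..1)."""
    x0 = math.floor(u - 0.5)
    y0 = math.floor(v - 0.5)
    fx = (u - 0.5) - x0          # fractional weights between texel centres
    fy = (v - 0.5) - y0
    t00 = texel_fetch(tex, x0,     y0)
    t10 = texel_fetch(tex, x0 + 1, y0)
    t01 = texel_fetch(tex, x0,     y0 + 1)
    t11 = texel_fetch(tex, x0 + 1, y0 + 1)
    top = t00 * (1 - fx) + t10 * fx
    bot = t01 * (1 - fx) + t11 * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(bilinear_sw(tex, 1.0, 1.0))  # exact centre of the 2x2 block -> 0.5
```

A hardware sampler does all of this in a single filtered `texture()` lookup; doing it by hand in the shader is what multiplies the memory traffic mentioned above.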

If this is required in order to fully emulate the GS, my suggestion would be a fork: one branch focusing on picture quality and performance, the other on accuracy and compatibility. From a preservationist perspective the latter would be preferred, as it would be by users wishing to play a game affected by the relevant bugs. But most users return for the gameplay and/or story and would rather the graphics looked as good as possible, as evidenced by the number of posts asking about performance issues and requesting things like widescreen hacks.

The main issue would be the increased workload from duplicating work, a serious problem for a project that already has fewer developers than it needs. This could be somewhat ameliorated by each branch supporting only some of the renderers, say focusing on software mode for the accuracy branch.
#42
(03-08-2016, 12:17 AM)refraction Wrote: I think xBRZ is more desired than that, to be honest.

I presume you mean using xBRZ for texture upscaling. That would seem an inappropriate request in a topic that is about sacrificing quality for compatibility, as it does the opposite.

Also, it would not be a good idea to use upscaling without a good filtering solution. Once there are fewer pixels than texels, you can get blinking artefacts unless multiple texels are sampled.
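A toy sketch of that blinking problem, in Python with an assumed 1-D checkerboard "texture" (illustrative only): when one output pixel covers several texels, point sampling picks a single texel, so a sub-texel shift of the geometry flips which texel wins, while averaging the covered texels stays stable.

```python
# High-frequency 1-D "texture": alternating 0/1 texels.
texels = [0, 1] * 8

def point_sample(offset, step=2):
    # One texel per output pixel: pixel i reads texel offset + i*step.
    return [texels[offset + i * step] for i in range(4)]

def averaged_sample(offset, step=2):
    # Average all 'step' texels that each output pixel covers.
    return [sum(texels[offset + i * step : offset + i * step + step]) / step
            for i in range(4)]

print(point_sample(0))     # [0, 0, 0, 0]
print(point_sample(1))     # [1, 1, 1, 1]  <- a 1-texel shift flips every pixel
print(averaged_sample(0))  # [0.5, 0.5, 0.5, 0.5]
print(averaged_sample(1))  # [0.5, 0.5, 0.5, 0.5]  <- stable under the shift
```

In motion, that flip between all-0 and all-1 frames is exactly the blinking described above.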
#43
(03-14-2016, 01:54 PM)artifactual Wrote: I presume you mean using xBRZ for texture upscaling. That would seem an inappropriate request in a topic that is about sacrificing quality for compatibility, as it does the opposite.

Yes, that is what I meant. It isn't completely inappropriate: if it were possible with the new system, it could be implemented as a method of texture filtering. PS2 textures are inherently horrid, so standard filtering methods generally just look like a blurry mess, and some people do like xBRZ. The compatibility comes from using integers, which removes the option of anisotropic filtering, but that doesn't shut the door on everything. Only if we took your suggestion of forking into a performance-hack-filled version and a compatible, accurate version (which wasn't the original topic) would this change defeat the point of having any third-party hacks alongside it. But that wasn't the original topic, so it is appropriate enough.
#44
The speed impact is mostly on the GPU. So it would require a bigger GPU if users want big upscaling (4x and more).

I don't think we're sacrificing quality, only GPU resources. I think it will still be possible to compute the derivatives of the sample coordinates and adapt the filtering accordingly.
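As a sketch of that derivative idea (Python, with assumed illustrative numbers; this follows the standard mip-LOD selection formula, not any actual GSdx code): the screen-space derivatives of the texture coordinate, which GLSL exposes as dFdx/dFdy, tell the shader how many texels one pixel covers, and the filter footprint can be widened accordingly.

```python
import math

# Standard mip-LOD style selection from texture-coordinate derivatives
# (illustrative sketch, not GSdx code).

def lod_from_derivatives(dudx, dvdx, dudy, dvdy):
    """log2 of the larger texel footprint along screen x and y.

    <= 0 means magnification (a 2x2 bilinear tap is enough);
    larger values mean each pixel covers roughly 2^lod texels per axis.
    """
    rho = max(math.hypot(dudx, dvdx), math.hypot(dudy, dvdy))
    return math.log2(rho)

# 1 texel per pixel: plain bilinear is fine.
print(lod_from_derivatives(1.0, 0.0, 0.0, 1.0))  # 0.0
# Heavy minification (4 texels per pixel): a wider average is needed.
print(lod_from_derivatives(4.0, 0.0, 0.0, 4.0))  # 2.0
```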

It depends on what people want to do:
1/ Use the biggest upscaling factor possible, even if it means part of the game isn't rendered.
2/ Limit the upscaling factor, and keep the full scene rendered correctly.

Personally, I'm in favor of 2 because GPUs are still progressing every year, and the future is known: HBM will increase memory bandwidth, which will unlock new GPU capabilities. Besides, new GPUs now support fully programmable blending.
#45
(03-14-2016, 12:14 PM)gregory Wrote: Yes, I know. But you see it in the wrong way. The main goal is to drop support for OpenGL drivers that are no longer supported by their vendors. I put in several alternate paths and emulation layers for them. The current situation is
1/ GL is better than DX because of the latest features, which aren't supported on old drivers.

The reason I see it in the "wrong way" is likely because my use-case is very different from yours.

For some of us, the primary priority is a graphics plugin that can render the game with the highest GPU efficiency possible, to achieve 100% speed with as few noticeable graphical disturbances as possible under the constraints of that efficiency. For Dolphin, the Ishiiruka fork exists to do this, but all PCSX2 has in this vein is the DX9 renderer (for the games which are not too buggy to play on it).

The major point of DX9 is not supporting old discrete GPUs, which, as you correctly pointed out earlier in this thread, probably 1% of the population still use, but rather supporting awful modern iGPUs which cannot handle the GPU speed requirements of higher-end renderers at even 2x resolution. Surprisingly, this includes iGPUs from as recently as 2014. Those old discrete GPUs are rare in the wild now, but iGPUs are plentiful.

(03-14-2016, 12:14 PM)gregory Wrote: So the idea was to create a legacy GSdx for OpenGL. So why not put DX9 in the same boat?

The thing I don't get here is what that essentially entails for DX9 compared to the situation now. DX9 hasn't been updated in a long time as far as I know, so it is already essentially there just for legacy support. The major thing people who still use it would want is assurance that it can still be used with newer builds of PCSX2, so that when bugs in other areas of the emulator (not related to rendering) are fixed, people using it can still take advantage of at least those fixes, even if graphical bugs will never be fixed on DX9. That is, as opposed to completely dropping DX9, where people who have to use it would get no fixes in core emulation, audio, etc., and might even have to maintain multiple installs, because some games run fine on DX11 or OGL while others that are more demanding graphically need DX9.
#46
By "wrong way", I mean that we move old GL support into a separate legacy plugin. If we create a legacy plugin, I think we can put DX9 into it too. The previous git discussion was about the complete removal of DX9; this new proposal is far from that.

Quote: The thing I don't get here is what that essentially entails for DX9 compared to the situation now. DX9 hasn't been updated in a long time as far as I know, so it is already essentially there just for legacy support.
Before, you had one plugin with two options. Now you have two plugins with one option each. Nothing really changes for you: you can still use DX9. Actually, you get the guarantee that DX9 will still work when we change the GSdx code. So I'm asking the same as you: what is the impact on you? You won't have multiple installs, just an additional plugin choice, and you will still get the bugfixes in core/audio. It will just be a bit different to switch the renderer.
#47
(03-14-2016, 03:30 PM)gregory Wrote: By "wrong way", I mean that we move old GL support into a separate legacy plugin. If we create a legacy plugin, I think we can put DX9 into it too. The previous git discussion was about the complete removal of DX9; this new proposal is far from that.

Before, you had one plugin with two options. Now you have two plugins with one option each. Nothing really changes for you: you can still use DX9. Actually, you get the guarantee that DX9 will still work when we change the GSdx code. So I'm asking the same as you: what is the impact on you? You won't have multiple installs, just an additional plugin choice, and you will still get the bugfixes in core/audio. It will just be a bit different to switch the renderer.

If I understand you correctly, it doesn't seem like too much of a problem. The opening post sounded more along the lines of the GitHub discussion. The "I propose to keep only SSE2 for it" [the legacy plugin] does seem a bit of an issue for getting the maximum CPU-side performance, though I don't know the exact impact of SSE2 versus a more modern instruction set.
#48
The SSE2 stuff is about keeping the number of "default" builds down. The other instruction-set versions would still be available as self-builds.

Question: what do you mean by a slow iGPU?
#49
I don't see how moving DX9 to the legacy plugin could upset anyone. Or at least it shouldn't.

The fact is our DX9 backend isn't being worked on. And it isn't gonna be worked on. It's not gonna change. Whether it stays in mainline or gets pushed to legacy, the DX9 backend will be EXACTLY the same. So yeah, I don't see how it could be an issue really. There's nothing to lose and a good bit of code simplicity to gain.

As for SSE2 only: if we decide to go the route of the legacy plugin (and it seems like we will), it will be quite easy for someone to build all the versions (SSE2, SSSE3, SSE4.1, AVX, AVX2) and post them on the forum or the main PCSX2 site or whatnot. They will be available; at the very least I will see to it.
#50
The way I see it, if you use the advanced plugin, you could have a build available from the buildbot for each new commit; this way people test new stuff. We will keep building the legacy plugin so users can easily use it. But I don't want to spend tons of the limited buildbot resources building various versions of the legacy plugin, which will mostly be in pure maintenance mode (i.e. no new features).