GSdx: run-time internal resolution modification?
#1
Has anyone ever looked at changing the internal resolution at runtime (somewhat smoothly)? I have a crazy plan to switch between 1x and 4x in FFX when in the menu vs. the rest of the game, to avoid the bad text in the menu.

I started looking at the relevant code today, concentrating on upscaling_multiplier in GSRendererHW. Attempts so far:

1. Hooked in from GlobalCommands and added a GS API that does setConfig("upscaling_multiplier", x). This 'works', in that I can restart the plugin and get a new upscale, but there's a jarring sound break as the plugin resets (as well as a black-screen flicker, but that's not so bad; the sound is far worse). So that's not great. The basic code change was like:

// Pause the core thread and save/shut down just the GS plugin.
ScopedCoreThreadPause paused_core(new SysExecEvent_SaveSinglePlugin(PluginId_GS));
GSsetUpscaleMultiplier(new_upscale); /* Added this API */
paused_core.AllowResume(); // The plugin restarts here and picks up the new multiplier
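
For reference, the plugin-side half of that change is roughly the following (a sketch, not stock code: GSsetUpscaleMultiplier is an export I added myself, and it leans on GSdx's existing theApp.SetConfig helper and the "upscaling_multiplier" ini key):

// Sketch of the added export (my own addition, not a stock GS plugin API).
EXPORT_C GSsetUpscaleMultiplier(int multiplier)
{
	// Persist the new value; GSRendererHW re-reads "upscaling_multiplier"
	// from the config when the plugin is reinitialised after the pause.
	theApp.SetConfig("upscaling_multiplier", multiplier);
}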


2. Tried just doing it in GSRendererHW, but couldn't work out what minimal state needed to be reset to achieve the goal:

i. Just changing m_upscale_multiplier didn't work.
ii. Changing m_upscale_multiplier and calling ::Reset() worked but was slow (slower than reloading the entire plugin, it felt!).
iii. Changing m_upscale_multiplier and forcing ::SetScaling() to trigger also 'worked', but led to crazy graphical glitching (when going back to 1x native resolution, the screen would zoom in on one small portion).


It's possible #2 has promise, but I need to dig deeper into the GSRenderer* code paths to understand why simply changing m_upscale_multiplier didn't work on the next Draw() call.
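
To make attempt iii concrete, the hack is shaped roughly like this (member names are from GSdx's GSRendererHW; m_force_rescale is a flag I added for the hack, not upstream code):

// Called from my hotkey handler; takes effect on the next Vsync.
void GSRendererHW::RequestUpscale(int multiplier)
{
	m_upscale_multiplier = multiplier;
	// Checked inside SetScaling() so it skips its early exit once and
	// actually recomputes the render-target size.
	m_force_rescale = true;
}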

#2
What are you using as a trigger for changing the internal resolution? Some kind of hotkey? In that case, I rather doubt it would get merged to the master branch.

There are already lots of hotkeys set for various core/plugin functions.

Quote: Changing m_upscale_multiplier and forcing ::SetScaling() to trigger also 'worked', but led to crazy graphical glitching (when going back to 1x native resolution, the screen would zoom in on one small portion).

SetScaling() is called at every Vsync event; I'm not sure why you need to force/explicitly call it?
#3
Instead of doing all this, why don't you try fixing the bad text issue instead? :D
#4
> What are you using as a trigger for changing the internal resolution? Some kind of hotkey? In that case, I rather doubt it would get merged to the master branch.

Yes, I never intend to get this merged, just hacking on my end :).

> SetScaling() is called at every Vsync event; I'm not sure why you need to force/explicitly call it?

By 'force', I mean I put some logic in to avoid the early exit in SetScaling. As far as I can tell from adding some logs, almost every call to SetScaling meets the early-exit criteria (m_upscale_multiplier <= 1 || good_rt_size) and doesn't actually update anything, even after changing m_upscale_multiplier. Hacking around the early exit causes the resolution to visibly change, but also occasionally causes strange graphical glitching. I can grab a video if anyone cares to see it.
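
The bypass itself is shaped roughly like this (the condition is the one quoted above; the surrounding structure paraphrases GSRendererHW::SetScaling, and m_force_rescale is the flag my hack sets when the hotkey fires):

void GSRendererHW::SetScaling()
{
	// ... compute good_rt_size from the current display rect ...

	// Stock early exit, bypassed once when a multiplier change is pending.
	if (!m_force_rescale && (m_upscale_multiplier <= 1 || good_rt_size))
		return;
	m_force_rescale = false;

	// ... recompute the render-target size from m_upscale_multiplier ...
}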

> Instead of doing all this, why don't you try fixing the bad text issue instead?

FFX is one of the most popular games emulated by PCSX2, and I think it's the one originally used to develop it. I assumed that if fixing the broken textures at higher resolutions were possible, it would have been done by now. The wiki page (http://wiki.pcsx2.net/index.php/Final_Fantasy_X, "Broken Textures") and general searching indicate that nobody seems to think it's possible to fix at higher resolutions (all the advice is 'put up with it').
#5
Quote: FFX is one of the most popular games emulated by PCSX2, and I think it's the one originally used to develop it. I assumed that if fixing the broken textures at higher resolutions were possible, it would have been done by now. The wiki page (http://wiki.pcsx2.net/index.php/Final_Fantasy_X, "Broken Textures") and general searching indicate that nobody seems to think it's possible to fix at higher resolutions (all the advice is 'put up with it').
That reasoning is wrong. It's true there isn't a clean fix, but maybe some hack could be done.

I don't remember the root cause exactly. There is a high probability that FFX uses triangles to render sprites (2 triangles == 1 rectangle). It would be costly but potentially doable to fix up the texture coordinates a bit (the same way it is done for sprites; I don't remember the hack's name). However, it could also break the rendering.
#6
Quote: That reasoning is wrong. It's true there isn't a clean fix, but maybe some hack could be done.

My apologies for the assumption then.


Quote: I don't remember the root cause exactly. There is a high probability that FFX uses triangles to render sprites (2 triangles == 1 rectangle). It would be costly but potentially doable to fix up the texture coordinates a bit (the same way it is done for sprites; I don't remember the hack's name). However, it could also break the rendering.

Yes, it was the breaking of the rendering that I was concerned about. I'll take a look, but isolating the specific digit texture map and applying an offset for only that is rather out of my depth. But one can only try :).
#7
Check "RoundSpriteOffset" function.

You can get the primitive info there with "m_vt.m_primclass". You can check the Z behavior of the rendering with "m_vt.m_eq", or with "m_vt.m_min" and "m_vt.m_max" (the idea is to detect the rendering of triangles lying in a plane).
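
A rough sketch of that detection (field names are from GSdx's GSVertexTrace; the exact call site and any extra conditions are up to you):

// Heuristic: a triangle draw whose vertices all share one Z value is
// flat, sprite-like geometry (e.g. FFX's menu digits).
bool GSRendererHW::IsFlatTriangleDraw() const
{
	if (m_vt.m_primclass != GS_TRIANGLE_CLASS)
		return false;

	// m_eq.z is set when every vertex has the same Z, i.e. the
	// triangles lie in a single plane.
	return m_vt.m_eq.z != 0;
}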