Ratchet & Clank - up to 15 fps drops sometimes
#11
Even so, since he has a GTX 1060, OGL would be optimal, I think.

Defaults + MTVU + OGL HW + Mipmapping + 3x native + Basic blending + Partial CRC should be good, I think. Optionally, 16x anisotropic filtering and 4x MSAA should produce good visuals. If speed is good, you can try bumping to 4x native.

Edit: Forgot MSAA is broken in OGL. Well if you want it, use DX11.

If all else fails you can do an EE underclock, it seems to provide good gains in R&C. Although he shouldn't need it.
Gaming Rig: Intel i7 6700k @ 4.8Ghz | GTX 1070 TI | 32GB RAM | 960GB(480GB+480GB RAID0) SSD | 2x 1TB HDD


#12
Oh, I see what you mean. Yeah, Mipmapping does work in D3D, but Blending Accuracy doesn't. I know it's not needed for this game, but it was mentioned and could confuse them if they are in D3D mode; that was my point :)

But I think OpenGL generally works better. Not sure how much better in this game, but it's not a bad idea to use it as the default renderer :)

#13
Ratchet & Clank games use a very low internal resolution (lower than what PCSX2 reports, I think). To make it look good on my 1440p monitor, I use a custom 8192x4608 resolution. For your GTX 1060, you'd need something lower. I get around 40-50% GPU usage with my GTX 1080 using the DX11 hardware renderer, but in some situations with lots of projectiles and explosions it's a lot more than that (hitting a GPU limit).

I also use an EE overclock of 300% to avoid internal fps drops, although that produces external fps drops on my CPU. But I prefer dropping to 45-50fps with a speed reduction rather than running fullspeed at 30fps internally (G-Sync+ULMB keeps everything smooth and free of motion blur as long as it's at >=40fps).
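As a rough illustration of how large that custom resolution is relative to the PS2's native framebuffer, here is a small sketch. The 512x448 native size is an assumption (a commonly cited PS2 framebuffer size; the real value varies per game and per scene, which is likely why PCSX2's reported figure differs):

```python
# Estimate the effective upscale factor of a custom internal resolution
# relative to an ASSUMED ~512x448 PS2 framebuffer (varies per game).
NATIVE_W, NATIVE_H = 512, 448  # assumed native framebuffer size

def scale_factor(custom_w, custom_h, native_w=NATIVE_W, native_h=NATIVE_H):
    """Return the horizontal and vertical upscale multipliers."""
    return custom_w / native_w, custom_h / native_h

sx, sy = scale_factor(8192, 4608)
print(f"{sx:.1f}x horizontal, {sy:.1f}x vertical")  # 16.0x horizontal, 10.3x vertical
```

So an 8192x4608 custom resolution works out to roughly a 10-16x upscale under that assumption, which is why a weaker GPU needs something lower.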
CPU: AMD Ryzen 7 7800X3D
GPU: Nvidia GeForce RTX 2080 Ti GAMING X TRIO
Motherboard: ASUS ROG Strix X670E-F Gaming WiFi
RAM: Corsair Vengeance 6000MHz CL30 DDR5
Monitor: Asus PG278QR
OS: Windows 11
#14
(01-03-2017, 01:48 PM)refraction Wrote: ...  Blending Accuracy doesn't. I know it's not needed for this game ...

But I think generally OpenGL works better, not sure how much better in this game, but it's not a bad idea to use it as a default renderer :)

Blending accuracy is required to replace the game's convoluted in-game shadow computation (the same mess as in the Jak series) with a faster dedicated shader. However, it is still slow, as the shadow must be rendered triangle by triangle.
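A toy cost model (purely illustrative, not GSdx's actual code, and with made-up timing constants) of why per-triangle rendering stays slow: each triangle pays the fixed per-draw overhead, so total cost grows with triangle count instead of amortizing the overhead over one batched draw:

```python
# Toy cost model with HYPOTHETICAL constants, not real GSdx numbers.
DRAW_OVERHEAD_US = 10.0  # assumed fixed cost per draw call (microseconds)
PER_TRIANGLE_US = 0.5    # assumed raster/shade cost per triangle

def per_triangle_draws(n_tris):
    # One draw call per triangle: overhead is paid n_tris times.
    return n_tris * (DRAW_OVERHEAD_US + PER_TRIANGLE_US)

def batched_draw(n_tris):
    # One draw call for all triangles: overhead is paid once.
    return DRAW_OVERHEAD_US + n_tris * PER_TRIANGLE_US

print(per_triangle_draws(1000))  # 10500.0
print(batched_draw(1000))        # 510.0
```

Under these assumed numbers the per-triangle path is ~20x slower at 1000 triangles, which sketches why even a dedicated shader can't make triangle-by-triangle shadow rendering cheap.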


And MSAA isn't broken in OpenGL. It's unimplemented, because I don't want to bother with it ;)
#15
(01-03-2017, 02:11 PM)masterotaku Wrote: Ratchet & Clank games use a very low internal resolution (lower than what PCSX2 reports, I think). To make it look good on my 1440p monitor I use a custom 8192x4608 resolution.
Do you have Large Framebuffer checked? The game looks particularly blurry without it.

EDIT: might only apply to _x Native resolution modes
Windows 10 Home 64-bit | Intel Core i7 4770 @ 3.4GHz | MSI GeForce GTX 1070 Ti Titanium 8GB
Corsair Vengeance 16GB DDR3, 1866MHz @ 9-10-9-27 | 2TB Samsung 860 Evo | Samsung S23A700D
#16
GPU and CPU usage is about 30% when maintaining 60 fps; when under 60 fps, GPU usage is about 40% and CPU usage is about 60%.

Using BIOS USA v 2.0

Emulation Settings that I changed on build 1749:

GS Window:
- Aspect Ratio: Fit to Window/Screen
- All boxes checked except "Switch to 4:3 aspect ratio when an FMV plays"
(Disabled window resize border, Always hide mouse cursor, Hide window when paused, Default to fullscreen mode on open, Double-click toggles fullscreen mode)

Speedhacks:
- EE Cyclerate is 3, increased to 300%
- MTVU option checked

GSdx Plugin Settings:
Adapter: Nvidia GeForce GTX 1060 6 GB
Renderer: OpenGL (Hardware)
Interlacing (F5): Auto
Allow 8-Bit Textures: Yes
Large Framebuffer: Yes
Internal Resolution: Custom (1920 x 1080)
Texture Filtering: Bilinear (Forced)
Anisotropic Filtering: Off
Mipmapping (Ins): Basic (Fast)
CRC Hack Level: Partial (OpenGL Recommended)
Enable HW Hacks: No
Accurate Date: No
Blending Unit Accuracy: Basic (Recommended low-end PC)
#17
Having the Internal Resolution set to "Custom" can often be a performance killer; plus it's actually not 1080p that's being rendered ;)

Your GPU should have no problem with 3-4x native resolution, seeing as I'm running with a 1070 :)
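To sketch why a custom 1920x1080 setting isn't really "1080p" here: Nx-native modes scale an assumed ~512x448 PS2 framebuffer (the real size varies per game) by clean integer factors, while 1920x1080 is not an integer multiple of it on either axis:

```python
# Compare Nx-native multiples of an ASSUMED ~512x448 PS2 framebuffer
# (real size varies per game) against a custom 1920x1080 setting.
NATIVE_W, NATIVE_H = 512, 448  # assumed native framebuffer size

def nx_native(n):
    """Output resolution of an Nx-native mode."""
    return NATIVE_W * n, NATIVE_H * n

print(nx_native(3))  # (1536, 1344)
print(nx_native(4))  # (2048, 1792)

# 1920x1080 divides out to 3.75x horizontally and ~2.41x vertically:
# an uneven, non-integer scale on each axis rather than true "1080p".
print(1920 / NATIVE_W, 1080 / NATIVE_H)
```

Under that assumption, 3x native already lands between 1080p and 1440p in pixel count, which is why an even multiplier is usually recommended over a custom size.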
#18
I changed it to 4x, and my GPU and CPU load went up a bit, but not too much (GPU 50%, CPU 60%). The problem is the fps dropping below 60 in the exact same situations.

This area gives me consistent >10 fps drops
http://vignette2.wikia.nocookie.net/ratc...0524215340
#19
The GSdx Plugin Settings don't have any effect on where I get drops; they only affect the GPU and CPU load a bit, not too much. As for fps, those settings have a very minimal impact, maybe a 1-2 fps change at most.

CK1 Wrote: Having the Internal Resolution set to "Custom" can often be a performance killer; plus it's actually not 1080p that's being rendered ;)

Your GPU should have no problem with 3-4x native resolution, seeing as I'm running with a 1070 :)


Have you been able to run this game? What settings are you using?
#20
(01-03-2017, 09:11 PM)Qualia Space Wrote: The GSdx Plugin Settings don't have any effect on where I get drops; they only affect the GPU and CPU load a bit, not too much. As for fps, those settings have a very minimal impact, maybe a 1-2 fps change at most.

CK1 Wrote: Having the Internal Resolution set to "Custom" can often be a performance killer; plus it's actually not 1080p that's being rendered ;)

Your GPU should have no problem with 3-4x native resolution, seeing as I'm running with a 1070 :)


Have you been able to run this game? What settings are you using?

I only have the 2nd and 3rd games, but maybe what you are seeing is the game performance dropping internally even if your fps is 60. The engine is pretty much the same throughout the R&C PS2 era.

I use OGL HW, 4x native resolution, Bilinear (Forced), Accurate Date, Blending Accuracy High, and 8-bit textures enabled. However, even with my 1070 8GB and a 6600K OC'd to 4.7GHz, I still get performance drops in the first level of R&C 2, for instance. FYI, my CPU usage ranges from 25-40% and my GPU from 25-100%.



