Rogue Galaxy not getting as much performance as I should?
#1
You can see my system in the signature.

I'm trying to run it at 1920x1080 at 3x native scaling with no other additions, and I get about 55-75% speed outside of menus (which are full speed). Increasing the speedhacks preset doesn't really speed it up at all.

However, what I don't understand is that EE is at 22%, GS at 25%, VU at 15%, and UI at around 0%.

Why am I getting such low utilization if the fps is so poor? What here is the bottleneck keeping me from getting full speed?
CPU: i5 3570k @ 4.2 | RAM: 8GB DDR3 | GPU: 1GB Radeon HD5550 @ Stock
Running: 1.1.0.5764
#2
Your graphics card can't handle 3x native resolution. Try turning the scaling down and see if you get an FPS increase.
CPU: SandyBridge-E 3970X @ 5 Ghz GPU: Quad sli GTX 690's
MOBO: EVGA Classified SRX STORAGE: 8x Samsung 840 Pro 512 GB (4 TB) RAID 0
RAM: CORSAIR Dominator Platinum 96GB 2133 Mhz OS: Windows 8 Pro 64 Bit
COOLER: Phase Cooling OCZ Cryo-Z (Temps: -37 C) PSU: EVGA SuperNova Nex1500 Classified Biggrin



#3
Yeah, I just learned this; apparently the 4890 isn't strong enough for this game. I have to drop the scaling to 2x, but then it's all blurry.

It's odd, because other games run OK at 3x and even 4x.
CPU: i5 3570k @ 4.2 | RAM: 8GB DDR3 | GPU: 1GB Radeon HD5550 @ Stock
Running: 1.1.0.5764
#4
I have another issue (sorry for the double post): in the flashback sequence, the textures were just a mixture of black and beige shapes, and the whole scene was bugged (though the game progressed fine). How do I fix that in the future?
CPU: i5 3570k @ 4.2 | RAM: 8GB DDR3 | GPU: 1GB Radeon HD5550 @ Stock
Running: 1.1.0.5764
#5
Press F9 and run it in software mode. Apparently Dragon Quest 8 has this problem too.
CPU: SandyBridge-E 3970X @ 5 Ghz GPU: Quad sli GTX 690's
MOBO: EVGA Classified SRX STORAGE: 8x Samsung 840 Pro 512 GB (4 TB) RAID 0
RAM: CORSAIR Dominator Platinum 96GB 2133 Mhz OS: Windows 8 Pro 64 Bit
COOLER: Phase Cooling OCZ Cryo-Z (Temps: -37 C) PSU: EVGA SuperNova Nex1500 Classified Biggrin



#6
The 4890 is a monster of a card for its time and still powerful despite not having the newest DirectX support, so it's hardly the problem at 3x upscale. The CPU is fine too. The thing is, this game really is demanding, and a bit of EE cyclerate and/or VU cycle stealing may help.

About the flashback cutscenes, that's right. I remember that skipdraw removed the offending layer (I'm not sure of the exact value, but I believe it's less than 6). F9 (software mode) may work as well, if it doesn't slow the game to a crawl.

In any case, if you use skipdraw, remember to set it back to 0 after watching the flashback, since it affects normal play (mainly the subtitles, IIRC).
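For reference, here's a sketch of where those knobs lived in the 1.x-era ini files. The key names are from memory and the skipdraw value shown is just a placeholder, so treat all of it as an assumption and check your own config:

```ini
; PCSX2_vm.ini, [Speedhacks] section (key names as I recall them for 1.x builds)
[Speedhacks]
EECycleRate=1        ; 0-2: higher values underclock the emulated EE
VUCycleSteal=1       ; 0-3: steals VU cycles to lighten the EE's load

; GSdx.ini, skipdraw hack (set back to 0 after the flashback)
UserHacks=1
UserHacks_SkipDraw=3 ; placeholder value; the post above only says "less than 6"
```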
Imagination is where we are truly real
#7
The GPU usage is constantly at max load while playing this game. Increasing EE cyclerate and VU cycle stealing did zippo.
CPU: i5 3570k @ 4.2 | RAM: 8GB DDR3 | GPU: 1GB Radeon HD5550 @ Stock
Running: 1.1.0.5764
#8
Rogue Galaxy is one of the more GPU-demanding games. I know my 9800 GT couldn't handle DQ8, Rogue Galaxy, or similar-looking games at 1080p while keeping 60 fps at all; as soon as I switched to a GTS 450, those games ran fine at 1080p for the most part, on an i7 920 at stock speeds.

Pretty sure it's the GPU, especially if you're saying it's at full usage while the game runs like that. The card might be fine for PC games, but emulating PS2 games, especially the heavy ones, is a whole other thing.
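The rule of thumb here can be sketched as a toy heuristic (a hypothetical function, not part of PCSX2): if none of the emulator's worker threads are saturated but speed is still low, the wait is probably happening outside those counters, i.e. on the GPU.

```python
# Toy heuristic (not part of PCSX2): the EE/GS/VU percentages are CPU-side
# thread loads. If they are all low while the game still runs below full
# speed, the stall is likely outside those counters -- typically the GPU
# doing the upscaled rendering.

def likely_bottleneck(thread_load, speed_percent):
    """thread_load: dict of thread name -> utilization in percent.
    speed_percent: emulation speed relative to full speed (100 = full speed)."""
    if speed_percent >= 95:
        return "none"
    # Any saturated worker thread is the obvious suspect.
    saturated = [name for name, load in thread_load.items() if load >= 90]
    if saturated:
        return "CPU thread: " + ", ".join(saturated)
    # Low CPU-side load plus low speed: the wait is happening elsewhere.
    return "GPU (or other external wait)"

# The figures from the opening post:
print(likely_bottleneck({"EE": 22, "GS": 25, "VU": 15}, speed_percent=65))
# -> GPU (or other external wait)
```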
#9
Yeah, I'd like to upgrade to a 4 GB GTX 670, but alas, I am poor. I'll just deal with the blurry IQ for this game.
CPU: i5 3570k @ 4.2 | RAM: 8GB DDR3 | GPU: 1GB Radeon HD5550 @ Stock
Running: 1.1.0.5764
#10
Worth a try: set Windows to the High Performance power plan. I'm pretty sure the GPU is not the problem here, and even less so the CPU; the low load figures point to that.
Edit: What I mean is that Windows and Intel have trouble gauging the actual load on the system, so they reduce the CPU clock, which makes the system look even more idle, since the EE produces less output to feed the GS... rinse, repeat.
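That feedback loop could be sketched as a toy simulation. The numbers are purely illustrative assumptions; this is not how the Windows governor is actually implemented:

```python
# Toy model (illustrative numbers only) of the downclock feedback loop:
# the governor sees low load -> lowers the clock -> the EE produces less
# work for the GS -> the measured load drops even further.

def simulate(steps=5):
    clock = 1.0          # CPU clock as a fraction of maximum
    load = 0.5           # apparent load the governor measures
    history = []
    for _ in range(steps):
        # Governor: low apparent load means it feels safe to downclock.
        if load < 0.6:
            clock = max(0.4, clock - 0.15)
        # Lower clock -> less EE output -> the load meter reads lower still.
        load = 0.5 * clock
        history.append((round(clock, 2), round(load, 2)))
    return history

for clock, load in simulate():
    print(f"clock={clock:.2f} load={load:.2f}")
```

Each iteration the clock drops until it hits the floor, and the apparent load falls with it, which is exactly why a fixed High Performance plan breaks the spiral.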

On the other hand, if the GPU really is the problem, why not simply reduce the upscale?
Imagination is where we are truly real