CPU Benchmarking - Quad vs. Dual in various RPGs
#31
[Attached screenshots: benchmark results at 2600 MHz and 3800 MHz]

...and here we go, just upgraded from my P35-DS3 + e2160 to an EP45-UD3LR + e7400! Finally, let's see how the new CPU compares at the same settings/GPU (in fact, I didn't even reinstall the OS. I booted my Windows Server 2008, it found and reinitialized the new hardware without giving a crap! I mean, I changed to a new motherboard with a different chipset!).

Results: -first pic [2600 MHz] >> wow, nearly a 10 fps boost!! And nearly the same as the author's results with an e8400 and Nvidia GTX 260 - just about 1-1.5 fps lower. So CPU vs CPU DOES matter, at least e2160 vs e7400.
I got nearly the same fps on my Radeon 3870 as the author on his GTX 260, at least for this particular game and frequency.

-second pic [3800 MHz] >> this was my max stable/safe overclock at the moment. Unfortunately, it's somehow lagging behind his e8400 again - quite a lot, about 10 fps! It still gained about 15 fps over 2600 MHz! That's nice scaling, but how could he get 72+ fps, damn it... Part of it could be the FSB - I was at 400x9.5, he probably needed 422x9 - but still, that doesn't feel like that much more. I wonder if the author really benched at the same settings/location.

By the way, I tried SSE4.1 as well; I actually got about 1 fps lower, which could be related to something else. In other words, SSE4.1 made no difference at all for me in this game.

So my conclusion: at one frequency (2.6 GHz) my e7400 + Radeon 3870 matched the e8400 + GTX 260; at the same clock, e7400 vs e2160, the e7400 gave me a BIG gain - a whole 10 fps; and at higher clocks like 3.8 GHz+, there seems to be quite a big difference between the e7400 and the e8400 for some reason. Because the fps was about the same at 2.6 GHz, I don't think it's the GPU's fault. My theory is that either at higher clocks the extra L2 cache actually starts to make a big difference, or that because my e7400's stock FSB is 1066 and the e8400's is 1333, the lower-FSB CPU has a harder time communicating with the northbridge when overclocked and needs a lot of "retransmissions", kind of like bad WiFi crap. Or I'm just thinking too much today and should actually go sleep, damn it!
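
For reference, here's a quick sketch of the clock math (just me double-checking the numbers above in Python; the author's exact e8400 setting is my guess):

```python
# Quick sanity check of the clocks discussed above. Numbers come from
# the post; the e8400's 422x9 setting is an assumption, not confirmed.

def core_clock_mhz(fsb_mhz, multiplier):
    """Core clock = FSB base clock x multiplier."""
    return fsb_mhz * multiplier

def rated_fsb_mhz(fsb_mhz):
    """Intel FSBs are quad-pumped: rated speed = 4 x base clock."""
    return fsb_mhz * 4

print(core_clock_mhz(400, 9.5))  # 3800.0 - my e7400 at 400x9.5
print(core_clock_mhz(422, 9))    # 3798   - an e8400 at 422x9 (assumed)
print(rated_fsb_mhz(400))        # 1600   - my effective FSB (stock 1066)
print(rated_fsb_mhz(422))        # 1688   - his effective FSB (stock 1333)

# Both overclocks land the effective FSB within ~5% of each other,
# which matches the "doesn't feel that much more" hunch above.
```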


#32
(02-20-2009, 01:51 AM)elite Wrote: -second pic [3800 MHz] >> this was my max stable/safe overclock at the moment. Unfortunately, it's somehow lagging behind his e8400 again - quite a lot, about 10 fps! It still gained about 15 fps over 2600 MHz! That's nice scaling, but how could he get 72+ fps, damn it... Part of it could be the FSB - I was at 400x9.5, he probably needed 422x9 - but still, that doesn't feel like that much more. I wonder if the author really benched at the same settings/location.

You are GPU-limited. That is, you hit the limit of your GPU on this particular game/scene. The author's GPU is a good bit faster, and thus he realized larger % speed increases at the higher FPS; also, some games are more GPU-limited than others.

At lower FPS the GPU speed difference was irrelevant, since it was the CPU emulation that was holding up both your system and the author's (the GPU happily idled on one core while the CPU struggled along on the other core). Speed up the CPU enough and then the GPU begins to struggle, forcing the CPU core to idle along while the GPU finishes its tasks. Thus increasing MHz on the CPU beyond the point of the GPU's ability results in smaller and smaller overall speed gains.
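
To make that concrete, here's a toy model (my own sketch with made-up frame times, not anything from PCSX2 itself): the frame rate is set by whichever of the two stages takes longer per frame.

```python
# Toy two-stage bottleneck model (illustrative numbers only).
# One core runs the CPU emulation, the other feeds the GPU; the
# slower stage sets the frame time.

def fps(cpu_ms, gpu_ms):
    # Frame time is dominated by whichever unit takes longer.
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 18.0  # hypothetical GPU cost per frame (~55 fps ceiling)

for ghz in (2.6, 3.0, 3.4, 3.8):
    # Assume CPU frame time scales inversely with clock from a
    # made-up 25 ms baseline at 2.6 GHz.
    cpu_ms = 25.0 * 2.6 / ghz
    print(f"{ghz} GHz -> {fps(cpu_ms, GPU_MS):.1f} fps")

# 2.6 GHz -> 40.0, 3.0 -> 46.2, 3.4 -> 52.3, 3.8 -> 55.6:
# gains shrink and then flatten once cpu_ms drops below GPU_MS.
```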
Jake Stine (Air) - Programmer - PCSX2 Dev Team
#33
(02-23-2009, 01:04 PM)Air Wrote: You are GPU-limited. That is, you hit the limit of your GPU on this particular game/scene. The author's GPU is a good bit faster, and thus he realized larger % speed increases at the higher FPS; also, some games are more GPU-limited than others.

At lower FPS the GPU speed difference was irrelevant, since it was the CPU emulation that was holding up both your system and the author's (the GPU happily idled on one core while the CPU struggled along on the other core). Speed up the CPU enough and then the GPU begins to struggle, forcing the CPU core to idle along while the GPU finishes its tasks. Thus increasing MHz on the CPU beyond the point of the GPU's ability results in smaller and smaller overall speed gains.

Thanks for explaining. It's interesting to think that even a Radeon 3870 could be the bottleneck. Then again, this game at this resolution looks beautiful, has clean, nice graphics, etc., and is probably shader-heavy as well. It actually looks nicer to me than some recent PC games.

I wonder how big the difference would be - say, an 8800GT vs this GTX 260, and my 3870 vs an 8800GT.
#34
Tut tut, not even benching on an i7.

WTB support for 8 cores. ;)
#35
(02-23-2009, 08:13 PM)elite Wrote: Thanks for explaining. It's interesting to think that even a Radeon 3870 could be the bottleneck. Then again, this game at this resolution looks beautiful, has clean, nice graphics, etc., and is probably shader-heavy as well. It actually looks nicer to me than some recent PC games.

It's texture-heavy. ATI cards have a slow texture upload path, at least in the way GSdx uses it. So very texture-heavy games take a pretty big speed hit on ATIs compared to similar (or even slower) NVIDIAs. My own ATI, which benchmarks about the same as a GeForce 8600 in PC games, scores 15 fps in FFXII. Just how bad is that GPU bottleneck? I get 60-70 fps in GSdx's SOFTWARE MODE. -_-
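
To put a rough number on "texture-heavy", here's a back-of-envelope sketch (entirely made-up figures for illustration, not GSdx internals):

```python
# Back-of-envelope: how much texture data gets pushed to the GPU per
# second. All figures below are hypothetical, for illustration only.

def upload_mb_per_sec(textures_per_frame, width, height, bytes_per_texel, fps):
    bytes_per_frame = textures_per_frame * width * height * bytes_per_texel
    return bytes_per_frame * fps / (1024 * 1024)

# Say a texture-heavy scene re-uploads 40 textures of 256x256 RGBA8
# every frame at a 60 fps target:
print(upload_mb_per_sec(40, 256, 256, 4, 60))  # 600.0 MB/s

# If the driver's upload path sustains only a fraction of that, the
# frame rate falls until the uploads fit - no matter how fast the
# shaders are, which is why a "slower" NVIDIA can still win here.
```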
Jake Stine (Air) - Programmer - PCSX2 Dev Team
#36
A comparison between the 8600GT and the X1650GT isn't a valid comparison. I think the 8600GT versus the HD 3650 would be a valid comparison.
Notebook ASUS A43TA|CPU AMD Llano APU A6-3400m Triple core (1 core disable) OC to 2.6+Ghz|GPU CF|HD 6520 400Mhz/667Mhz iGPU|HD6650M OC 780Mhz/985Mhz dGPU|RAM 8GB DDR3 1333|Windows 7 Ultimate Sp.1 x64 bit.
>> Emulation speed differs for each game. There will be some you can run fast easily, but others will simply require more powerful hardware <<.
#37
Correction: an X1650 Pro is roughly on par with a 7600GT or 7800GT, give or take depending on the game and such. The X1650GT, however, is not the X1650 Pro; it has nearly double the RAMDAC clock, and so it's a good 25% faster on the whole. Better yet, the X1650GT is very overclockable: it runs quite cheerfully at 30% over its base clocks on both memory and GPU. I have no problem running most games that aren't named Crysis at 1680x1050 with full AA, anisotropic filtering, etc. But it chokes badly on PCSX2. ;)
Jake Stine (Air) - Programmer - PCSX2 Dev Team
#38
String up Gabest, fry him for his obvious Nvidia bias in his plugin... Nah, I'm just kidding, but seriously, if this were a benchmark used for something, there would be posts like that all over the place.
#39
Perhaps that card is immature in technology and features, and has DDR2 RAM besides DDR3. And not the 7800GT - that card compares to the ATI X1800 Pro/XT.
Notebook ASUS A43TA|CPU AMD Llano APU A6-3400m Triple core (1 core disable) OC to 2.6+Ghz|GPU CF|HD 6520 400Mhz/667Mhz iGPU|HD6650M OC 780Mhz/985Mhz dGPU|RAM 8GB DDR3 1333|Windows 7 Ultimate Sp.1 x64 bit.
>> Emulation speed differs for each game. There will be some you can run fast easily, but others will simply require more powerful hardware <<.
#40
[Attached screenshots: two scenes benchmarked with the Radeon 3870 at 300 MHz, 600 MHz, and 885 MHz]

OK people, another test. This time I benched the GPU. I couldn't compare different cards as I only have one, but I simulated GPU power by underclocking/overclocking through Catalyst Control Center. There are some important things to know that are hard to show in screenshots; I'll get to them later.

I compared my Radeon 3870 at 3 frequencies: 300 MHz, 600 MHz, 885 MHz. The GDDR4 frequency stayed the same. This was on my new computer with the settings I mentioned in the last posts, but at 3.6 GHz instead of 3.8 GHz. By the way, if you're interested in the e7400: 3.6 GHz was the highest TRULY stable clock for me on my P45 chipset. I decided to go with safe voltages and against the "load line calibration" option, and truly stable means IntelBurnTest 64-bit, OCCT, Prime95 (10K FFT), Memtest, CPUBurn... Just my tip; maybe you'll get a better CPU, maybe not. One thing you can see now is that I got roughly the same fps in the same scene at 3.6 GHz as at 3.8 GHz!!! Air was right in what he said: I had become GPU-limited at that point. >>

Look at the first 3 pics first. You can see that going from 300 MHz to 600 gave a huge gain, yet between 600 and 885 there was no difference at all... well, there was, but not in that scene. This is what I CAN'T STRESS ENOUGH: between each frequency there was a noticeable "FEEL" of speed. Some scenes may show the same fps, BUT A LOWER GPU CLOCK = MORE SHORT, OCCASIONAL SLOWDOWNS THAT WERE HARD TO CAPTURE. Now look at the other 3 pics. Even though they weren't 100% identical (robot positions etc.), the fps was constant the whole time. 300 MHz gave only 35 fps! Look at the second - 600 MHz! And finally: 885!!! A 10 fps difference in that position, whereas before it seemed the same.

Overall, from 600 MHz upward, the whole game got that feel of running smoother - later maybe even too quick. This is what the numbers don't always show, because there may have been quarter- to half-second slowdowns you wouldn't see in certain specific screenshots, like that battle one.
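
Here's a small sketch of why a screenshot can't capture this (my own made-up frame-time logs): two runs with identical average fps can feel completely different.

```python
# Two synthetic frame-time logs (in ms) with the SAME average fps but
# a very different feel. Data is invented purely for illustration.

def avg_fps_and_worst(frame_ms):
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
    return avg_fps, max(frame_ms)

smooth  = [20.0] * 100              # a steady 50 fps
stutter = [17.0] * 95 + [77.0] * 5  # mostly fast, five big hitches

for name, log in (("smooth", smooth), ("stutter", stutter)):
    avg, worst = avg_fps_and_worst(log)
    print(f"{name}: avg {avg:.1f} fps, worst frame {worst:.0f} ms")

# Both average exactly 50 fps, but the second run has 77 ms spikes -
# the short occasional slowdowns you feel that an fps counter in a
# screenshot never shows.
```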

Some time ago I tested a PC graphics benchmark at different clocks. Where it was beating my card down to around 20 fps at one point, the difference between 300 MHz and max was maybe 3-4 fps. But at points with higher fps, like 60+, those clocks suddenly gave a much bigger gain. So when you OC a GPU, it's not about getting smooth fps in a game that stutters a lot; it's about getting from "very playable" fps to nice, smooth fps. Going from stutter to smooth can only be solved by a new GPU/technology. That's my 2 cents about it.
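
One way to see why the same overclock shows up so differently (a simplified sketch; it assumes the OC shaves a fixed amount off each frame): fps is nonlinear in frame time, so an identical saving looks tiny at low fps and big at high fps.

```python
# Why an overclock looks small at 20 fps but large at 60+ fps.
# Simplifying assumption: the OC saves a flat 2 ms per frame.

def fps_after_saving(base_fps, ms_saved):
    base_ms = 1000.0 / base_fps
    return 1000.0 / (base_ms - ms_saved)

for base in (20, 60):
    new = fps_after_saving(base, 2.0)
    print(f"{base} fps -> {new:.1f} fps (+{new - base:.1f})")

# 20 fps -> 20.8 (+0.8), 60 fps -> 68.2 (+8.2): the very same 2 ms
# saving is nearly invisible in a stuttering scene but dramatic in
# an already-fast one.
```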

My last words on this bench: the GPU matters for PCSX2, big time! Because even if the 3870 isn't the newest or quickest, it's still a pretty decent, modern card. I could actually play all PC games, including Crysis (no AA), at very good or max details (at least smoothly playable, not like 100+ fps), and Gears of War was TOTALLY smooth, at least with the patch. Just as an example.

And it's as worthwhile to overclock the GPU for PCSX2 as it is to overclock the CPU. Jeez man, can't wait for the new 40nm GPUs to upgrade to...



