CPU Benchmarking - Quad vs. Dual in various RPGs
Yes, that might be, but I tested all this before I started using Playground. Now I only use Playground too, and I have yet to play Persona 3.

Consider also that I didn't activate any speedhacks or try to optimize anything. I just started the game, and that's it. But you can still clearly see that even if a game runs without any optimization, it can become playable with a faster CPU, and that is why I am now happy with my new CPU.


Ah, OK, yeah. I didn't see that you tested it with the beta version.
Good to see you here.

I got a "free" upgrade of my C0 Wolfdale to E0.
Basically it's good for another 300-500 MHz, just like you hoped.
For some reason I cannot get it stable at 4.5 GHz; even 4.4 GHz is too much.
But I suspect my mainboard is limiting me here, as the CPU doesn't scale with vcore anymore above 1.4 V.
FINALLY, this was the last piece of information I needed. Hell, I even registered to reply. I was planning on a Q9400 quad, and I have the same mobo, a P35-DS3. I was expecting to OC the quad to about 3.2 GHz (8x400), but I was worried; for example, I read that 45 nm quads are difficult to overclock on P35 boards. By the way, I noticed my HD 3870 uses about 30-40% of my current CPU, an E2160 OC'd @ 3.0 GHz. That's nearly one full core, or about 2000+ MHz. That's why I was thinking of going with a quad...
Please, Jl. or somebody else with a GTX260, 8800GT, or 9800GT (not 9600), can you tell me what CPU usage you get with pure graphics apps that don't use the CPU, like rthdribl? Something in DirectX 9-10, NOT OpenGL! Thank you in advance for this important info!
So now, if the GPU offload turns out to be OK/equal to NVIDIA's, for me it's between the E8400 and the E7300. I suspect the E7300 OCs a bit worse, but with the E8400 I'm a bit worried about whether I can reach more than 400, or even 400 FSB, on my mobo. I'm now at 375x8 on the E2160 and I believe I've hit an FSB wall, but I don't know whether it's the mobo or the CPU...
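As a side note, the clock figures being traded above all follow from the Core 2 rule that effective clock = FSB (MHz) × multiplier. A minimal sketch of that arithmetic, using the FSB/multiplier pairs mentioned in this thread:

```python
# Effective Core 2 clock is simply FSB (MHz) times the CPU multiplier.
def clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(clock_mhz(400, 8))  # hoped-for Q9400 OC: 3200 MHz
print(clock_mhz(375, 8))  # the E2160 setting above: 3000 MHz
print(clock_mhz(325, 8))  # the 2.6 GHz test setting: 2600 MHz
```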
The E8400 is the best choice; you can OC it to a much higher frequency, and its cache is more than twice the size of the E7300's.
I wonder how an ATI GPU like the 4850 or 4870 would change these results. If I recall correctly, there was some information suggesting that a quad core helps there, but I can't find anything concrete about it on the internet. I have a 4850 and a plan to upgrade my CPU sooner or later... well, I'll probably go with the quad regardless.
(01-04-2009, 02:36 PM)Raffael19 Wrote: I use the PS2 Playground emulator (and I'm really asking myself why so many people use the 0.95 beta version, because the beta runs totally unstable on my PC; additionally, it's much slower than the Playground version).
Yep, but Shadow of the Colossus runs much faster in the 377x3vu build of 0.95 (AMD 3.2 GHz, 8600GT) than in the Playground version.

I couldn't resist, so I ran some tests myself. First of all, it's not 100% comparable. I have the same mobo, a P35-DS3. I normally run my RAM at 4-4-4-12, but for this test I set the default 5-5-5-18 (4 GB DDR2). I set FSB to 325 and the multiplier to 8x on my E2160 to reach 2.6 GHz, so I'm comparing against an E8400 set to 2.6 GHz. He has an NVIDIA GTX260 (216) vs my Radeon 3870 GDDR4.

I only tested Xenosaga I, because it took just 1-2 minutes to get to the second combat, the same place where he benchmarked. The FPS wasn't very variable and stayed stable most of the time, so small differences shouldn't matter much; what matters is the hardware difference. First, it's 2.6 GHz / 1 MB L2 / 65 nm vs 2.6 GHz / 6 MB L2 / 45 nm. The GPUs also differ. The point of my test was to see whether my 3870 offloads too much onto the CPU, and thus whether a quad would be necessary for ATI Radeons.

I couldn't find the PCSX2 0.9.5 beta SVN 394 build, so I used SVN 377, which is available. I also used an ISO instead of a DVD, but the GFX and SPU plugins are the same versions, and all settings for the GPU, SPU, and the emulator itself are the same. There was one problem, though: I have a 4:3 1600x1200 LCD, while the author has a 1920x1200 16:10 display. So I set the internal resolution to 1920x1200 and used a maximized window. In picture 2 you can see my config; you can also see that some options weren't accessible, but I believe they match the author's settings.
Under DX10 I always saw those options greyed out. My OS is Windows Server 2008 x64, which is more or less Vista x64 Ultimate without DRM and with some enhancements. It shouldn't be too different, but I have tweaked my OS and turned some services off (Aero is on, though). On the taskbar you can also see I'm running a minimum of apps: only Google Talk and Daemon Tools, no antivirus, etc.

Now, when you check the pictures, you'll see a resolution of 512x448 at the top, but that's wrong. Picture 1 has the internal resolution set to 1920x1200, as shown in picture 2. When it's really set to the default 512x448 (native), as in picture 3, you can clearly see the difference in quality, so it should be OK.

So what we see is a ~10 fps difference at 2.6 GHz (picture 1 only!). First we must take the L2 cache difference and 65 nm vs 45 nm into consideration. But then again, his 65 nm quad gave the same fps. The rest should hopefully come down to the GPU; the OS and SVN version may also play a role. That's why it's not a very good comparison, but I was eager to see it anyway. It's a 20% performance difference. I also can't make a final statement about the Radeon 4xxx series, but I think that if ATI wrote drivers that offload onto the CPU, they'd do it for all cards. I believe the GPU made a 10-15% difference (or 5-8 fps).
What I'm not 100% sure about is whether that's due to GPU driver offloading, or because my 3870 is simply weaker than the GTX260 and the game would make use of stronger cards. In any case, a quad is likely harder to OC and won't reach such high clocks, so a higher-clocked dual vs a lower-clocked quad could end up at the same fps, or even worse for the quad.

My final recommendation would therefore most likely also be a C2Duo for ATI cards, but I'm still not 100% sure.

Side note: as for PC games, I finished GTA 4 on my E2160 @ 3 GHz and Radeon 3870. The in-game benchmark gave me 29.9 fps at 1600x1200 with roughly medium settings. CPU usage was around 80%. For me it was pretty smooth, with occasional VERY SMALL hangs. But who knows what the future holds...

I'm still thinking about a quad, but will most likely skip it. I would really appreciate it if somebody could answer/test my previous post.

One last test: I disabled multithreading and dual-core support in PCSX2. The other options, including the 1920x1200 resolution, remained the same (my CPU is back at 3 GHz, but that doesn't matter now). 50% CPU usage means one full core on my E2160 @ 3 GHz; I got 57% average usage. ~7% of 3 GHz is ~200 MHz. It was jumping between 51% and 61%, mostly staying at 57%. That's a 1 MB L2 cache CPU. That 7% includes the OS, GPU offloading, and everything else. Radeon 3870.

How much gain can come from 200 MHz? 2.6 GHz gave me ~39 fps and 3 GHz gave me ~47 fps; that's ~8 fps per 400 MHz, so 200 MHz == ~4 fps gain. Just to put it in perspective...

Edit: ...indeed, 200 MHz == ~4 fps, but that means 200 MHz for both cores, in dual-core mode! If this single-core test showed 7% extra usage from the GPU and the rest of the system, that 7% was offloaded from only one core. And I don't believe any GPU's driver, even a 4xxx's, would offload to more than one core. So what I mean is: if, in full dual-core PCSX2 mode, a +200 MHz CPU OC gives ~+4 fps, then that 7% of one core, or ~200 MHz, could take back about half of that (2 fps). Not very accurate, but anyway...
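The back-of-envelope estimate above can be written out explicitly. A sketch using only the post's own numbers and its own conversion (the ~7 extra usage points are treated as a fraction of one core's 3 GHz clock):

```python
# Measurements from the single-core PCSX2 test above (E2160 @ 3 GHz, Radeon 3870).
core_clock_mhz = 3000
one_core_share = 0.50      # 50% total usage = one full core busy
measured_share = 0.57      # average usage actually observed

# Extra usage beyond one full core: OS + GPU driver + everything else.
overhead_points = measured_share - one_core_share     # ~0.07
overhead_mhz = overhead_points * core_clock_mhz       # ~210 MHz (the post's conversion)

# Earlier test: +200 MHz on BOTH cores gave ~+4 fps, so overhead
# confined to one core should cost roughly half the equivalent gain.
fps_lost = (overhead_mhz / 200) * 4.0 / 2

print(round(overhead_mhz))   # ~210 MHz
print(round(fps_lost, 1))    # ~2.1 fps, matching the post's "about half (2 fps)"
```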
Driver offloading (if it even exists in that amount) on my Radeon card would take ~2 fps out of ~40-50. I can't believe the NVIDIA driver would use much less CPU; it's pretty minimal already.

I found some posts on the net today (I think on the ngemu forums): one person went from an 8800GTS to a 4850 and even saw an fps increase (still talking about PCSX2, of course), while another stated that going from his previous 9600GT to a 4xxx gave the same performance in most games but a drop in one: Dragon Quest VIII. Both on dual cores. I don't think this is because of driver-to-CPU offloading; it's more likely card/driver performance on some specific code/functions (yes, maybe even for the undemanding PCSX2). So unless somebody proves otherwise, I say all that stuff about ATI drivers offloading too much onto the CPU is rubbish. And even if they did, maybe they work on some priority scheduling, meaning they would only offload if the CPU had free resources. Imagine you're running PCSX2 on a strong 4xxx card that maybe uses 40% of its potential; it would be stupid if the CPU were cooking at 100% and the card at only up to 50%, yet the card still ate the CPU for lunch. I can't believe it.
I PM'ed Jl. asking him to run the single-core test and report the CPU usage of his E8400, i.e. how far it goes over 50% with his GTX260.
Also, if some of you have other cards, like a 4xxx or 8800, it would help if you posted here. You don't even have to use the same game or setup; just turn off multithreading and dual-core support and report CPU usage. The point is to see how far over 50% it gets. We shall see...

Edit 2: new tests on Xenosaga I: CPU OC 3 GHz >> 3.33 GHz == ~5 fps gain. By the way, RAM timings and FSB play some role: 333x9 was ~2 fps slower than 375x8, and 5-5-5-18 >> 4-4-4-12 == ~1-1.5 fps gain.
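Putting the two overclock steps side by side suggests the fps-per-MHz payoff tapers off slightly. A quick comparison sketch; note that the ~52 fps endpoint is a derived figure (47 fps plus the ~5 fps gain reported), not a number measured directly in the thread:

```python
# (low MHz, high MHz, fps at low, fps at high) for the two OC steps above.
steps = [
    (2600, 3000, 39.0, 47.0),  # first test: ~39 -> ~47 fps
    (3000, 3330, 47.0, 52.0),  # Edit 2: ~+5 fps at 3.33 GHz (52 is derived)
]
for lo_mhz, hi_mhz, fps_lo, fps_hi in steps:
    per_100mhz = (fps_hi - fps_lo) / (hi_mhz - lo_mhz) * 100
    print(f"{lo_mhz} -> {hi_mhz} MHz: {per_100mhz:.1f} fps per 100 MHz")
```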
Very good post!
OS: Windows Vista x32 Home Premium
CPU: Intel® Pentium® Dual CPU T3200 @ 2.00GHz
Ram: 3,00GB
GPU: ATI Mobility Radeon HD 2400
