Does CF or SLI help increase D3D Resolution?
#11
(05-28-2009, 07:15 PM)KrazyTrumpeter05 Wrote: Not true anymore if one of them is set to run as a dedicated physics processor

But, yeah, they need to be the same otherwise.

Semantics
AMD Phenom II 940 @ 3.6GHZ, 4GB PC8500 @ 1100MHZ, 4870x2 @ Stock.
#12
(05-28-2009, 11:49 PM)tenow Wrote:
(05-28-2009, 06:41 PM)Saiki Wrote:
(05-28-2009, 06:32 PM)tenow Wrote: moreover, even 1 faster GPU will not help much, MORE CPU POWER - thats the key to success.

excuse me while I completely contradict you there...

8400GS = 16fps on God Of War
8600GT = 60fps on God Of War

processor speed 2.2 pentium dual core

yea.. gpu does nothing, right?
na-ah!!! ***** vs decent gpu. bad example.

ATI 3850 vs newest nvidia GTX 275, MGS3 native res - SAME FPS.
GPU must not bottleneck, after that - no difference

? What's your point? Two cards that are close performance-wise get the same FPS, who'd-a thunk it.

Saiki was just pointing out that a crap graphics card will kill your FPS, even if the CPU is technically good enough to play the game. Both GPU and CPU are equally important for successful emulation; your first post came off sounding like the GPU was unimportant.
Without knowing the OP's card, no one can say whether he would be better off upgrading his CPU or his GPU.
"This thread should be closed immediately, it causes parallel imagination and multiprocess hallucination" --ardhi
#13
(05-29-2009, 12:26 AM)echosierra Wrote:
? What's your point? Two cards that are close performance-wise get the same FPS, who'd-a thunk it.

Saiki was just pointing out that a crap graphics card will kill your FPS, even if the CPU is technically good enough to play the game. Both GPU and CPU are equally important for successful emulation; your first post came off sounding like the GPU was unimportant.
Without knowing the OP's card, no one can say whether he would be better off upgrading his CPU or his GPU.
A 3850 is nowhere near a GTX 275...
AMD Phenom II 940 @ 3.6GHZ, 4GB PC8500 @ 1100MHZ, 4870x2 @ Stock.
#14
(05-29-2009, 01:25 AM)Sythedragon Wrote:
A 3850 is nowhere near a GTX 275...

I should have been clearer.

In the context of that game, they deliver the same performance. If you're getting the same FPS (and it isn't 60) from two such disproportionately powerful cards, the GPU clearly isn't the limiting factor. The CPU can't keep up with even the weaker card, so throwing more GPU power at the game changes nothing.
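The bottleneck argument above can be sketched as a toy model: the frame rate you actually see is capped by whichever component is slower. The numbers below are illustrative only, not real benchmarks of these cards.

```python
def effective_fps(cpu_fps, gpu_fps):
    """Frames/sec the emulator can deliver when the CPU can prepare
    cpu_fps frames per second and the GPU can render gpu_fps."""
    return min(cpu_fps, gpu_fps)

# CPU-limited case (tenow's MGS3 example): a faster GPU changes nothing.
print(effective_fps(45, 60))    # 45 - HD 3850-class card
print(effective_fps(45, 200))   # 45 - GTX 275-class card, same FPS

# GPU-limited case (Saiki's God of War example): the GPU swap is everything.
print(effective_fps(60, 16))    # 16 - 8400GS-class card
print(effective_fps(60, 120))   # 60 - 8600GT-class card
```

Same model, opposite conclusions, depending entirely on which side is the bottleneck.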

The two examples in this thread sit at opposite ends of the spectrum: each compared a weaker GPU against a stronger one, and each led to the opposite conclusion. I was trying to point out the problem with suggesting he upgrade his CPU over his GPU without any information about either.

It made sense in my head. Why can't you all be telepathic?
"This thread should be closed immediately, it causes parallel imagination and multiprocess hallucination" --ardhi