Low fps?
#21
Changing GPU settings to constantly run at max power is a very bad idea ;p.

And you should check the other info before posting that: in the screenshots in #15 the clocks are not at max because that run was made at native res. You can see the max clocks apply in the earlier post, made at 6x native, without any problem.


#22
Um, no. In post #4 (DX11, 6x) you can clearly see he is running both GPU-Z, which shows his max core clock is about 1006 MHz and memory is 1502 MHz with a boost of 1052 MHz (that page shows maximum clocks, not the actual clocks), and RTSS, which at the bottom of the screen shows his core or his memory running at 810 MHz, depending on how he set up RTSS. For gaming, and especially emulators, putting the card in max performance is a good idea. Having it on adaptive and watching the card downclock alongside obvious slowdown should be the first clue that adaptive is bad for gaming. Adaptive is fine for desktop use and web browsing, but gaming? I don't think so; adaptive power is bad for gaming. Why do you think the first thing most people say is to make sure max performance is used and not adaptive?
#23
Dunno how reliable RTSS is compared to GPU-Z if they show different data (and saying it's "clearly" visible is kind of weird unless you are some kind of psychic, since it was under the taskbar ;p), but as far as I knew, NVIDIA users were proud of their GPU drivers ;p. It would be funny to end up with the same BS as with some Radeon drivers. I would rather say, though, that it doesn't give it more power (if that's the case) simply because it's limited by things it has no control over. Anyway, yes, forcing clocks when you're not on buggy drivers IS bad, no matter the occasion, especially on top-end GPUs, which don't need half their power in a lot of "gaming" to begin with.
#24
I use both RTSS and GPU-Z, and trust me, the screen he showed in GPU-Z only shows the clocks the card is capable of; RTSS is not showing 1000 MHz for anything. RTSS is accurate, as it shows the card's real-time clocks. Only the GPU-Z sensor page shows real-time clocks, and if he goes to that page it will show his card is downclocked, which is exactly what RTSS is showing.

Both AMD and NVIDIA drivers suffer from this downclocking for no reason, which is why adaptive power is bad for gaming. We differ on opinions about adaptive power; I don't care what game or driver I'm using, ALL my games and emulation use max performance, not adaptive. The only time I want my GPU downclocked is when I'm sitting idle on my desktop or web browsing. I don't want my games downclocked for any reason other than the card overheating, because adaptive in both camps is anything but perfect, much like SpeedStep on Intel CPUs and the AMD equivalent.
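One way to check the real-time clocks being argued about here, outside of RTSS or the GPU-Z sensor tab, is `nvidia-smi --query-gpu=clocks.gr,clocks.mem --format=csv,noheader`. Below is a minimal Python sketch for parsing that CSV output; the query flags are standard `nvidia-smi`, but the sample line is illustrative, not captured from this card:

```python
# Parse one line of `nvidia-smi --query-gpu=clocks.gr,clocks.mem
# --format=csv,noheader` output to see whether the card has downclocked.
def parse_clocks(csv_line):
    """Return (core_mhz, mem_mhz) from a 'NNN MHz, NNN MHz' line."""
    parts = [p.strip() for p in csv_line.split(",")]
    return tuple(int(p.split()[0]) for p in parts)

sample = "810 MHz, 810 MHz"   # hypothetical downclocked reading
core, mem = parse_clocks(sample)
print(core < 1006)  # True when the core sits below its rated 3D clock
```

Comparing the parsed values against the card's rated 3D clocks from the GPU-Z main page is exactly the check described above.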
#25
(07-16-2013, 07:46 PM)tsunami2311 Wrote: In post #15 look at the clock rates, and in post #4 look at GPU-Z to see what his base 3D clocks and boost clocks should be; his card is downclocked in PCSX2. His GPU is downclocking his core or memory to 810 MHz, which is a problem in itself, and he needs to make an NVCP profile for PCSX2 and set it to max performance, because his card does not have a 3D clock rate of 810 MHz for either the core or memory, let alone a boost rate of 810 MHz.

Do this and I bet the problems stop. Making an NVIDIA profile for PCSX2 set to max performance was the VERY first thing I did when I got one of these cards with the option to pick between max performance and adaptive, because adaptive is used by default. And I don't trust the algorithm that chooses between max performance and adaptive to get it right, which is the same reason I have SpeedStep turned off on my CPU.

Setting it to max performance only increased the FPS by 3. It still didn't solve my problem.
#26
(07-15-2013, 06:20 PM)Saiki Wrote: @honam1021: post your GSdx ini contents here, please?

^still haven't done this as asked
#27
(07-18-2013, 09:56 PM)Saiki Wrote: ^still haven't done this as asked

Here you go:
Code:
[Settings]
ModeWidth=640
ModeHeight=480
ModeRefreshRate=60
Renderer=3
Interlace=7
AspectRatio=0
upscale_multiplier=6
windowed=1
filter=2
paltex=1
vsync=0
logz=1
fba=1
aa1=0
nativeres=0
resx=1024
resy=1024
extrathreads=0
ShadeBoost=0
UserHacks=1
UserHacks_MSAA=0
UserHacks_AlphaHack=0
UserHacks_HalfPixelOffset=0
UserHacks_SpriteHack=2
UserHacks_SkipDraw=0
UserHacks_WildHack=1
UserHacks_AggressiveCRC=0
ShadeBoost_Contrast=50
ShadeBoost_Brightness=50
ShadeBoost_Saturation=50
#28
(07-19-2013, 03:55 AM)honam1021 Wrote: Here you go:
Code:
[Settings]
ModeWidth=640
ModeHeight=480
ModeRefreshRate=60
Renderer=3
Interlace=7
AspectRatio=0
upscale_multiplier=6
windowed=1
filter=2
paltex=1
vsync=0
logz=1
fba=1
aa1=0
nativeres=0
resx=1024
resy=1024
extrathreads=0
ShadeBoost=0
UserHacks=1
UserHacks_MSAA=0
UserHacks_AlphaHack=0
UserHacks_HalfPixelOffset=0
UserHacks_SpriteHack=2
UserHacks_SkipDraw=0
UserHacks_WildHack=1
UserHacks_AggressiveCRC=0
ShadeBoost_Contrast=50
ShadeBoost_Brightness=50
ShadeBoost_Saturation=50

6x? That's the #1 issue right there..

And I'm not really sure what "WildHack" is; turn that off for now, unless you know exactly what it does, in which case explain.
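For reference, the two changes suggested here would look like this against the posted ini. Note that 3x is only an assumed starting point for the suggested scale-back, not a known-good value for this game:

Code:
upscale_multiplier=3
UserHacks_WildHack=0

Everything else in the posted [Settings] block would stay as-is while testing.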
#29
(07-19-2013, 06:21 AM)Saiki Wrote: 6x? That's the #1 issue right there..

And I'm not really sure what "WildHack" is; turn that off for now, unless you know exactly what it does, in which case explain.

What settings should I use for 1080p?
#30
Well, first things first: you should scale things back to get FPS, then GRADUALLY increase it until you find your breaking point.

also:

1080p is a set of HDTV high-definition video modes characterized by 1,080 horizontal lines of vertical resolution and progressive scan, as opposed to interlaced as in the 1080i display standard. The term usually assumes a widescreen aspect ratio of 16:9, implying a resolution of 1920×1080 (2.1 megapixels), often marketed as Full HD.

6x512 != 1920
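To make that arithmetic concrete, here is a small Python sketch (assuming a 512×448 native PS2 framebuffer; the actual native resolution varies per game):

```python
# Internal render size for each GSdx upscale multiplier, assuming a
# 512x448 native PS2 framebuffer (the real native size is game-dependent).
NATIVE_W, NATIVE_H = 512, 448
TARGET_W = 1920  # 1080p display width

for mult in range(1, 7):
    print(f"{mult}x -> {NATIVE_W * mult}x{NATIVE_H * mult}")

# Smallest multiplier whose internal width covers a 1920-wide display:
best = min(m for m in range(1, 7) if NATIVE_W * m >= TARGET_W)
print(f"smallest multiplier covering {TARGET_W} wide: {best}x")
```

Under that assumption, 6x renders 3072 pixels wide, well past 1920, while 4x (2048 wide) is already enough to cover a 1080p screen.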



