internal res question
#1
I have a question about the internal resolution of the game.

Right now I am playing FFX at 1280x960 with an internal resolution of double that.

However, I see people using internal resolutions that are square (2048x2048, for example).

Which is best to use?
Intel Core i5-4690k @4.5ghz
Samsung 16GB DDR3 @2000mhz
GeForce GTX 680 4GB
Windows 7 x64
#2
The best one to use depends on which aspect ratio you are using, but for 4:3 it's best to use a multiple of the game's native resolution.
#3
If it looks OK with multiples of your screen resolution, use that. But some games show graphical glitches like vertical/horizontal lines if you don't run them at the game's native resolution. Using specific internal resolutions (e.g. 1200x1200, IIRC) may avoid such glitches while still giving you better graphical output than native. The latest beta of GSdx also introduced anti-aliasing (insert the line "msaa=x", with x = 0, 2, 4 or 8, into your gsdx.ini); give it a try.
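For reference, the setting mentioned above is a plain key=value line in gsdx.ini. Only the msaa key itself comes from the post; the section header and comment here are assumptions about the file's layout:

```
[Settings]
; 0 disables anti-aliasing; 2, 4 or 8 enable MSAA at that sample count
msaa=4
```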
CPU Core i5-3570K@3.4GHz | GPU Nvidia Geforce GTX 570 | RAM 8GB DDR-3 1600MHz CL9 | OS Win7 Ultimate (x64) SP1
EMU PCSX2 v1.1.0 r5645 | GS GSdx SSE4 r5632 | SPU2 SPU2-X r5559 | PAD LilyPad r5403 | CDVD cdvdGigaherz r5403
#4
Window size has no link to internal resolution; they are two unconnected values. Each PS2 texture has a maximum size of 4 MB and a maximum resolution X*Y, where X and Y can differ between games, for example 640*416. The graphics plugin does the following: it upscales the texture to the internal renderer size, which makes texture borders less pixelated and the image itself clearer. The scene is then rendered from these "improved" textures and vertices. Window size is the number of pixels in the output image, but it's just the view buffer; the rendered scene does not depend on it. The best way to set up the internal renderer is to find out the game's render resolution and multiply it by 2 or 4. ZeroGS does this automatically in its x2-x16 modes, but GSdx does not.
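The "multiply the game's render resolution by 2 or 4" advice above can be sketched as a trivial calculation. The 640x416 example resolution is taken from the post; the function name and multiplier choices are illustrative:

```python
def internal_res(native_w, native_h, multiplier):
    """Scale the game's native render resolution by a whole-number factor,
    as the post recommends for setting the internal renderer size."""
    return native_w * multiplier, native_h * multiplier

# Example from the post: a game rendering at 640x416, scaled 2x and 4x.
print(internal_res(640, 416, 2))  # (1280, 832)
print(internal_res(640, 416, 4))  # (2560, 1664)
```

The point is that the multiplier applies to the game's own render target, not to your window or monitor resolution.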
#5
A maximum size of 4 MB... so I couldn't run a game at 8 times its native res (5120x3840), could I?

Also, what you're saying is essentially: do not use square internal resolutions; use multiples of the native res (like I am doing now), correct?
Intel Core i5-4690k @4.5ghz
Samsung 16GB DDR3 @2000mhz
GeForce GTX 680 4GB
Windows 7 x64
#6
It seems using multiples of the native res fixes some issues, but for me it doesn't make a difference, so I just play at 1536x1536 for the AA-like effect while it still performs well Tongue
Core i5 3570k -- Geforce GTX 670  --  Windows 7 x64
#7
Why 1536? Seems kind of a...random number.
http://www.twitch.tv/krazytrumpeter05
Want to stream your games? Let me know and I can help you get set up with Open Broadcaster Software.
#8
It's the average of 1024 and 2048... 2048x2048 textures are too big and would just make things slow for me, and 1024 is too low quality. I've tried 2x and 4x native before; pixel-count-wise, 2x looks bad to me (on games with something like 640x448 the vertical resolution is too low, which looks even worse upscaled to my fullscreen resolution), and 4x was just too big and made everything slow (much like 2048 would). But 1536x1536 is not as "random" as you think: it can still be divided down to 3 by a power of 2 xD Also, in 3D modelling you sometimes get textures around 1300 that would look just right at 1536, but some programs make you use 2048 because of the power-of-two rule, which I think is a huge waste, so I'd rather use 1536.
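The arithmetic behind the "average between 1024 and 2048" and "dividable down to 3 by a power of 2" claims above checks out; a quick sketch:

```python
# 1536 is the midpoint of 1024 and 2048, and factors as 3 * 2**9,
# so it halves cleanly nine times before reaching an odd number.
assert (1024 + 2048) // 2 == 1536
assert 1536 == 3 * 2**9

n = 1536
while n % 2 == 0:  # halve until odd
    n //= 2
print(n)  # 3
```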

I like square internal resolutions because that way the edges of textures look less pixelated from different angles (unlike in PC games using MSAA, where diagonal lines that are more horizontal than vertical look less pixelated than the other way around). And I don't use my monitor's native resolution, because then I can see the pixels easily, which doesn't look right to me. So I use a higher resolution: the downscale to the display keeps the image sharp but smooths it a bit, so I don't notice the pixels much, and that's good enough for me.
Core i5 3570k -- Geforce GTX 670  --  Windows 7 x64
#9
Probably just a number that doesn't slow down her games. She probably tried random ones until she hit the wall, then stepped it back.
#10
That makes me wonder why the default internal res given by GSdx is 1024x1024 and not something else?
durable PC since 2008
CPU : E7200 @ 2.53 GHz 1.04 V / OC 3.7 GHz 1.36 V (390*9.5, 4GB DDR2 780)
GPU : 8600GT (GDDR3, 256 MB) / OC (750 / 900)
BOARD : GA-EP31-DS3L (rev 1.0) broken and bought a second hand one replaced
PSU : 300 W
OS : Windows 7 Home Basic x86



