PAL or NTSC
#21
(11-02-2009, 03:34 PM)Umino Wrote: of course the PS2 doesn't use any HD resolution for its "widescreen enabled" games - but it seems to be possible, because there is an "HD enabler" disc sold by Xploder which claims to be able to "boost" games' resolution to 480p, 720p or even 1080 - I saw one while I was working at a nearby GameStop - it came with a component cable to wire the PS2 to an HD television

Nah, still just an ugly hack. It uses the same thing I just described above: it enables the output circuit stretcher, but doesn't actually increase the game's internal framebuffer resolution, so all you get is a bilinear upsampling of a 640x448 image. Well, it can turn some games from interlaced to progressive, though, for a nice quality boost in the V-scan direction at least (made possible because most interlaced games use a full-size 448/512-line framebuffer and just render alternating scanlines each frame).
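
To put it in code: here's a minimal sketch of what that output-stage "upscale" amounts to - plain bilinear resampling, nothing more. (Grayscale for brevity, and all names/sizes are illustrative; this is not the actual GS hardware path.)

Code:
#include <algorithm>
#include <cstdint>
#include <vector>

// No new detail is created -- every output pixel is just a weighted
// mix of (up to) four source pixels.
struct Image {
    int w, h;
    std::vector<uint8_t> px; // grayscale; the real GS output is RGBA
};

Image bilinearScale(const Image& src, int dstW, int dstH)
{
    Image dst{dstW, dstH, std::vector<uint8_t>(size_t(dstW) * dstH)};
    for (int y = 0; y < dstH; ++y) {
        float fy = y * float(src.h - 1) / (dstH - 1);
        int   y0 = int(fy), y1 = std::min(y0 + 1, src.h - 1);
        float wy = fy - y0;
        for (int x = 0; x < dstW; ++x) {
            float fx = x * float(src.w - 1) / (dstW - 1);
            int   x0 = int(fx), x1 = std::min(x0 + 1, src.w - 1);
            float wx = fx - x0;
            float top = src.px[y0*src.w + x0] * (1 - wx) + src.px[y0*src.w + x1] * wx;
            float bot = src.px[y1*src.w + x0] * (1 - wx) + src.px[y1*src.w + x1] * wx;
            dst.px[size_t(y)*dstW + x] = uint8_t(top * (1 - wy) + bot * wy + 0.5f);
        }
    }
    return dst;
}
// e.g. bilinearScale(frame /*640x448*/, 1280, 896): the advertised "HD"
// output is nothing more than this, stretched to the target mode.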

(11-02-2009, 03:34 PM)Umino Wrote: even Microsoft and Nintendo realized the usefulness of such a game-independent graphics option and built it into their latest consoles, while the PS3 still only relies on the output options the specific game offers, if I'm not mistaken

... Yeah. And I do wonder: this could very well have been an intentional design decision on Sony's part, made to help retain backwards compatibility with the original built-in legacy PS2 support (which was via an actual EE/GS chip from the PS2 installed in the PS3). And if so, it was a huge backfire, since the EE/GS approach turned out to be a wash.
Jake Stine (Air) - Programmer - PCSX2 Dev Team

#22
taking out the PS2 backwards compatibility was in fact one of the biggest backfires Sony ever made, in my opinion - and it has effectively kept me from buying a PS3 until now

funny thing is, when they noticed they could not sell the PS3 well at that high price, they made it cheaper - first they removed the legacy hardware and replaced backwards compatibility with some crappy emulation, then they decided to take out backwards compatibility completely

my theory on that:

they simply removed it because they set the price of the current PS3 console so low that there was not enough margin between production cost and retail price, so they did not earn much on each unit sold

now they needed the people buying a PS3 to also buy a lot of new PS3 games, because the profit margin per game sold is much bigger than per hardware unit sold

I can perfectly understand that - if you buy a new console and can keep using your old games, maybe even with some improvements the original hardware could not provide, you won't buy as many new games to get started - at least not if you are on a budget, like I am now and was back when I got my PS2 - I was happy to be able to play my PS1 games with texture smoothing, much faster loading times and all that

what supports my theory is the fact that with some newer version of the PS3 firmware, the PS2 versions of SingStar became playable on the PS3 - why SingStar and NOT other PS2 games? Because SingStar is one of Sony's own games and sells like hell, and the others are not - and that was before the PS3 had its own versions of SingStar

another theory: the hardware backwards compatibility of the PS3 got dumped - now what to do with the already-produced EE/GS chips? -> Solder them to a mainboard, cram them into a new PS2 console and label it the "PS2 slim 2" - because that console has the EE and GS, which used to be two separate chips, fused into a single chip

anyway, that was way off topic, but I just needed to get that said

(11-02-2009, 03:57 PM)Air Wrote: Nah, still just an ugly hack. It uses the same thing I just described above: it enables the output circuit stretcher, but doesn't actually increase the game's internal framebuffer resolution, so all you get is a bilinear upsampling of a 640x448 image. Well, it can turn some games from interlaced to progressive, though, for a nice quality boost in the V-scan direction at least (made possible because most interlaced games use a full-size 448/512-line framebuffer and just render alternating scanlines each frame).

my question now would be: when you deinterlace the PS2 output of a natively interlaced game - like you can do in GSdx - wouldn't the same deinterlacing artifacts appear (like bobbing, a stripy picture and "geometry ghosting" in fast-moving scenes)?

I'm asking because I want to get a composite & component AV to VGA converter box to hook up my PS2 to my TFT monitor, which replaces the bulky CRT TV in my living room

that box turns any input into a progressive, upscaled VGA output, hopefully even suitable to run my monitor at its native resolution of 1280x1024 @ a minimum of 60Hz - so it basically does externally what the "HD enabler" claims to do to the PS2 output internally - I'm just wondering if there would be any deinterlacing / frame sync artifacts
my comp:
Core2 Duo E6550 @ 2.33Ghz
Vista Ultimate x86 / 2GB RAM
GeForce 8600 / 256MB VRAM
#23
(11-02-2009, 10:19 PM)Umino Wrote: my theory on that:

they simply removed it because they set the price of the current PS3 console so low that there was not enough margin between production cost and retail price, so they did not earn much on each unit sold

now they needed the people buying a PS3 to also buy a lot of new PS3 games, because the profit margin per game sold is much bigger than per hardware unit sold
They sold the PS3 at BELOW cost price; even after removing the PS2 hardware they were still losing money on each console. Notice the several-billion-dollar loss Sony posted after a year?

It took them a long while to cut manufacturing costs enough to minimize the loss and start making a profit from software sales. Now, I believe, they are selling above cost price with the PS3 Slim, which is the culmination of all their hard work at reducing costs.

Quote:my question now would be: when you deinterlace the PS2 output of a natively interlaced game - like you can do in GSdx - wouldn't the same deinterlacing artifacts appear (like bobbing, a stripy picture and "geometry ghosting" in fast-moving scenes)?
Yes, if the game doesn't support Progressive Scan then you will get the usual artifacts caused by deinterlacing the image.
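
Roughly why, in code: here is a hypothetical sketch of "bob" deinterlacing, the simplest method (and one of the classic modes GSdx offers). Each field only carries half the scanlines, the rest have to be invented, and consecutive fields sit half a line apart - hence the shake. Illustrative code, not GSdx's actual implementation.

Code:
#include <algorithm>
#include <cstdint>
#include <vector>

// A field carries only every other scanline of the frame; bob invents
// the missing lines by averaging neighbours. Since consecutive fields
// sample the picture half a line apart, static edges bounce up and down
// between fields -- the shaky/stripy artifacts asked about above.
// (Weave instead combines two fields and gets combing on motion.)
std::vector<uint8_t> bobField(const std::vector<uint8_t>& field, // h/2 lines
                              int w, int h, bool topField)
{
    std::vector<uint8_t> frame(size_t(w) * h);
    const int parity = topField ? 0 : 1; // which output lines the field owns
    for (int y = 0; y < h; ++y) {
        if (y % 2 == parity) {
            // the line exists in this field: copy it straight through
            std::copy_n(&field[size_t(y / 2) * w], w, &frame[size_t(y) * w]);
        } else {
            // the line is missing: average the field lines above and below
            int above = std::max(y - 1, parity);
            int below = std::min(y + 1, h - 2 + parity);
            for (int x = 0; x < w; ++x)
                frame[size_t(y) * w + x] = uint8_t(
                    (field[size_t(above / 2) * w + x] +
                     field[size_t(below / 2) * w + x] + 1) / 2);
        }
    }
    return frame;
}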
Computer specifications:
Windows 10 | Ryzen 3700X | ASUS Crosshair VIII Hero (WiFi) | MSI 1070Ti | 16GB 3600MHz RAM
#24
Mortal Kombat: Deception has an option where you can set it to 50fps or 60fps
#25
I got my VGA box today - the delivery guy showed up at a very bad time, right when I was about to go to work and already tight on time - but that wasn't his fault - I mean, who the heck goes to work at 10AM, when almost everyone else is already working? - anyhow, when I got back home I couldn't wait to try it out, so I took my PS2 and wired everything up - a small manual was also included in the package (just some basic stuff like what it is and how to use it) - now this part sounded very interesting:

Quote:HD Game Box uses the newest 3D De-interlace Video optimization and smooth chip.
Variable High-tech graphic optimization algorithm and technology are built into the product,
such as edge-preserving pixel interpolation and motion-adaptive 3D de-interlace algorithm
combined with the simultaneous conversion of the frame rate to get higher definition display

I was like "wow, that's a mouthful - let's see if the actual picture looks as good as this sounds"

so I fired up DOA2 (PAL), which can run at 50 as well as 60Hz - first I tried the default, 50Hz

the OSD of the box said Y-PB-PR input - 576i

the OSD of my monitor said VGA input - 1280x1024 - 60Hz

fine - that's exactly how I wanted it - the picture was nice and crisp as expected, though I noticed the image was not completely stretched - it had small black borders all around the video, about as thick as my finger - not just the usual "PAL bars" on top and bottom, but also a vertical bar on the right and a slightly bigger one on the left

then I tried 60Hz mode (in the box's OSD it was labeled 480i) and was kinda surprised - even though the input now has a lower resolution, the border bars are gone and the image really fills the whole screen

what the F...??? Shouldn't it be easier to stretch the bigger image, where this "wonderful" high-definition chip has less upscaling work to do? Or is it because an image that is only 480 pixels high fits better into the aspect-ratio scheme of computer monitors? (you know, 640x480 and multiples of it like 1024x768 and 1280x960 aren't that uncommon on computer screens)
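
just to put numbers on that aspect-ratio idea - if the box preserved a 4:3 picture shape on my 5:4 panel, the plain fit math alone would predict small bars (a back-of-the-envelope sketch; whether the box really works this way is pure guesswork on my part):

Code:
#include <cstdio>

// Fit math for a 4:3 picture letterboxed onto a 5:4 (1280x1024) panel.
int main() {
    const double panelW = 1280, panelH = 1024;  // 5:4 = 1.25
    const double srcAspect = 4.0 / 3.0;         // SDTV display aspect ~1.33
    // source is wider than the panel: match widths, letterbox the height
    double fitH = panelW / srcAspect;           // 1280 * 3/4 = 960
    std::printf("fit: %.0fx%.0f, bars: %.0f px top and bottom\n",
                panelW, fitH, (panelH - fitH) / 2); // 32 px each
    // aside: 640x480 doubles exactly to 1280x960 (a clean 2x integer
    // scale), while 768x576 would need an awkward 1.667x factor
    return 0;
}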

anyway, I chose the CPU vs CPU option and let two of the beauties beat each other up, to see how the chip manages to deinterlace and upscale 3D scenes in motion

now for the next surprise:

at first there was no sign of deinterlacing artifacts - no shaky screen, no nothing - and all the edges were kinda smoothed out, as if you had turned on some FSAA or something

that is, until the battle finally started and everything on the screen started moving rapidly - suddenly all the rough edges came back, just like you know them, and the screen started shaking a little - except in situations with little to no camera movement (if one of the fighters had her back against a wall, was stuck in a corner, or was flat on the floor after a hardcore roundhouse from the opposing fighter ;-)

so that's what this "motion-adaptive" thing seems to be all about - if it detects rapid motion, it just turns all the "fake HD" effects down to avoid artifacts ;-)
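
if my guess is right, the logic inside the chip would be something like this - a made-up sketch of per-pixel motion-adaptive deinterlacing (threshold and all names are invented; a real scaler chip is surely fancier):

Code:
#include <algorithm>
#include <cstdint>
#include <cstdlib>
#include <vector>

// For each missing line, compare the "weave" candidate (the other
// field's real line, full detail) against the "bob" candidate
// (interpolated within the current field). Where they agree the area
// is static and weave wins; where they differ there is motion and bob
// wins -- so detail drops exactly when things move fast, as observed.
std::vector<uint8_t> motionAdaptive(const std::vector<uint8_t>& curField, // lines y%2==parity
                                    const std::vector<uint8_t>& oppField, // adjacent opposite field
                                    int w, int h, bool topField)
{
    const int kThreshold = 12; // tunable motion sensitivity (made up)
    const int parity = topField ? 0 : 1;
    std::vector<uint8_t> frame(size_t(w) * h);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            size_t o = size_t(y) * w + x;
            if (y % 2 == parity) {            // line really exists: copy it
                frame[o] = curField[size_t(y / 2) * w + x];
                continue;
            }
            int above = std::max(y - 1, parity);
            int below = std::min(y + 1, h - 2 + parity);
            uint8_t bob = uint8_t((curField[size_t(above / 2) * w + x] +
                                   curField[size_t(below / 2) * w + x] + 1) / 2);
            uint8_t weave = oppField[size_t(y / 2) * w + x]; // real line, older field
            // small difference -> static area, keep full vertical detail
            frame[o] = (std::abs(int(weave) - int(bob)) < kThreshold) ? weave : bob;
        }
    }
    return frame;
}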

then I tried Resident Evil: Code Veronica, which has a very annoying screen shake even on a real TV - with the game box: nothing - and there aren't many scenes with heavy camera movement, because it's still one of the old-school Resident Evil games with fixed camera angles (which I love so much)

so the box turned all the nice effects on again - and left them on - and now you could clearly see the artifacts: every moving thing in the game (monsters, the main character) seemed to be dragging a "ghost image" of itself around - not much, but it looked a little like you had turned on some "motion blur"

but anyhow - that doesn't hurt my eyes half as much as the screen constantly shaking by a line or two ;-)

bottom line:

despite the big-mouthed description in the manual, there are some clearly visible artifacts in interlaced games (which was expected) - but it looks pretty nice even on my cheap ASUS TFT, and doesn't strain my eyes half as much as my old CRT TV, which can now rot in hell for all I care (or collect dust in the bedroom, to be precise)

it rocks with interlaced PS2 content and should rock double-time with natively progressive content

my bro will be more than happy when he gets his Wii and can play Twilight Princess or Metroid 3 @ 480p on his computer screen - once he has finally saved up enough to buy a Wii, I will recommend he spend those few bucks extra and get that box too, because in our family's living room there's another aging CRT TV waiting to be replaced

I also recommend that box to anyone who wants to use a game console / DVD player / VCR / cable tuner / whatever home entertainment device you've got on a computer monitor

it can upscale anything up to 1920x1200

it has a very flexible selection of inputs - composite video (CVBS) / S-Video (Y/C) / component video (Y Pb Pr)

once it's wired up you don't need to crawl under your desk every time and mess around with plugs and cables - like when you've had your PC running and then want to play a console game - it has PC VGA and PC audio pass-through when it's off, and when it's on it can either auto-select the input it detects a signal on, or you can select the input manually

and at that amazing price (around 30€ on amazon.com) you can't go wrong

when I was searching the web for a solution to connect a game console to a PC monitor, I found converter boxes all over the net, but most of them do less while costing more - for example, a Euro-AV (also known as RGB SCART) to DVI converter that cost about twice as much but could only upscale to 720p, and its output frequency also depended on the input frequency - so 576i / 50Hz SDTV in -> 720p / 50Hz HDTV out, and the same goes for 60Hz content - the problem is my monitor can't go as low as 50Hz, so I needed something that could not only change the resolution but change the frequency as well - and most of THOSE devices were priced at about 200 and up

when I go shopping for games again, I might prefer PAL games that are at least "60Hz enabled", to have more games that really go "full screen" on my current living-room setup

there might even be some "progressive enabled" games around - though the only one I know of is the NTSC version of GT4 - but I remember reading something about PAL games that can go progressive as well

why PAL games? Because PAL is all my PS2 can read, and I'm not going to screw it up by installing a boot chip

maybe someday I'll get a second one pre-owned at GameStop and try it on that one - but not on "my precious" - GOLLUM, GOLLUM!!!!

edit: dunno why, but I looked at the thread title again - it reads "PAL or NTFS" - LOL??? - did something go wrong or was that on purpose? NTFS is a Windows file system :-)
my comp:
Core2 Duo E6550 @ 2.33Ghz
Vista Ultimate x86 / 2GB RAM
GeForce 8600 / 256MB VRAM
#26
Kingdom Hearts 2 and similar games are probably the worst offenders when it comes to deinterlacing.
Computer specifications:
Windows 10 | Ryzen 3700X | ASUS Crosshair VIII Hero (WiFi) | MSI 1070Ti | 16GB 3600MHz RAM
#27
Valkyrie Profile also seems to be a good candidate for deinterlacing artifacts - at least in PCSX2 with deinterlacing enabled it is clearly visible - or maybe it's just the low framerate I get in that game that makes the deinterlacing artifacts so noticeable, because with the game running slower the artifacts take longer to disappear - if you have ghost images with a 1-frame delay @ 50 FPS, then when the game is slowed down to 5 FPS (like it is on my system) each ghost stays visible 10 times longer (1/5 s instead of 1/50 s, i.e. 200 ms instead of 20 ms)

however, about the progressive thing - there's a trick to enable progressive mode in PS2 games that support it but don't offer an option in any in-game menu - just hold down [triangle] + [cross] while the game boots, and a screen should appear asking if you want to run the game in progressive mode

there's also a list on Wikipedia of games that are known to support progressive scan -> http://en.wikipedia.org/wiki/List_of_HD_..._PS2_games

I tried my game box with RE4 in progressive mode - it DOES make a difference as far as I could see - of course no deinterlacing-related blur or anything on camera movement, and a bit more overall sharpness in the picture

but with that game I have those borders again no matter which mode I set the game to - 576i, 480i or 480p - so far the borders disappearing in 480 mode has only happened with DOA2 - I haven't tried all my games yet - maybe the box always DOES stretch to full size, but the game developers simply left that space unused because it would be cut off by TV overscan anyway
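
some rough numbers behind that overscan guess (the ~5% figure is just a common ballpark I've seen quoted, it varies per TV):

Code:
#include <cstdio>

// CRT TVs typically hide a few percent of the picture on each edge.
// Developers who keep the action inside that "safe area" may simply
// leave the outer pixels blank -- which would look like thin borders
// on a monitor that shows the whole frame.
int main() {
    const double overscan = 0.05;  // assumed per-edge loss (ballpark)
    const int w = 640, h = 448;    // a typical PS2 frame
    std::printf("hidden on a TV: ~%.0f px left/right, ~%.0f px top/bottom\n",
                w * overscan, h * overscan); // ~32 px and ~22 px
    return 0;
}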
my comp:
Core2 Duo E6550 @ 2.33Ghz
Vista Ultimate x86 / 2GB RAM
GeForce 8600 / 256MB VRAM
#28
Well, it depends a lot on the game.
Take FFX NTSC, which is a stripped-down version: it has no advanced Sphere Grid, it doesn't have the Dark Aeons sidequests, etc.
There is often a lot more content in the EU version; the US versions are often "dumbed down" for the general public.
So basically, I would go for PAL any day, since NTSC doesn't really give you the hardcore gaming experience.
Just survive the drop in FPS :/