-REAL- Stereoscopic/Anaglyph 3D PCSX2 samples (unimplemented)
#11
I'm no D3D guy. What is the flag and where does it get passed?
Reply


#12
Do you need a 3D monitor to see these images, or just red/blue glasses?
Reply
#13
(08-28-2012, 09:18 PM)rama Wrote: I'm no D3D guy. What is the flag and where does it get passed?
I think it's DX actually. It's sequential 3D, which means that every time a frame gets rendered the angle is shifted from left to right and back again, so in one second the frames 1,3,5,7,9... represent the left eye and the frames 2,4,6,8... represent the right eye. You must tell your API that you are giving it a 3D output using that method, so it can synchronize your active shutter glasses to it.
For anaglyph you could implement it completely internally by rendering, for example in the case of a 720p output, a 2560x720 image with one view next to the other (side by side, left/right eye), combining them with a cyan-red anaglyph filter (which will barely affect game speed at all), and then optionally enhancing it with the recently added post-processing using a Dubois algorithm.
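A minimal per-pixel sketch of the red-cyan combination described above, using one commonly published set of Dubois least-squares coefficients (the function name and this particular coefficient set are illustrative assumptions, not anything taken from GSdx):

```python
def dubois_anaglyph(left_rgb, right_rgb):
    """Combine one left-eye and one right-eye pixel (RGB, each channel
    in 0..1) into a single red/cyan anaglyph pixel using Dubois-style
    least-squares matrices for red-cyan glasses."""
    # Left-eye matrix: mostly feeds the red channel.
    ML = (( 0.456,  0.500,  0.176),
          (-0.040, -0.038, -0.016),
          (-0.015, -0.021, -0.005))
    # Right-eye matrix: mostly feeds the green/blue (cyan) channels.
    MR = ((-0.043, -0.088, -0.002),
          ( 0.378,  0.734, -0.018),
          (-0.072, -0.113,  1.226))
    out = []
    for row_l, row_r in zip(ML, MR):
        v = sum(m * c for m, c in zip(row_l, left_rgb)) + \
            sum(m * c for m, c in zip(row_r, right_rgb))
        out.append(min(1.0, max(0.0, v)))  # clamp to displayable range
    return tuple(out)
```

In a real GSdx post-processing pass this mix would run as a pixel shader over the side-by-side image, but the per-pixel arithmetic is the same.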

I will post some PSOne and DS games in a bit, using the idea I have for overcoming hardware limitations.

(08-28-2012, 10:43 PM)Game Wrote: Do you need 3d monitor to see these images or just a red/blue glasses?

The answer is simple.

For Anaglyph 3D - Just a normal screen with your cheap red/cyan glasses.

For Stereoscopic 3D - Nvidia 3D Vision or any active shutter screen should work with the sequential-shift idea. (Hint: you can also use the other idea of the two images next to each other, but stacked one over the other instead, and use the same technique for passive displays at nearly zero speed cost, since every image gets rendered at half vertical resolution.)

Ah, PCSX2 is such an amazing emulator. 2012 and 2013 will be the years it causes a huge earthquake, since the technology for a PC that can handle PCSX2 perfectly is becoming very affordable! I hope this will draw more attention to this amazing project!
Reply
#14
Really cool. How did you do that? Did you cheat the game into rendering alternating angles, or is it a vertex reprojection in GSdx? How do you get the red/cyan colors? That's a GSdx hack too, huh? Is it a color write mask flipper plus blending?

just cause:

The red/cyan might actually work in one pass with a geometry shader for the reprojection: render every triangle twice, once with a reprojection matrix for each eye, plus a color write mask bit and some pixel shader madness to blend it. Of course this would definitely not play well with post-processing, but direct rendering should handle it nicely. Laugh
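A CPU-side Python sketch of the duplication step such a geometry shader would perform. The simple constant horizontal offset stands in for a proper per-eye reprojection matrix, and the function name, `eye_sep` parameter, and sign convention are all illustrative assumptions:

```python
def duplicate_for_eyes(triangle, eye_sep=0.03):
    """Emit each input triangle twice, once per eye.

    A constant horizontal shift stands in for a full per-eye
    reprojection matrix. The mask lists which color channels each copy
    may write (red for the left eye, green+blue for the right),
    mimicking a per-primitive color write mask."""
    left  = [(x - eye_sep, y, z) for (x, y, z) in triangle]
    right = [(x + eye_sep, y, z) for (x, y, z) in triangle]
    return [(left, ('R',)), (right, ('G', 'B'))]
```

In an actual geometry shader the two copies would be emitted in one pass, with the per-eye matrix applied in clip space and the write mask handled by the blend/output-merger state.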
Reply
#15
(08-29-2012, 01:29 AM)xstyla Wrote: Really cool. How did you do that? Did you cheat the game into rendering alternating angles, or is it a vertex reprojection in GSdx? How do you get the red/cyan colors? That's a GSdx hack too, huh? Is it a color write mask flipper plus blending?

just cause:

The red/cyan might actually work in one pass with a geometry shader for the reprojection: render every triangle twice, once with a reprojection matrix for each eye, plus a color write mask bit and some pixel shader madness to blend it. Of course this would definitely not play well with post-processing, but direct rendering should handle it nicely. Laugh

It's actually a script that changes a value in memory at 60Hz while taking 60 screenshots (= 1 second of 3D, though you can notice some jitter).

Also, from what I understand, you need Z-depth calculation for what you described above in order to achieve realtime anaglyph 3D. I don't remember whether the PS2 exposes all the functions needed to render that, but for some reason I think it doesn't. My idea was to overcome hardware-specific issues and force real 3D on consoles that give you limited stuff to work with, or where it is impossible (PSOne), nearly impossible (DS), or mostly a pain in the a** to implement such a thing (though the result will always be a slight step worse than what you would get if the hardware supported it).

Here are some examples of my idea taking effect in other emulators/consoles whose architecture restricts such a thing; take them as a reference for my theory, or just as eye candy (LOL):

Tekken 3 (PCSX-Reloaded - PSOne [console does not calculate Z depth at all]):
[Image: psoneb.jpg]

Kingdom Hearts Re:Coded (DeSmuME - NDS [same as the PSOne issue, plus some extra issues/restrictions with messing with everything related to how games get rendered internally]):
[Image: 49942512.jpg]

Kingdom Hearts: 358/2 Days (DeSmuMe - NDS)
[Image: ds2cd.jpg]

Blooper/PS:
This idea came to my mind from the trick I thought of for playing a couple of games at 120fps without speedup. XD
Reply
#16
Yeah, I examined it some more and I see. Looks like you're indeed changing the camera angles; that's why the FFX-2 shot's focal plane is offset. I guess you got a little unlucky there because the camera's rotation axis is bogus or floating freely, so this might never give a correct focal plane. Otherwise, though, those player-assigned look-at matrices and the centered yaw/pitch/roll matrices seem to work. I prefer the centered ones, because the depth gets bigger, i.e. you go into the screen. The look-at stuff is good for a holographic depth, but the UI depth is bogus: some close things are actually in the foreground while the UI sits at depth 0. Not really cool.

Also, something I noticed: if you do it like that, at the original 60 fps, switching even/odd red/cyan frames, there's a problem with character or general animation. You can see it in the Tekken 3 shot: there's inverse depth on Anna because she's moving/animating in between. See, the animation runs at 60 fps, and you shoot the red frame with one animation frame and the cyan with the next. That's not correct; the pose actually has to freeze for both frames.
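The pose-freeze fix described above can be sketched as a pairing rule for the captured animation frames (the function name and list-of-poses representation are illustrative, not the poster's actual script):

```python
def capture_pairs(poses):
    """Pair frames for red/cyan capture.

    The buggy approach pairs consecutive animation frames (left eye
    sees pose t, right eye pose t+1), which causes the inverse-depth
    artifact on moving characters. The fix freezes each pose and
    shoots both eye views from it."""
    buggy = [(poses[i], poses[i + 1]) for i in range(0, len(poses) - 1, 2)]
    fixed = [(p, p) for p in poses[::2]]
    return buggy, fixed
```

With the fixed pairing, both eye frames of each stereo pair sample the same animation pose, at the cost of halving the effective animation rate.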

One more thing: maybe you want to explain why the red/cyan sides (virtual depth) change on the panels on the right of the Tekken 3 shot? I don't even need the glasses to notice that. I guess there's still something wrong with your anaglyph script. Hmm...
Reply
#17
@xstyla

I could do the same without shifting the angle every frame if there were a way to implement it natively. The problem is that sometimes the speed drops below 60 fps and the shot tries to catch the next frame, so it misses a small gap in movement.

BTW, in the case of the Kingdom Hearts games, thanks to a game mechanic I can invert the 3D effect so it pops out of the screen instead of having depth behind it. It's simply the opposite idea, and it will fix the HUD stuff you may not like.

As for the Tekken shot, it's a PCSX bug caused by the widescreen hack...
Reply
#18
I would really like to re-play many great titles on PCSX2 in stereoscopic 3D!!
It would give us a whole new experience, just like being inside the games! (like Fatal Frame :-|)

I requested this a long time ago along with many fellow users:
http://forums.pcsx2.net/Thread-GSdx?pid=...#pid187134

I really wish it can work this time! I strongly believe the time is right, as I was recently able to get a GPU fast enough for S3D at $120 and a passive polarized 3D monitor at $200. We no longer need to spend a fortune on 3D Vision glasses. Really looking forward to any progress on this!
Reply
#19
Fatal Frame games have a game-breaking bug in hardware rendering mode, and the only way to render 3D is via hardware mode.
What I posted was an idea: set an fps limit of 30Hz (half speed), make a cheat that changes the viewing angle at 60Hz, and capture/render the frames. Doubling those numbers results in realtime 3D with no slowdown, but the whole thing needs to be branched internally so it uses the appropriate calls to the video card. With just a script, it won't work for anything other than capturing video/snapshots, which you then have to compile yourself.
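The pacing described above can be sketched as a frame schedule. Assuming the 30/60 split from the post, each game frame is rendered twice, once per eye (the function name and parameters are illustrative):

```python
def sequential_3d_schedule(game_fps=30, display_hz=60, seconds=1):
    """Schedule for the idea in the post: the game advances at half
    speed (30 fps) while the view angle flips every display refresh
    (60 Hz), so each game frame is rendered twice, once per eye."""
    frames = []
    renders_per_game_frame = display_hz // game_fps  # 2 for 30/60
    for game_frame in range(game_fps * seconds):
        for i in range(renders_per_game_frame):
            eye = 'L' if i % 2 == 0 else 'R'
            frames.append((game_frame, eye))
    return frames
```

Doubling both rates (60 fps game, 120 Hz display) keeps the same schedule shape and gives full-speed sequential 3D, which is exactly the "doubling those numbers" remark above.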
Reply
#20
Hi KLM, do you think we can persuade gabest to add stereoscopic 3D to his GPU plugin?
PCSX2 is almost perfect, except that S3D is not ready yet.
With it, it would surpass the experience we can get from the original PS2, or even the PS3 remakes, in every way! :-D

However, rather than frame interpolation (which may cause trouble when the framerate is not stable), I would prefer a row-interlace display mode (breaking the two frames into lines and packing them in an interlaced manner). There are now many passive polarized TVs/monitors on the market at a reasonable price compared with active 3D devices, and there is no screen-flipping issue. I was surprised that even though the vertical resolution is halved, it is not noticeable at a normal, comfortable viewing distance. Row interlace enabled me to play Xenoblade with Dolphin in 3D at 1080p for hours daily without headaches :-P
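A minimal sketch of the row-interlace packing described above: even output rows come from the left eye, odd rows from the right, so each eye keeps full width at half vertical resolution. A real implementation would filter/downscale rather than simply decimate, and the names here are illustrative:

```python
def row_interleave(left_rows, right_rows):
    """Pack two equal-height eye images into one frame for a passive
    polarized display: even output rows from the left eye, odd rows
    from the right. Rows can be any per-scanline pixel data."""
    assert len(left_rows) == len(right_rows)
    out = []
    for y in range(len(left_rows)):
        # Decimate: keep each source's rows only on its assigned parity.
        out.append(left_rows[y] if y % 2 == 0 else right_rows[y])
    return out
```

On a passive display the polarizing film assigns alternating scanlines to each lens, so this packed frame is all the display needs.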

If z-depth can be added back, then S3D can be enabled by the driver immediately, and the driver can control depth/convergence at any time. This should be our ultimate goal, but it really needs gabest's help :-(

Great work, man!!!
Reply



