Xenosaga Episode III - Also Sprach Zarathustra [SLUS 21389] (U)
Intel Core i7 920 @ 3.4 GHz | 6 GB DDR3 RAM in Triple Channel | GeForce GTX 285
2.5 TB Hard Drive Space | Windows 7 Ultimate x64

Official betatester of PCSX2


PC Specs:

CPU: Intel Core i7 920 @ 3.4 GHz
GPU: Nvidia GTX 285 768MB

Build Description: PCSX2 1.1.0 r5517 [17 01 2013] - Windows
GSdx 5464 SSE41 [09 12 2012]
LilyPad 0.11.0 [28 12 2012]
cdvdGigaherz 0.8.0 [28 12 2012]
USBnull Driver 0.7.0 [28 12 2012]
SPU2-X 2.0.0 [17 01 2013]
DEV9null Driver 0.5.0 [28 12 2012]
FWnull Driver 0.7.0 [28 12 2012]
USA v02.00(14/06/2004) Console
DC and MTGS | VU1Rec and VU0Rec | rCache 0 | Console 1 | Patches 0

EE/VU Clamp modes: Normal/Normal
EE/VU Rounding: Chop/Chop

Speedhacks Used:


Gamefixes Used:


Amount of testing done (little/medium/much/completed-game): medium


Status is unchanged. Runs at 200% speed.



PC Specs:
CPU: AMD Ryzen 7 4700U (8 cores, 8 threads; a mid-2020 APU for ultrabooks)
GPU: Radeon Integrated Graphics
Laptop Model: 14ARE05 (Lenovo Flex 5)
RAM: 16GB soldered

Build Description: PCSX2 1.7.0 Linux (64-bit build; it's all I could find in the repositories. Admittedly, this is a non-standard build)
Linux Distro: Manjaro 20.2.0 Nibia, KDE Edition (Manjaro is an Arch Linux derivative)
Kernel: 5.11.0 pf, Zen2 build (specifically tailored for AMD's Zen 2 architecture, though that's likely not important. It's a non-standard kernel that has worked best on my system. I also tested some real-time kernels and the standard kernel, and they made no difference for PCSX2 and this game)
The Linux version can use both software and hardware hacks at the same time, which I noticed the Windows version does not allow. Sometimes I swear that selecting one option stops other options from being applied, but PCSX2 isn't upfront about what contradicts what.

BIOS Used: USA v1.06
Video Plugin: GSdx 64-bit (GCC 10.2.0 AVX/AVX) 1.2.0 [libGSdx-AVX2]
EE/VU Clamp modes: Full and Extra+Preserve Sign respectively
EE/VU Rounding: Chop/Chop
GSdx Settings: OpenGL Hardware, 3x Native, Interlacing (Automatic), Bilinear (PS2) Texture Filtering (works the best). Only OpenGL Hardware or Software is present in the Linux version of PCSX2.

Amount of testing done (little/medium/much/completed-game): Medium. I've done pretty extensive testing on the first level and a little afterwards, both in Linux and in Windows on this computer (to be posted later), and with Linux on my previous computer. I did a lot of testing while saving in the middle of cutscenes to see which options could fix the many cutscene issues, as those seem to be the game's biggest weak point when emulated on Linux. I'll continue through the game and report what I encounter.

Speedhacks Used:
I set the CPU cycle rate to "Normal". I didn't need any extra power, just compatibility. I noticed the emulator struggles with timing which bit of the game to play and when, so I decided to mess with anything timing-related as little as possible, just in case.
Enabling Wait Loop Detection seemed to help with some stuttering in the game- or it could just be my imagination. Either way, it doesn’t hurt.
I’ve found that the game doesn’t seem to like EE cycle skipping, so I set that to 0. It can handle being set to 1 to a certain extent if you don’t check the wrong boxes after that, but the game absolutely cannot handle anything higher than that without significant graphical issues.
Under microVU hacks, I did not select the "MVU Flag Hack", as that could sometimes cause graphical issues. I noticed that the game runs smoother with the MTVU hack enabled. I have multiple cores, and using them works better than not using them. It also fixed some flickering caused by the "Aggressive" CRC hack option, to be mentioned later. Somehow, despite being a "hack", it actually increased compatibility elegantly without any drawbacks. "Instant VU1" was not selected.
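For reference, the speedhack choices above map roughly onto the [EmuCore/Speedhacks] section of PCSX2_vm.ini. This is only a sketch; the key names and enabled/disabled values are recalled from 1.4/1.6-era ini files and may differ in 1.7, so verify against your own file before editing anything by hand.

```ini
; Sketch of PCSX2_vm.ini entries matching the settings described above.
; Key names are assumptions from older builds -- check your own ini first.
[EmuCore/Speedhacks]
EECycleRate=0        ; "Normal" EE cycle rate
EECycleSkip=0        ; cycle skipping off; the game glitches above 1
WaitLoop=enabled     ; Wait Loop Detection, seemed to reduce stuttering
vuFlagHack=disabled  ; MVU Flag Hack off; it can cause graphical issues
vuThread=enabled     ; MTVU; also fixed the "Aggressive" CRC flickering
vu1Instant=disabled  ; Instant VU1 not selected
```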
Gamefixes Used: None

I’ve done A LOT of trial and error trying to get things to work as best as they can because I’m nutty that way and I’m a glutton for punishment. A few extra tips:
GSdx Config (Video Plugin Settings):
Enable user hacks, you’ll need it.
I didn’t see much extra benefit from Accurate DATE rendering and GPU Palette Conversion, but I kept them on for possible compatibility reasons.
I didn't notice any benefit from Anisotropic Filtering, likely because it's meant to make textures look good at extreme angles; Xenosaga has a fixed camera system, so that situation pretty much never comes up, and even where it should have helped, it didn't seem to. I went all the way up to 16x once and it crashed my game. Dithering doesn't seem to make much of a difference scaled or unscaled, so I turned it off. Keep mipmapping on, the game looks bad without it, but anything above "Basic" doesn't appear to add any extra benefit.
The CRC Hack Level is important. FMVs have a tendency to do strange things in this game. Regular gameplay works smoothly and beautifully, but different parts of cutscenes can do weird things. Flashback scenes have the most glitches, with odd black boxes appearing and lines down the middle of the screen. In normal cutscenes, colors and models will have a strange doubling effect, but when the camera switches to another angle they'll render just fine.

You have multiple options to fix this. You can use software rendering, which fixes these issues completely (but also means you can't select a higher resolution), or you can use the "Aggressive" CRC hack setting. The Aggressive setting makes cutscenes work a lot better, but it also removes the sepia tone filter. Once I did this, some orange and yellow flickering appeared in the bottom half of the screen; somehow, enabling the MTVU option in the Speed Hacks section solved that flickering. There were still occasional flashes in cutscenes, and setting blending accuracy to at least "Medium" solved that (admittedly minor) issue. Thus far in my playthrough that has appeared to iron out all the bugs in most cases (just no sepia tone filter, lol).

Giant robot battles work very well but slow down when you activate effects-heavy animations like Anima. They weren't game-breakingly slow or glitchy, thankfully, but no matter how many options I experimented with, I couldn't get rid of that slowness without decreasing the resolution.

My computer has 8 CPU threads, and the recommended setting is the number of threads minus 2, so I set it to 6.
This game DOES NOT like Auto Flush; do not, by any means, use Auto Flush. It caused horrible slowdown on my old computer, and on this computer as well.
On my previous computer the game didn't seem to like Edge Anti-Aliasing. It works just fine on this one, though in most cases it won't do much. It makes town maps look better, although it does cause a little bit of stuttering.
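The thread rule of thumb above (renderer threads = logical CPU threads minus 2) can be sketched in a few lines. The helper name is mine, not anything from PCSX2; it just encodes the arithmetic:

```python
import os

def recommended_threads(logical_threads=None):
    """Rule of thumb from this post: set the software-renderer thread
    count to the number of logical CPU threads minus 2 (minimum 1)."""
    if logical_threads is None:
        logical_threads = os.cpu_count() or 2
    return max(1, logical_threads - 2)

print(recommended_threads(8))  # 8-thread CPU -> 6
```

The minus-2 leaves headroom for the EE and GS threads the emulator itself runs.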

Advanced Settings Tab:
From what I've read elsewhere online, force-disabling Half-Screen Fix Detection will help some people. Check "Disable Safe Features", as it helps some textures render on the over-map in towns.
"Merge Sprite" is one of several options that help keep the sprites in the game working. No single sprite-enhancing option seems to noticeably work on its own, but selecting "Merge Sprite" and setting "Round Sprite" to "Full" seems to help.
Turn on Trilinear Filtering; it prevents half the screen from changing to a darker color in some cutscenes.
Half-Pixel Offset should be set to "Special (Texture)"; otherwise, odd black lines and shadows will appear on the floor. Any setting above or below that, and odd graphical glitches pop up.
Several sources I found suggested setting the texture offset Y-coordinate to 1000. For me, that made things really blurry in some areas, like in towns on the over-map. But textures do have a tendency to do strange things in this game, so fiddling with this could work if some blurry doubling of textures appears. I set my X coordinate to 25 and my Y coordinate to 100; I think it made the image sharper, but it's hard to tell.
I found no use in selecting the OpenGL Custom Settings.
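Put together, the renderer-hack choices above correspond roughly to these GSdx.ini entries. The key names and integer values are assumptions recalled from older GSdx builds (the offset modes are stored as small enums), so treat this as a map of which knobs were touched rather than something to paste verbatim:

```ini
; Sketch of GSdx.ini user-hack entries matching the settings above.
; Key names/values are assumptions -- confirm against your own GSdx.ini.
UserHacks=1                      ; master switch for the hacks below
UserHacks_merge_pp_sprite=1      ; "Merge Sprite"
UserHacks_round_sprite_offset=2  ; "Round Sprite" = Full
UserHacks_TriFilter=1            ; Trilinear Filtering on
UserHacks_HalfPixelOffset=2      ; "Special (Texture)"
UserHacks_TCOffsetX=25           ; texture offset X
UserHacks_TCOffsetY=100          ; texture offset Y
```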
Post-Processing Tab:
Keep "Texture Filtering of Display" on and select the "FXAA Filter". I detected no compatibility issues keeping them on, and the game looks weird without those two settings.

Audio Settings:
Set latency to around 120 ms, which should help with audio syncing improperly. Audio settings aren't anywhere near as important, and you could probably make do with whatever. I noticed PulseAudio didn't always work well on my computer, so I switched to PortAudio (Cross-Platform) and set the API to "ALSA". That's likely a very computer-specific thing, though.
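Those audio choices would look roughly like this in SPU2-X.ini; the section and key names here are assumptions recalled from older SPU2-X builds, so check the spelling in your own file:

```ini
; Sketch of SPU2-X.ini entries for the settings above (names assumed).
[OUTPUT]
Latency=120              ; ~120 ms buffer; helped keep audio in sync here
Output_Module=portaudio  ; PulseAudio was unreliable on this machine
[PORTAUDIO]
HostApi=ALSA             ; Linux-specific backend choice
```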

The game won't detect my memory card, so I have to use the built-in save-state function in PCSX2. The game also noticeably slows down around save points, a bit more than on the original hardware. If the game does detect a memory card at all, it will say it's trying to format the card, but then fail every time. I spun up the Windows version of PCSX2 and the memory card feature worked just fine.

Bugs that are NOT PCSX2 bugs:
On my PS2, loading a boss battle or cutscene would take just long enough to make you start to wonder if something went wrong. I found it kind of funny that PCSX2 would sometimes do the same thing. I wouldn't consider these compatibility issues, as they are present in the original game.

On the PS2, during the first level after the boss fight, Shion calls her E.S. in a cutscene, which I distinctly remember always causing a slight pause until the next section of the cutscene played. PCSX2 instead continued the cutscene fluently: a rare instance of the emulator doing a better job than the original hardware, compatibility-wise.

On the PS2 the game could be a little slow around save points, so trying to emulate an already unstable part of the game is likely what is causing the slowdown. It's not 100% PCSX2's fault that save points often don't work; I have replayed the game many times on the PS2, and weird things did tend to happen with them. I imagine bugs on top of bugs would be a sticky problem to untangle.
Please read the previous post about my laptop. Everything I said there still applies; I'll just note where things changed here.

PC Specs:
HP Omen Desktop Computer with Hana motherboard
CPU: AMD Ryzen 5 5600G (6 cores, 12 threads; it's an APU, but the integrated graphics are disabled for some stupid reason)
GPU: Nvidia RTX 3060
RAM: 16 GB
OS: Windows 10
PCSX2 Build: 1.7 AVX2 version

BIOS Used: USA v1.06
EE/VU Clamp modes: Default
EE/VU Rounding: Default
GSdx Settings: OpenGL Hardware, 6x Native, Interlacing (Automatic), Bilinear (PS2) Texture Filtering (works the best).

Amount of testing done (little/medium/much/completed-game): Almost completed. As of now, I'm about to start planet Michtam, the final level. I hear the Zarathustra dungeon can introduce some new glitches, and I'll report on that when the time comes.

Speedhacks Used:
I previously did a lot of testing on my laptop, only on the early levels; this time around I tested on a desktop PC with a graphics card, and I'm nearly all the way through the game. Things went a lot more smoothly. I had no need for speedhacks. I did test out a few, but I didn't find any discernible difference with any of them.
I can play the game at 350 FPS, and even if I add a bunch of shaders and max out the settings, the numbers don't change. The main issue was compatibility, not speed; I'll get to how to fix a few of these issues in a moment.

Graphics Settings:
I set most everything to the default options.
The game runs best on the “Aggressive” CRC hack option. Unfortunately, it does remove some of the filters and shaders originally in some cutscenes. Particular cutscenes set in the past are no longer in a sepia tone, and the dream-like sequences with Professor Mizrahi look normal. Very little is sacrificed by just using the Aggressive setting.
Some reflective surfaces looked pixelated and strange. Not terribly distracting though. And shadows looked odd too sometimes in cutscenes. No setting I used fixed the issue. Also, during close-ups, the background of characters would sometimes turn black. I didn’t bother trying to fix that problem by changing any setting, so maybe there is a solution. Both these graphical issues are rare and don’t really affect the experience. Overall, cutscenes had the most graphical glitches. Regular gameplay was perfect most of the time.
I played the game primarily using OpenGL. Considering how fast I can run the game on my beefy desktop computer, I was surprised that OpenGL still stuttered, especially when particle effects or other special effects were on-screen. The scenes with U-DO in them with his red wave form, and the special effects from attacks in the E.S.’s have noticeable slow-down. I can still turbo through these things, I have plenty of power at my disposal, it’s just a problem with everything syncing up in the right way I suppose. It’s worse at some points than others.
DirectX ran smoother and at a more consistent framerate, but it also had more graphical issues, one of which was too egregious to ignore. The screen would turn dark, then bright, then dark again, and brighter objects on screen would turn even brighter when the screen went dark. No matter how many boxes I checked or hacks I used, I couldn't get that glitch to go away. My laptop had the same issue. It was consistent all the time and drove me crazy. Apparently the problem isn't present in older versions of DirectX, like DirectX 9.

Advanced Settings and Hacks:
A black box on the ground appears in both DirectX and OpenGL. The fix in DirectX was the "Normal (Vertex)" option for Half-Pixel Offset; OpenGL needed "Special (Texture)". Another option for OpenGL was to check "Preload Frame Data" under Rendering Hacks, which wasn't necessary on my AMD laptop with integrated graphics. I used both it and Special (Texture) just in case.
I checked "Disable Safe Features"; it helped with some graphical glitches. Force-disabling Half-Screen Fix Detection helped in some cutscenes. I turned on Trilinear Filtering as well. Checking "Merge Sprite" and setting "Round Sprite" to "Full" didn't seem to do much this time around. It didn't harm anything either, and apparently it helps other people on other types of computers.

A few extra notes:
The Xenosaga series is known for causing crashes at save points and corrupting save files. Remember to use PCSX2's save state function.
I could just be superstitious, but unchecking NTFS file compression for memory cards seemed to solve some of the save instability I found. The Dev version has memory card patches that significantly stabilized my issues with saving the game, so I'd recommend that version for the Xenosaga series.
Also, the emulator only shows a pixelated image on the in-game save screen rather than a screenshot. I wonder if that function is causing some undefined behavior in the game, at least for Xenosaga 3.
So I tested the Zarathustra dungeon on my HP Omen desktop, and there were a large number of graphical issues. Xenosaga 3 is terrible at shadows and reflective surfaces; thankfully, that doesn't typically affect the quality of the experience much. The Zarathustra dungeon is full of reflective surfaces, though, so it was an absolute mess graphically. Switching to software rendering solved the issue; no other option worked, which was really too bad. The game also crashed near a save point there, and it had never crashed even once before. Some reflections just had a big black box in them with ugly yellow pixelated textures. The problem looked similar-ish to a graphical bug found in the SpongeBob game on the Dolphin emulator. Maybe it's a related issue? https://dolphin-emu.org/blog/2021/09/07/...1/#cuthere
DirectX was especially buggy in the Zarathustra dungeon; I absolutely do not recommend the DirectX hardware renderer there. The screen kept stuttering while using it, and it had all the same graphical glitches as OpenGL. I did manage to finish the game all the way through, though, so the game is completable.

The Vulkan Renderer and the Nightly Build:
Not too long ago, after I started replaying Xenosaga Episode 3, PCSX2 came out with the Vulkan renderer, and the development builds are now called Nightly builds. I did some preliminary testing on the most recent Nightly build as of 1/17/2022. The Nightly build also crashes sometimes; previous development builds were more stable, but I like what they're doing here, and I have faith they'll iron out the bugs. The scene where Shion gets out of her E.S. at Pedea beach crashes in every renderer.

I tested the Vulkan renderer on my HP Omen desktop with its Ryzen CPU and Nvidia RTX 3060. I didn't expect much, but I'm impressed it works as well as it does for being so new, and I used the exact same settings as OpenGL. Across the three computers I've tested Xenosaga Episode 3 on, black boxes appearing on the ground seem to be a commonly occurring issue. With DirectX and OpenGL the problem is solvable, but I was not able to solve it with Vulkan on my HP Omen. Again, it's very, very new. Vulkan wasn't faster than OpenGL most of the time, but it did deliver a more consistent framerate; I just had to ignore the black boxes on the floor. I tested and compared both Vulkan and OpenGL at a variety of resolutions.

The results with Vulkan on my AMD laptop were more interesting. It only has integrated Radeon Vega graphics, and I had to select a lot of options to get Vulkan to work. For some odd reason, the graphics would start out super buggy if I booted the game in Vulkan, but if I booted in OpenGL first and then switched to Vulkan, it would mostly be fine. I won't go into deep detail, as the renderer is very new and a lot of these bugs are likely to be ironed out whether I mention them or not. At 1080p, OpenGL chugs and Vulkan is perfectly smooth, and the gap widens at even slightly higher resolutions. This laptop can't do 1440p or higher on OpenGL, but it managed up to 90 FPS on Vulkan at 1440p, admittedly with a skipped frame or two.

Also, the black box problem I encountered on my Nvidia graphics card didn't appear on my AMD iGPU with the typical settings I like to use. Having the accuracy of OpenGL with the smoothness of DirectX is really nice, and I think Vulkan will eventually provide that. The Steam Deck and the upcoming Samsung Galaxy phone should do really well with Vulkan thanks to their AMD graphics. Vulkan grew out of AMD's Mantle API, after all, and it really shows here. Very much looking forward to the results from those.
