[Purple windows / underside of cars] Burnout Revenge/Burnout 3 graphics/texture glitch
#41
Drivers usually don't mess up graphics that way, and the plugin uses standard rendering formats and techniques. But it does use different code paths for transforming PS2 textures depending on CPU capability, ranging from SSE2 to SSE4.1.

There goes your hated AMD driver bug, Squall. Tongue
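The per-CPU code paths mentioned above boil down to runtime ISA dispatch: detect what the CPU supports once, then call through a function pointer. A minimal sketch of that pattern (the function names are illustrative, not GSdx's actual symbols):

```cpp
#include <cstddef>
#include <cstdint>

using TransformFn = void (*)(const uint8_t* src, uint8_t* dst, size_t n);

// Baseline path; SSE2 is guaranteed on every x86-64 CPU.
static void transform_sse2(const uint8_t* src, uint8_t* dst, size_t n) {
    for (size_t i = 0; i < n; ++i) dst[i] = src[i];  // plain copy stands in for real work
}

// Stand-in for a faster SSE4.1-specialized path.
static void transform_sse41(const uint8_t* src, uint8_t* dst, size_t n) {
    for (size_t i = 0; i < n; ++i) dst[i] = src[i];
}

// Chosen once at startup based on what the running CPU reports.
static TransformFn select_transform() {
#if defined(__GNUC__) || defined(__clang__)
    if (__builtin_cpu_supports("sse4.1"))
        return transform_sse41;
#endif
    return transform_sse2;
}
```

The point of doing the check once and keeping a function pointer is that the hot texture loop stays branch-free; only the selected path ever runs on a given machine.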


#42
(08-27-2011, 02:22 AM)xstyla Wrote: Drivers usually don't mess up graphics that way, and the plugin uses standard rendering formats and techniques. But it does use different code paths for transforming PS2 textures depending on CPU capability, ranging from SSE2 to SSE4.1.

There goes your hated AMD driver bug, Squall. Tongue

He hates AMD as much as he loves Intel and Nvidia combined.

Why don't you just give some facts and proof?
#43
It's most definitely just a GSdx SSE2 bug, and maybe a little AMD-architecture-specific, but nothing that couldn't be solved. I have a little time and would try, as a hobby coder on an AMD system, to help fix that specific bug, but I can't afford to buy (or, obviously illegally, download) tons of games just for testing that sh!t. Also, a one- or two-frame dumping feature for debugging (yeah... I thought about that) isn't available, nor does it make sense to upload and download megabytes of data for it.
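For what it's worth, the dump itself would be the easy part; the size is the problem. A hypothetical helper of the kind described (GSdx has no such user-facing feature, as the post says):

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical debug helper: write one raw RGBA8888 framebuffer to disk so a
// glitched frame could be diffed byte-for-byte against a known-good dump.
static bool dump_frame(const char* path, const uint32_t* pixels, int w, int h) {
    FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    const size_t count = static_cast<size_t>(w) * static_cast<size_t>(h);
    const size_t written = std::fwrite(pixels, sizeof(uint32_t), count, f);
    std::fclose(f);
    return written == count;
}
```

Even a single 640x448 frame at 32 bpp is 640 * 448 * 4 ≈ 1.1 MB uncompressed, so a useful set of dumps really does mean shuffling megabytes around, which is exactly the objection above.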

This is where it goes nowhere.

fish... yeah I felt like posting that. Smile
#44
(08-27-2011, 02:22 AM)xstyla Wrote: Drivers usually don't mess up graphics that way, and the plugin uses standard rendering formats and techniques. But it does use different code paths for transforming PS2 textures depending on CPU capability, ranging from SSE2 to SSE4.1.

I just put you on ignore because your post was completely devoid of reality.

You might want to get off that cocaine, man.

AMD has a history of problems with common texture formats (ones that even Intel gets right).
#45
(08-27-2011, 03:40 AM)Squall Leonhart Wrote: I just put you on ignore because your post was completely devoid of reality.

You might want to get off that cocaine, man.

AMD has a history of problems with common texture formats (ones that even Intel gets right).

Ohh wow. Maybe you took one snort too many of your "nvintel owns everything" attitude.

Maybe we should meet outside and settle this thing with some beats. Muhaha. Laugh

Fish... I don't really give a sh!t about ya, and this is out of the scope of this thread anyway. In the future you should bring some real facts and proof instead of your half-assed info. You're a little spoiled kid... aren't you?!

Still, no offense. Just having a little fun. Wink
#46
You must be looking for a ban; being an idiot isn't an offence, but bad language is.

My reference to Intel was on the basis of their lackluster graphics hardware and drivers in days long past.

Your ignorant belief that AMD has no problems with standardised texture formats is the real problem. I don't mind using Intel drivers these days; the recent hardware isn't as fast, but it's at least capable of rendering things to the same level as an Nvidia card. Hell, I'd even use S3's low-performing Chrome series just because they get their drivers right.

AMD/ATI drivers choke on a lot of common texture formats; to this day many game developers have to adapt their texture handling on a per-GPU basis just because the Radeon drivers have either graphical anomalies or performance problems.

Code:
case GR_TEXFMT_ARGB_1555:
    if (ati_sucks > 0) return -1; // ATI sucks as usual (fixes slowdown on ATI)
    factor = 2;
    *gltexfmt = GL_RGB5_A1;
    *glpixfmt = GL_BGRA;
    *glpackfmt = GL_UNSIGNED_SHORT_1_5_5_5_REV;
    break;
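The `return -1` in that snippet bails out of the native 1555 upload so a fallback can take over. The usual driver-independent escape hatch for such a case is to expand each ARGB1555 texel to 8-bit-per-channel RGBA in software and upload a plain GL_RGBA8 texture instead. A minimal sketch of that conversion (not the quoted project's actual fallback code):

```cpp
#include <cstdint>

// Expand one ARGB1555 texel (A:1, R:5, G:5, B:5) to RGBA8888.
// Replicating the top bits into the bottom bits maps 0x1F to exactly 0xFF,
// avoiding the slight darkening a plain left-shift would cause.
static uint32_t argb1555_to_rgba8888(uint16_t px) {
    uint32_t a = (px >> 15) & 0x01;
    uint32_t r = (px >> 10) & 0x1F;
    uint32_t g = (px >> 5)  & 0x1F;
    uint32_t b =  px        & 0x1F;
    r = (r << 3) | (r >> 2);
    g = (g << 3) | (g >> 2);
    b = (b << 3) | (b >> 2);
    a = a ? 0xFFu : 0x00u;
    return (a << 24) | (b << 16) | (g << 8) | r;  // R lands in the low byte
}
```

This trades 2x memory per texel for a format every driver handles well, which is exactly the kind of per-GPU workaround the post describes.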
#47
Indeed.
#48
(08-27-2011, 04:20 AM)Squall Leonhart Wrote: You must be looking for a ban; being an idiot isn't an offence, but bad language is.

My reference to Intel was on the basis of their lackluster graphics hardware and drivers in days long past.

Your ignorant belief that AMD has no problems with standardised texture formats is the real problem. I don't mind using Intel drivers these days; the recent hardware isn't as fast, but it's at least capable of rendering things to the same level as an Nvidia card. Hell, I'd even use S3's low-performing Chrome series just because they get their drivers right.

AMD/ATI drivers choke on a lot of common texture formats; to this day many game developers have to adapt their texture handling on a per-GPU basis just because the Radeon drivers have either graphical anomalies or performance problems.

Code:
case GR_TEXFMT_ARGB_1555:
    if (ati_sucks > 0) return -1; // ATI sucks as usual (fixes slowdown on ATI)
    factor = 2;
    *gltexfmt = GL_RGB5_A1;
    *glpixfmt = GL_BGRA;
    *glpackfmt = GL_UNSIGNED_SHORT_1_5_5_5_REV;
    break;

Where did you get that code?
#49
[Image: itsasecrettoeverybody.png]
#50
Lol, you asked for it. That code snippet you copy-pasted from Google is ridiculous to me, because:

We all know that ATI didn't, and still doesn't, write good OpenGL drivers. I noticed that again myself recently, watching some of the latest hardcore demoscene prods. That's no secret, and no reason to hate them, because most, if not all, top-notch PC games, and even GSdx, use the far better supported DirectX, where those OpenGL incompatibilities simply don't exist. And as for that format: RGBA_5551 isn't even commonly used by developers. That's a rare-case scenario.

Point for me. Tongue



