Anarchist420
Diamond Member
What are they and why did/does ATi have a problem with them while nvidia doesn't?
I heard from someone at Beyond 3D (in 2008) that ATi had trouble with T junctions, whatever they are.
Source? Or is this just FUD?
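For reference, since nobody in the thread ever defines the term: a T junction is formed when a vertex of one triangle lies partway along the edge of a neighbouring triangle instead of at a shared corner. The two triangles rasterize that edge from different endpoints, so floating-point rounding can leave pixel-wide cracks or sparkles along the seam. A minimal brute-force detector might look like the sketch below; every name and the epsilon are illustrative assumptions, not code from any particular engine.

```cpp
// Sketch: flag T junctions in an indexed triangle mesh.
// O(vertices x triangles) brute force; real tools use spatial hashing.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Distance from point p to segment ab; t reports where p projects (0..1).
static float distToSegment(Vec3 p, Vec3 a, Vec3 b, float& t) {
    Vec3 ab = sub(b, a);
    float len2 = dot(ab, ab);
    if (len2 <= 0.0f) {                 // degenerate edge: measure to a
        t = 0.0f;
        Vec3 d = sub(p, a);
        return std::sqrt(dot(d, d));
    }
    t = dot(sub(p, a), ab) / len2;
    Vec3 c = {a.x + t * ab.x, a.y + t * ab.y, a.z + t * ab.z};
    Vec3 d = sub(p, c);
    return std::sqrt(dot(d, d));
}

// A vertex sitting on the *interior* of someone else's edge, without being
// one of that edge's endpoints, is a T junction.
std::vector<int> findTJunctions(const std::vector<Vec3>& verts,
                                const std::vector<int>& tris,  // 3 ids per tri
                                float eps = 1e-5f) {
    std::vector<int> bad;
    for (int v = 0; v < (int)verts.size(); ++v) {
        bool flagged = false;
        for (size_t i = 0; i + 3 <= tris.size() && !flagged; i += 3) {
            for (int e = 0; e < 3 && !flagged; ++e) {
                int a = tris[i + e], b = tris[i + (e + 1) % 3];
                if (v == a || v == b) continue;     // shared corner is fine
                float t;
                if (distToSegment(verts[v], verts[a], verts[b], t) < eps &&
                    t > eps && t < 1.0f - eps) {    // strictly between ends
                    bad.push_back(v);
                    flagged = true;
                }
            }
        }
    }
    return bad;
}
```

The fix is the inverse operation: split the long edge at the offending vertex so both triangles share it, which is what mesh cleanup passes typically do.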
OpenGL is dead.... Linux users? lulwut? who games on linux? 😛
also troll post?
I thought Nvidia were the ones with bad OpenGL performance... but like I said... hardly any games use it nowadays.
That doesn't mean that countless classics don't use it. That doesn't mean that Rage won't be using it.
ATi's had superior OpenGL performance to nVidia for years in gaming, especially in Doom 3 engine games. I'm talking about a 6850 outrunning my GTX580 in Doom 3 at 2560x1600 with 8xMSAA (128.8 vs 114 FPS).
Fermi parts are particularly slow in OpenGL, with the GTX470 being a substantial performance downgrade from a GTX285.
The classic source-ports often have performance problems on nVidia's parts, especially on Fermi derivatives. Even elementary rendering such as smoke or spark effects can cause performance to plummet.
This isn't meant to be a smart ass question, but why does it matter if one runs at 129 fps while another runs at 114 fps? Unless it were an F-Zero port with adjusted physics, I couldn't care less if I was getting 60 or 120 fps. Of course, that's just me and a few other people. Well, as long as IQ and compatibility are still perfect, I don't really care if I'm getting 40 fps when I could be getting 60, unless the hardware is slowing down because of a bad PSU or a bad GPU fan or something like that. nvidia has more GL extensions than ATi does anyway, don't they?
Doesn't Quake Wars use OpenGL? Also, I know back in the day when Doom 3 first came out, nvidia cards ran it MUCH better. Of course these were the 6800 and 7800 series.
Anyways, QW does run fine on my xfire system, at 2560x1600 with 8x or 16x AA I believe. Still get hundreds of FPS in most cases.
IIRC, ATi got faster frame rates if some command was entered into the console. Even ATi was stumped by it and spent forever trying to reduce IQ in Doom 3 in exchange for performance, but to no avail. Their IQ in Doom 3 was subpar, partially because of the app-specific optimizations and partially due to ATi's higher use of general optimizations (z-buffer and texture optimizations that I've noticed in the past) that significantly reduce IQ.
What single test ever gave you the idea that ATi's OpenGL performance is superior, or that Fermi parts are particularly slow?
Some of your arguments seem to have no basis in reality, but seem founded on fuzzy warm feelings and hope.
You take no issue with nVidia's $500 MSRP flagship GTX580 being slower than ATi's $179 MSRP mid-range part, but you feel concerned enough about T junctions to start a thread about it? LOL.
Let's say it's true that nVidia has more GL extensions. Why don't you explain to us what it means, citing specific examples in games where it has affected you?
Uh, no. You're confusing two different issues, so let me explain them to you. The console command (e.g. bilinear) caused both vendors to lose performance because the game went from per-texture AF to global AF. ATi had an IQ driver bug with the per-texture AF which was later fixed with a 1 FPS performance hit. The other optimization was ATi replacing texture lookups with shader ALU calculations because it was faster on their hardware. For all intents and purposes the IQ was practically identical. Yeah, it's an application-specific optimization, but both vendors do things like that these days. You going to start a thread complaining about nVidia's 3DMark shader substitution (which actually caused IQ to change), and the insertion of static clip planes?
You haven't seen sub-par quality in Doom 3 until you've seen it on an nVidia 7000 card with default driver texture quality settings. Wall surfaces would wave like flags during movement, while floors shimmered like a swarm of ants.
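To make the per-texture vs. global AF point above concrete, this is roughly what the distinction looks like at the API level; a minimal sketch assuming the EXT_texture_filter_anisotropic extension, where the choice of which texture gets which treatment is my illustration, not Doom 3's actual code.

```cpp
// Sketch: per-texture AF vs. a forced global override.
#include <GL/gl.h>
#include <GL/glext.h>

void bindDiffuseMap(GLuint tex) {
    glBindTexture(GL_TEXTURE_2D, tex);
    // Per-texture AF: only detail-bearing surface maps pay the extra
    // sampling cost.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);
}

void bindLookupTexture(GLuint tex) {
    glBindTexture(GL_TEXTURE_2D, tex);
    // Lookup tables (e.g. a specular falloff ramp) gain nothing from AF,
    // so the game leaves them at plain linear filtering.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 1.0f);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
}
```

A console or driver override that forces AF globally applies the first treatment to every texture, including ones like the second, which is why it cost both vendors performance.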
Fewer extensions can mean reduced compatibility. I haven't tried many GL games on ATi hardware, so I can't say it has affected me.
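Concretely, an extension-count gap only turns into a compatibility gap when a game probes for a specific extension and falls back (or bails) if it's missing. A sketch of the era-appropriate check, where the extension and fallback named are just examples:

```cpp
// Sketch: classic GL 1.x/2.x extension-string query.
#include <GL/gl.h>
#include <cstring>

bool hasExtension(const char* name) {
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    // Crude substring test; production code should tokenize on spaces so
    // "GL_EXT_foo" doesn't match "GL_EXT_foobar".
    return all && std::strstr(all, name) != nullptr;
}

// Typical use: pick a fallback path when the driver lacks something.
// if (!hasExtension("GL_ARB_vertex_buffer_object")) { useVertexArrays(); }
```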
nvidia's not perfect and yes, the 3DMark app-specific optimizations nvidia has used pissed me off, but I've always thought nvidia's IQ has been better, at least to me. I also haven't played Doom 3 on ATi HW, so I can't comment on that. Maybe the 6850 has as good of filtering quality, but I haven't seen a 6850 in action, so I'm commenting on the 5770's awful filtering quality, which I believe was hardware-based, not due to driver optimizations.
I put it on HQ and only got shimmering with AF enabled when I had a 6800GT. I never used default quality settings.
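One classic mechanism behind the "default quality vs. HQ" shimmering being described here is a negative texture LOD bias: it selects sharper mip levels than the pixel footprint supports, which looks crisp in screenshots but makes surfaces crawl in motion. A sketch of the knob itself, where the preset values are my assumption and not any vendor's real numbers:

```cpp
// Sketch: LOD bias as a quality/sharpness trade-off.
// GL_TEXTURE_LOD_BIAS is core since OpenGL 1.4.
#include <GL/gl.h>
#include <GL/glext.h>

void applyTextureQualityPreset(bool highQuality) {
    // 0.0f = mathematically correct mip selection (stable image).
    // Negative values trade temporal stability for apparent sharpness.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS,
                    highQuality ? 0.0f : -0.5f);
}
```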
http://www.geeks3d.com/20110617/tes...gl-tessellation-with-recent-catalyst-drivers/
I just googled "open gl performance" and searches from the last month turned that thing up.
What an increase...!?