Let’s say this is true. Why don’t you explain to us what it means, citing specific examples in games where it has affected you?
Fewer extensions can mean reduced compatibility. I haven't tried many GL games on ATi hardware, so I can't say it has affected me.
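To show what I mean by compatibility, here's a minimal sketch of how a GL renderer from that era typically picks a code path (generic illustration, not any particular game's code; the extension names are real, but the fallback structure here is assumed for the example, and it presumes a current GL context):

```c
/* Minimal sketch (not from any particular game): a GL renderer checks
 * the driver's extension string and falls back when something is
 * missing. Fewer extensions exposed means more fallbacks, or no path
 * at all. Assumes a current GL context has already been created. */
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && strstr(ext, name) != NULL;  /* naive match, like many old games */
}

void pick_render_path(void)
{
    if (has_extension("GL_ARB_fragment_program")) {
        printf("using the per-pixel fragment program path\n");
    } else if (has_extension("GL_NV_register_combiners")) {
        printf("using the vendor-specific fallback path\n");
    } else {
        printf("using the plain multitexture fallback\n");
    }
}
```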
Uh, no. You’re confusing two different issues, so let me explain them to you. The console command (e.g. bilinear) caused both vendors to lose performance because the game went from per-texture AF to global AF. ATi had an IQ driver bug with the per-texture AF which was later fixed with a 1 FPS performance hit. The other optimization was ATi replacing texture lookups with shader ALU calculations because it was faster on their hardware. For all intents and purposes the IQ was practically identical. Yeah, it’s an application specific optimization but both vendors do things like that these days. You going to start a thread complaining about nVidia’s 3DMark shader substitution (which actually caused IQ to change), and the insertion of static clip planes?
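To make the "texture lookup vs. shader ALU" point concrete, here's a rough CPU-side sketch of the idea (illustrative only, not id's or ATi's actual shader code; the table size and exponent are made-up values): a specular-style falloff can either come from a precomputed lookup table, which is what a texture fetch into a lookup texture gives you, or be computed directly with pow(), and the two results are practically identical.

```c
/* Illustrative sketch only -- not id's or ATi's actual code.
 * A specular falloff can be sampled from a precomputed lookup table
 * (what a TEX instruction into a lookup texture does) or evaluated
 * directly with pow() (what the ALU replacement does). */
#include <math.h>
#include <stdio.h>

#define TABLE_SIZE    256      /* assumed table resolution */
#define SPEC_EXPONENT 16.0f    /* assumed exponent, for illustration */

static float spec_table[TABLE_SIZE];

static void build_spec_table(void)
{
    for (int i = 0; i < TABLE_SIZE; i++) {
        float x = (float)i / (float)(TABLE_SIZE - 1);
        spec_table[i] = powf(x, SPEC_EXPONENT);
    }
}

/* "Texture lookup" version: nearest-sample the precomputed table. */
static float specular_from_table(float n_dot_h)
{
    int idx = (int)(n_dot_h * (TABLE_SIZE - 1) + 0.5f);
    return spec_table[idx];
}

/* "ALU" version: compute the same curve directly. */
static float specular_from_alu(float n_dot_h)
{
    return powf(n_dot_h, SPEC_EXPONENT);
}

int main(void)
{
    build_spec_table();
    for (float x = 0.0f; x <= 1.0f; x += 0.25f)
        printf("%.2f: table=%.4f alu=%.4f\n",
               x, specular_from_table(x), specular_from_alu(x));
    return 0;
}
```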
nvidia's not perfect, and yes, the 3DMark app-specific optimizations nvidia has used pissed me off, but I've always thought nvidia's IQ has been better, at least to me. I also haven't played Doom 3 on ATi HW, so I can't comment on that. Maybe the 6850's filtering quality is just as good, but I haven't seen a 6850 in action, so I'm commenting on the 5770's awful filtering quality, which I believe was hardware-based, not due to driver optimizations.
You'll have to elaborate on nvidia's use of static clip planes; that would help me out.
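(For context while that gets answered: a static clip plane is basically a hard-coded plane equation that culls everything on one side of it, tuned to a benchmark's fixed camera path, so geometry that would never be on screen along that path is never drawn. In plain fixed-function GL terms it would look roughly like the sketch below; this is purely an illustration of the concept, not what the driver actually did, and the plane coefficients are made up.)

```c
/* Rough illustration only -- not nVidia's driver code. A "static"
 * clip plane is a plane equation hard-coded for one known camera
 * path: everything on the negative side of Ax + By + Cz + D = 0 is
 * discarded before rasterization. Move the camera off that fixed
 * rail and chunks of the scene visibly disappear. */
#include <GL/gl.h>

void enable_static_clip_plane(void)
{
    /* Hypothetical plane coefficients chosen for one camera path. */
    const GLdouble plane_eq[4] = { 0.0, 0.0, -1.0, 50.0 };

    glClipPlane(GL_CLIP_PLANE0, plane_eq);
    glEnable(GL_CLIP_PLANE0);
}
```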
I had thought that id enabled something that reduced performance on ATi hardware at the time because nvidia paid them to, and that when it was disabled, ATi's performance was better at no IQ cost. That was what I was talking about. I also remember that when I had a 5770, linear_mipmap_nearest or something like that was the best it could do, while the nvidia hardware could do linear_mipmap_linear, and the 5770 shimmered like crazy in MDK2, a GL game (the only one I tried), while it doesn't shimmer on nvidia hardware.
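For reference, the difference between those two modes comes down to one texture parameter. Here's a generic GL sketch (not MDK2's actual code): LINEAR_MIPMAP_NEAREST bilinearly filters within a single mip level, while LINEAR_MIPMAP_LINEAR (trilinear) also blends between the two nearest mip levels, which hides the banding and much of the shimmer at mip transitions.

```c
/* Generic sketch (not MDK2's actual code): the two minification
 * filters being compared. GL_LINEAR_MIPMAP_NEAREST = bilinear within
 * one mip level; GL_LINEAR_MIPMAP_LINEAR = trilinear, blending
 * between the two nearest mip levels as well. */
#include <GL/gl.h>

void set_min_filter(int trilinear)
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    trilinear ? GL_LINEAR_MIPMAP_LINEAR
                              : GL_LINEAR_MIPMAP_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
```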
You haven’t seen sub-par quality in Doom 3 until you’ve seen it on an nVidia 7000 card with default driver texture quality settings. Wall surfaces would wave like flags during movement, while floors shimmered like a swarm of ants.
I put it on HQ and only got shimmering with AF enabled when I had a 6800GT. I never used default quality settings.
As for the t-junctions, I don't know because nvidia was reported not to have them. That was why I was asking about them.
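(For anyone unfamiliar with the term: a t-junction is where a vertex from one triangle sits in the middle of a neighboring triangle's edge without the neighbor sharing that vertex; rasterization rounding along the shared edge can then differ between the two sides and open sub-pixel cracks, which show up as sparkles or shimmer. A bare-bones data sketch, purely illustrative, with made-up coordinates:)

```c
/* Purely illustrative: a t-junction in mesh data. Triangle A's edge
 * (0,0)-(2,0) is shared with triangles B and C, but B and C introduce
 * an extra vertex at (1,0) on that edge. Because (1,0) is not a
 * vertex of A, rounding along the shared edge can differ between the
 * two sides and open a sub-pixel crack. */
typedef struct { float x, y; } vec2;

/* Triangle A: one long edge from (0,0) to (2,0). */
static const vec2 tri_a[3] = { {0, 0}, {2, 0}, {1, 1} };

/* Triangles B and C split that same edge at (1,0) -> t-junction. */
static const vec2 tri_b[3] = { {0, 0}, {1, 0}, {0.5f, -1} };
static const vec2 tri_c[3] = { {1, 0}, {2, 0}, {1.5f, -1} };
```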
Don't get me wrong, nvidia isn't all that great either. We need a 3rd (other than intel) or even 4th player in the GPU industry. There are many things about nvidia's drivers that piss me off (no choice for forcing all existing RGBA and D formats, for example). I don't play games as much as I used to, because the technology games actually use falls well short of what could be used. I'm sick of all the lossy compression, for example; I'd rather just have smaller textures and use up 2x as much HDD space for lossless audio in games.

Another example is that there aren't any good monitors, and all the HDTVs suck because they always have high input lag, are limited to 60 Hz, and few have RGB LED backlights. My Apple LED Cinema Display is decent, but it still kind of sucks: it has at least ~12 ms of input lag, a pretty poor contrast ratio, it's limited to 60 Hz, and it doesn't have a very high color gamut. But at least it's an H-IPS panel (no banding, excellent viewing angles), it's glossy, and its input lag isn't as bad as an HDTV's would be, even in game mode.