Originally posted by: Gstanfor
The optimizations themselves are not hardwired in, but rather a different way (compared to previous nvidia GPUs) of performing AF is hardwired into nvidia's GPUs. This was in direct response to ATi's AF optimizations starting with R200 and the way consumers reacted to them. You can read nvidia's statement on this subject for yourself.
of course it's in the hardware. by design the hardware is limited to angle-dependent AF, which is an optimization.
'angle-dependent' means that (at a basic level), rather than applying whatever level of AF the user designates via the driver to the entire scene, the fx/nv40/g7x cards from nvidia use an angle-dependent mechanism in hardware to determine how much filtering a particular area requires.
each poly within the scene is 'examined' and, depending on the angle at which it 'slopes', it (the hardware, not the user via the driver cp) chooses an 'appropriate' level of filtering, from 2x all the way up to whatever level is specified by the user via the driver cp or the application.
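to make that concrete, here's a rough sketch in python of the kind of selection logic being described. this is purely illustrative, not nvidia's actual hardware behaviour - the function name, the thresholds, and the surface_angle_deg parameter are all made up just to show the idea of clamping AF based on surface slope:

```python
# Illustrative sketch of angle-dependent AF level selection.
# NOT nvidia's real logic; thresholds and names are hypothetical.
# Surfaces sloping near the "preferred" axes get the full user-requested
# level, while off-axis surfaces are silently clamped to a lower level.
def select_af_level(surface_angle_deg, user_max_af=16):
    # Distance (in degrees) from the nearest axis-aligned orientation.
    off_axis = min(surface_angle_deg % 90, 90 - (surface_angle_deg % 90))
    if off_axis < 10:        # near axis-aligned: full filtering
        return user_max_af
    elif off_axis < 30:      # moderately sloped: partially reduced
        return min(user_max_af, 8)
    else:                    # ~45-degree slopes: clamped hardest
        return min(user_max_af, 2)

# Example: the user requests 16x AF in the driver cp, but a 45-degree
# surface only ever receives 2x, regardless of that setting.
print(select_af_level(45, user_max_af=16))  # -> 2
print(select_af_level(0, user_max_af=16))   # -> 16
```

the point is that the clamp happens per-surface in hardware, below the level the user can reach from the driver cp.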
that is certainly an 'optimization', and it runs regardless of whether HQ is selected or not -
meaning the user cannot under any circumstances turn it off. despite your reluctance to admit it, that proves you wrong.
now if you want to actually change the subject and start discussing where this 'method' became prevalent, then yes, it was with the r300 (9700) and continued through the x850 series (there's actually a couple-year-old thread here where i compared the x800 and 6800 opts, and while they both used it, it was pretty clear nv40 still offered better AF). frankly, until ati's r5xx, the last decent card for texture filtering was the GeForce 4 - every card from then until the x1k series offered very poor texture filtering, regardless of whether it was red or green.
however, neither the 6800GT nor the x800 showed the 'shimmering' issue; that came about with whatever changes nv made in their 7xxx series. in fact, it was outright terrible until (iirc) one of the 78.xx series drivers was released. that reduced the shimmering when using HQ mode (the default mode was still terrible), but it did not eliminate it.
as far as nvidia's statement goes, i don't need to read it, as i am very familiar with how this all transpired, having owned each gen of hardware from both ati and nvidia since the geforce2 and radeon 32 DDR.
Oh, and BFG10K keeps comparing his G71 to his X800 in this thread (and others), so R3/4xx comparisons most certainly are valid for this thread, and precedent for this was NOT set by me...
depends on what you're comparing. the x800 doesn't shimmer; the g71 does. OTOH you're participating in this discussion from a fan/loyalist point of view, and rather than looking at the facts objectively, you are jumping around topics, generations, etc. in an effort to make excuses (and failing rather miserably, i might add).
ati is not without its faults, but on this particular subject it's nvidia that fails, not ati. your constant rhetoric and fallacies do not change this fact, nor do they change the fact that you are emphatically incorrect in your assertion that selecting HQ from the nv CP turns off all texture filtering optimizations - it can't, because unlike ati's current cards, g70/71's hardware is incapable of applying full AF across the entire scene.