Yeah, more times than I probably should lol... I just get a bit frustrated when the issue gets spun out of proportion like it so often does. Not implying you were trying to spin it, though; it just seems so much about this issue is misunderstood and misconstrued for the sake of making one company look better or worse than the other.
Yeah, things like this have a bad tendency to spiral down into the gutter and drift quite a bit off course from the original issue.
Exactly :thumbsup:
I think the best thing an enthusiast can do for him/herself here is to understand the technology, how it's used, and its limitations. Then each individual can form a personal opinion on what they prefer for their gaming/computing experience. I've already said a lot of this is very subjective, and the people who want to label one card/method/company "good" or "bad" either clearly don't understand the technology or issues at hand, or are purposefully distorting them for some other agenda. Either way, it's a disservice to the community. The objective here is to learn and gain some perspective, not to shove your opinion down someone else's throat.
Given enough time, perhaps a review of how each company does its AF texture filtering could be done, with images at each setting shown to see the differences, if any at all. Not necessarily trying to compare AMD to Nvidia directly per se, since that will spiral downward rather quickly, but just a review of each: what happens to IQ when the optimizations are enabled/disabled, and their respective performance impacts as well. I would enjoy reading something like that quite a bit.
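For what it's worth, the image-quality half of that comparison is pretty easy to script. Here's a minimal sketch of one way to diff two screenshots of the same scene, one captured with the driver optimizations enabled and one with them disabled. The filenames are just placeholders, and it assumes Pillow and NumPy are installed and both captures are the same resolution:

```python
# Diff two screenshots of the same scene: optimizations on vs. off.
# Filenames are hypothetical placeholders for your own captures.
from PIL import Image, ImageChops
import numpy as np

opt_on = Image.open("af_optimized.png").convert("RGB")
opt_off = Image.open("af_high_quality.png").convert("RGB")

# Per-pixel absolute difference; identical frames produce an all-black image.
diff = ImageChops.difference(opt_on, opt_off)

arr = np.asarray(diff, dtype=np.float32)
print(f"Mean per-channel difference: {arr.mean():.3f} (0-255 scale)")
print(f"Max difference: {arr.max():.0f}")

# Amplify subtle differences so filtering artifacts are visible to the eye.
amplified = np.clip(arr * 8, 0, 255).astype(np.uint8)
Image.fromarray(amplified).save("af_diff_amplified.png")
```

An all-black diff means the optimization had no visible effect on that frame, while amplifying the difference makes subtle stuff like AF banding on angled surfaces much easier to spot than eyeballing the raw screenshots side by side.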
I agree. Under the "there's more than one way to skin a cat" philosophy, it'd be interesting to see a breakdown of the different approaches available. However, I understand why a lot of review sites wouldn't do it, as it's extremely time-consuming and not really high-yield unless you find something revealing.
There's no magic "fair" comparison. AMD and Nvidia will likely always have differences in their output, and just using "highest" or some other generalized quality setting still won't tell you much. Reviews should certainly do output-quality comparisons, and most do, but producing identical output from both AMD and Nvidia across all testing would take an absurd amount of tweaking and time.
Time is definitely a factor. Furthermore, I don't want to see review sites remove driver optimizations in their reviews by using "high quality" settings only. These companies invest a hell of a lot of time creating these optimizations, and they're done, in good faith, to improve performance without impacting image quality. Forcing the drivers to run without optimizations really defeats the purpose of good drivers/driver updates, and I'm not paranoid or arrogant enough to think myself better than either company's team of engineers.