ATI cheating in AF?


ChkSix

Member
May 5, 2004
192
0
0
To sum this up, what I think they are really saying is:

Regardless of what the user sets in the CP, we switch things on and off within a program so that speed and FPS can be maintained consistently throughout the performance envelope. And if the end user doesn't notice or take the time to investigate the transitions or mode mixing, and the IQ stays as evenly distributed as possible, then it isn't wrong.

Even though, when tests and benchmarks are done, they ARE done for specific modes...

I've got a bridge to sell anyone interested, with a nice view of Lady Liberty. :D
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: hysperion
chksix hit the nail on the head....do you really think ATI would put this algorithm on the 9800 series of cards when they were already handily beating the GeForce FX series? Of course not...they put it on now to compete/win.

What? You're saying that, even though they had something that would have given them a HUGE edge over the GeForceFX cards, they would have left it out? Why? This has been around in the 9600-series cards since Cat 3.4 (that's last April) -- certainly it would have been a very desirable feature for the last year!

And Matthias, the first paragraph I wrote in my last post was: "The problem with the adaptive algorithm isn't that it isn't an effective method. For all I or anyone else knows, it doesn't hurt the IQ. In fact, if ATI has developed an adaptive method that dramatically improves performance, kudos to them."

Yet you still responded: "If their adaptive method increases performance but doesn't lower IQ, is it still wrong??"

Yes, because you seemed to be taking two different viewpoints.

NO!, not if they tell you they are doing it...but when they tell you they are doing full trilinear, which is what benchmarks use to show how powerful the hardware is, but it turns out they are not....that is cheating....If it isn't cheating, why wouldn't they be hyping their new algorithm as a new feature? Obviously they didn't feel like telling people about it for the reasons I've posted....

ATI maintains that what they're doing is as good as "full" trilinear -- in fact, in some situations, it falls back to doing that anyway (such as with user-specified mipmaps). Now, I haven't seen anyone show yet whether or not it is really as good. It's certainly close -- very close in many instances, if not identical. But, my question stands -- if their adaptive algorithm is just as good as "full" trilinear in terms of IQ, would using it when the user calls for "full trilinear" be 'cheating'?

I am very disappointed in ATI for doing this without telling anyone, but it doesn't make their benchmarks invalid (at least not when compared against the 6800 with its 'brilinear' turned on). And if the IQ of it matches up with "full" trilinear very closely, and the 6800 with 'brilinear' doesn't, I'm not sure it's fair to bench *those* against each other either. But this starts to get very thorny, and is part of why IQ benchmarking is so difficult.
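
For anyone who hasn't followed the 'brilinear' debate closely, here is a rough sketch of the difference in C++. This is my own simplified illustration, not anyone's actual driver or hardware code; the blend-band width and the dummy texture fetch are invented purely to make the shapes visible. Full trilinear blends the two nearest mip levels across the whole LOD range, while a 'brilinear'-style filter only blends in a narrow band around the mip transition and does plain bilinear everywhere else.

```cpp
#include <cstdio>
#include <cmath>

struct Color { float r, g, b; };

// Dummy stand-in for a real bilinear texture fetch: returns a solid grey
// whose brightness depends only on the mip level, so the blend is visible.
Color sampleBilinear(int mipLevel, float /*u*/, float /*v*/) {
    float g = 1.0f / (1.0f + static_cast<float>(mipLevel));
    return { g, g, g };
}

Color lerp(const Color& a, const Color& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// "Full" trilinear: always blend the two nearest mip levels by the
// fractional LOD.
Color trilinear(float lod, float u, float v) {
    int   base = static_cast<int>(std::floor(lod));
    float frac = lod - static_cast<float>(base);
    return lerp(sampleBilinear(base, u, v), sampleBilinear(base + 1, u, v), frac);
}

// "Brilinear"-style filtering: only blend inside a narrow band around the
// mip transition; elsewhere use plain bilinear from the nearer level.
// The band width of 0.25 is a made-up number, purely for illustration.
Color brilinear(float lod, float u, float v, float band = 0.25f) {
    int   base = static_cast<int>(std::floor(lod));
    float frac = lod - static_cast<float>(base);
    if (frac < 0.5f - band) return sampleBilinear(base, u, v);
    if (frac > 0.5f + band) return sampleBilinear(base + 1, u, v);
    float t = (frac - (0.5f - band)) / (2.0f * band);   // steep ramp across the band
    return lerp(sampleBilinear(base, u, v), sampleBilinear(base + 1, u, v), t);
}

int main() {
    // Print the blend result across one mip interval to show the difference:
    // full trilinear ramps smoothly, brilinear stays flat and then ramps steeply.
    for (float lod = 0.0f; lod <= 1.0f; lod += 0.125f)
        std::printf("lod %.3f  trilinear %.3f  brilinear %.3f\n",
                    lod, trilinear(lod, 0, 0).r, brilinear(lod, 0, 0).r);
    return 0;
}
```

ATI's adaptive method presumably sits somewhere between these two, varying how much blending it skips based on its analysis of the texture, which is exactly what makes comparing it to either "full" trilinear or NVIDIA's brilinear so awkward.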
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: LTC8K6
Adaptive algorithm with bi-linear (performance) and tri-linear (quality) options

That phrase appears to be in the specs on ATI's site for all of its DX9 cards, 9500 and up.
I have not seen it for the DX8 cards.

This, I'm pretty sure, just refers to the fact that it can vary from bilinear to various amounts of trilinear filtering based on the driver settings. AFAIK, only the 9600s and X800s use the adaptive trilinear mechanism that everyone is talking about here.

Note this also:

SMOOTHVISION 2.1 employs an adaptive algorithm that takes from 1 to 16 filtered samples per pixel as required to achieve ideal quality, without wasting effort on parts of the image that would not benefit.

SMOOTHVISION is ATI's name for their multisample antialiasing (MSAA) algorithm. It has nothing to do with texture filtering.
 

ChkSix

Member
May 5, 2004
192
0
0
I am very disappointed in ATI for doing this without telling anyone, but it doesn't make their benchmarks invalid

I think many others might disagree here bro. Me personally, I think it makes their benchmarking very invalid. But like you said, it does become extremely difficult as well as controversial now to get accurate results.

In the end, anyone can try to paint a duck to look like a sheep, but it will remain a duck regardless of the coats of paint applied to it. Nvidia did it with their NV3x and got beat badly for it, and good for ATi then. But now, ATi not only incriminates itself with its own answers, but openly admits it switches AA, AF and filtering modes regardless of what the CP settings are. I wonder how anyone can benchmark one of their cards in anything other than no AA/Ani/Filtering and be confident with their end results.

As for your algorithm question: of course it is cheating if the other guy is using the full trilinear method and losing performance over it while ATi mode-switches to keep up FPS and gain an edge. FPS (and IQ) alone can convince someone to purchase product A over product B if it comes out ahead on the same or most 'benchmarks'. It is also cheating because it switches independently and cannot be toggled on or off by the end user. Regardless of what is selected within the control panel, it is going to do its own thing, and that my friend is outright lying and cheating to the consumers who think they are getting 100fps at 4AA/8Anio, when in fact they might be getting 100FPS at something different. It would be different and not considered cheating if, one, they came out and said it from the beginning, and two, the user had the ability to turn it on or off to benchmark more precisely against a competing product.

If I were Nvidia, I would be stirring up a Sh|t Storm regarding this now. And if they can prove their performance coincides and the mode remains constant with what the end user selects in the control panel, then say goodnight to ATi, because it just took one big left hook to the upper temple this round.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I wouldn't really consider it cheating if they had informed the reviewers of this. There is clearly something going on here. So while the 6800 was forced to use full trilinear, the X800 got away with brilinear. Sure, it may look good, but it isn't trilinear.

Now there was at least one review that allowed brilinear to be used on the 6800, and the 6800 looked quite good. Not that it doesn't now, but let's just say the performance advantage the X800 had over it was nullified.

In the future I think it is safe to say that using Nvidia's brilinear against ATI's brilinear is as close as you'll get.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ChkSix
I am very disappointed in ATI for doing this without telling anyone, but it doesn't make their benchmarks invalid

I think others might disagree bro. Me personally, I think it makes their benchmarking very invalid. But like you said, it does become extremely difficult as well as controversial now to get accurate results.

It might invalidate results against NVIDIA doing full trilinear, but certainly not against NVIDIA using brilinear. It certainly complicates matters.

In the end, anyone can try to paint a duck to look like a sheep, but it will remain a duck regardless of the coats of paint applied to it. Nvidia did it with their NV3x and got beat badly for it, and good for ATi then. But now, ATi not only incriminates itself with its own answers, but openly admits it switches AA, AF and filtering modes regardless of what the CP settings are. I wonder how anyone can benchmark one of their cards in anything other than no AA/Ani/Filtering and be confident with their end results.

Where did they say they were switching AA and AF modes? Both NVIDIA and ATI use adaptive AA (ie, MSAA as opposed to SSAA), and NVIDIA uses adaptive AF as well with the 6800-series cards (though not with the FX-series).

Edit: you changed half your post, now I have to rewrite mine. Thanks.

As for your algorithm question: of course it is cheating if the other guy is using the full trilinear method and losing performance over it while ATi mode-switches to keep up FPS and gain an edge. FPS (and IQ) alone can convince someone to purchase product A over product B if it comes out ahead on the same or most 'benchmarks'. It is also cheating because it switches independently and cannot be toggled on or off by the end user. Regardless of what is selected within the control panel, it is going to do its own thing, and that my friend is outright lying and cheating to the consumers who think they are getting 100fps at 4AA/8Anio, when in fact they might be getting 100FPS at something different. It would be different and not considered cheating if, one, they came out and said it from the beginning, and two, the user had the ability to turn it on or off to benchmark more precisely against a competing product.

That wasn't my question. Obviously, if it's gaining speed by degrading IQ, and there's no way to turn it off, that's a Bad Thing, and is giving ATI an unfair edge in the benches. But nobody seems to have noticed anything more than *very* slightly degraded IQ on the X800 cards (I read Tom's Hardware's review, and the differences in their screenshots are slight, and in some cases they're identical). Certainly ATI's adaptive trilinear looks a *lot* better than NVIDIA's 'brilinear' filtering.

If ATI's adaptive trilinear is just as good as their (and/or NVIDIA's) "full" trilinear in terms of image quality, would it be a valid optimization to substitute it in when the user asks for "full" trilinear?

In fact, this same thing goes for any sort of IQ features -- if I ask for 4xAA/8xAF, and they do something that's *not* conventional AA/AF (but looks the same), is that OK? Does it matter if they tell me? Does it matter if I can turn it off and get conventional AA/AF?

If I were Nvidia, I would be stirring up a Sh|t Storm regarding this now. And if they can prove their performance coincides and the mode remains constant with what the end user selects in the control panel, then say goodnight to ATi, because it just took one big left hook to the upper temple this round.

Based on looking at AF benches where they turned on brilinear on the 6800, I think ATI still compares favorably. The 6800 gains very little in terms of FPS by enabling brilinear filtering (of course, I think the quality of it has improved as well from the FX-series cards).
 

ChkSix

Member
May 5, 2004
192
0
0
It might invalidate results against NVIDIA doing full trilinear, but certainly not against NVIDIA using brilinear. It certainly complicates matters.

You're right.

Where did they say they were switching AA and AF modes? Both NVIDIA and ATI use adaptive AA (ie, MSAA as opposed to SSAA), and NVIDIA uses adaptive AF as well with the 6800-series cards (though not with the FX-series).

Andy/Raja Would you say that our AF is not "full" AF? After all, we've been using an adaptive method for this since the R200. If you select 16x in the control panel, you may get 8x, 4x, or 2x depending on how steeply angled the surface is. Doing 16x AF on a wall you're viewing straight on would look exactly the same as no AF, but require 16x more texture samples (hence a speed decrease). Why would it make any sense to do this? This is exactly the same idea we're using for trilinear filtering. (From ATi's own 'open chat' dialogue)

Nvidia uses this approach as well? I wasn't aware of that. Thank you for the information. (I am being serious, not facetious)

If ATI's adaptive trilinear is just as good as their (and/or NVIDIA's) "full" trilinear in terms of image quality, would it be a valid optimization to substitute it in when the user asks for "full" trilinear?

Only if the option was left open to the end user in my opinion.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: Matthias99

If ATI's adaptive trilinear is just as good as their (and/or NVIDIA's) "full" trilinear in terms of image quality, would it be a valid optimization to substitute it in when the user asks for "full" trilinear?


No. The only valid method would be to offer options for both true trilinear and brilinear and let the user choose.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Full Transcript:

Q (TheRock): Why are you not doing full AF?
A (Andy/Raja): Would you say that our AF is not "full" AF? After all, we've been using an adaptive method for this since the R200. If you select 16x in the control panel, you may get 8x, 4x, or 2x depending on how steeply angled the surface is. Doing 16x AF on a wall you're viewing straight on would look exactly the same as no AF, but require 16x more texture samples. Why would it make any sense to do this? This is exactly the same idea we're using for trilinear filtering.
Q (rush): Why are your trilinear optimizations different to what Nvidia is doing?
A (Andy/Raja): We won't comment on competitor's algorithms. Our focus is to retain full image quality while offering the best performance possible.

No one is suggesting that our texture filtering algorithms produce a worse output than previous generations. In fact, if we took them out now the speed would be marginally less and we would receive complaints from users that our quality is lower. We did receive feedback from several folks who think that the X800 IQ is better than our 9800 series - it is always our goal to improve quality in newer hardware generations. To assist in this we have many more quality controls in hardware on the X800 series than on the 9800.

Our target is also to avoid any need to detect applications, and as such we have to try to be sure that our image quality remains high in all cases. To achieve this we spent a lot of effort developing algorithms to make the best use of our quality tuning options. This isn't a performance enhancement applied to popular gaming benchmarks; it's a performance boost, without image degradation, applied generically to every game that uses box-filtered mipmaps, which is most of them. This means, incidentally, that it's working during the WHQL tests (unlike optimizations activated by application detection), which means that it has to meet the very stringent image quality requirements of those tests.
Q (TheRock): Will full trilinear filtering be allowed to be set in the drivers?
A (Andy/Raja): We try to keep the control panel as simple as possible (even as its complexity increases), and if the image quality is identical or better there doesn't seem to be a need to turn it off. However, if our users request otherwise and can demonstrate that it's worthwhile, we would consider adding an option to the control panel.
Q (singer): Is this really trilinear filtering?
A (Andy/Raja): Yes, it's a linear function between the two mipmap levels based on the LOD.
Q (gs): When will ATI restore full trilinear so that review sites can actually rebench and retest your cards, since any previous review benchmarks are invalidated by this cheat/optimisation/whatever?
A (Andy/Raja): We have never removed "full trilinear". We certainly do not believe that any benchmarks have been invalidated by our techniques. In all cases reviewed so far we believe that we have higher image quality than other implementations.
Q (Sphinx): Is ATI cheating if it only shows true full trilinear AF when coloured mipmaps are enabled, like the article at ComputerBase.de describes?
A (Andy/Raja): Absolutely not. If it were the case that we were only performing full trilinear with coloured mipmaps then you might have a point, but this is emphatically not what we do. We take advantage of properties of texture maps for performance and IQ gains. In cases where we are not able to determine that the texture content is appropriate for these techniques we use legacy trilinear filtering. This includes cases such as dynamically uploaded texture maps where we avoid performing analysis so as not to cause any possible gameplay hitches.
Q (volt_Bjorn3D): I think the whole community appreciates the time and the initiative in doing a chat about the current filtering algorithm, which certainly raised a few issues. Question: was the new filtering algorithm intentional, and could you tell us why review sites weren't notified about it? Thanks
A (Andy/Raja): We are constantly tuning our drivers and our hardware. Every new generation of hardware provides the driver with more programmability for features and modes that were hard-wired in the previous generation. We constantly strive to increase performance and quality with every new driver release. There are literally dozens and dozens of such optimizations that went into our drivers in the last year or so. Sometimes many such optimizations are not even communicated internally, to marketing and PR teams for example. And many optimizations are very proprietary in nature and we cannot disclose them publicly anyway.
Q (astaroth): Is X800's hardware able to do "traditional" trilinear or is the new algorithm completely hardwired (not that I would mind ) ??
A (Andy/Raja): The X800 hardware is capable of all features of previous generations, and many more besides.
Q (alp): Is this http://www.ixbt.com/video2/images/r...r420-anis0x.jpg (copy and paste it) bilinear or bri/trilinear as it is supposed to be? I heard it is possibly a bug in CoD causing it to set filtering to bi rather than a really bad trilinear filtering method; is this true?
A (Andy/Raja): This we believe is test error, and the X800 images appear to be obtained using only a bilinear filter. We have been unable to reproduce this internally. Also, note that the game defaults to bilinear when a new card is installed, and this may explain the tester error.
Q (Singer): Why did ATI say to the general public that they were using trilinear by default, when in fact it was something else? (Quality is OK, I agree, but you did deceive by claiming it to be trilinear.)
A (Andy/Raja): We understand that there was confusion due to the recent reports otherwise. We provide trilinear filtering in all cases where trilinear filtering was asked for. As has been demonstrated many times by several people, almost every piece of hardware has a different implementation of LOD calculation and filtering calculations. If we start giving all the existing filtering implementations different names, we will end up with many names for trilinear.
Q (Wer): If bit-comparison difference images can highlight IQ differences, surely there must be some - why do you say there are no IQ differences when these comparisons show otherwise?
A (Andy/Raja): The bit-comparison differences between implementations occur for many reasons. We constantly make improvements to our hardware algorithms. Bit comparisons just say they are different, not necessarily that one is better than the other. We are always on the lookout for cases where we can find IQ problems with our algorithms. We can guarantee you that there will be bit-wise mismatches with our future generation hardware too, and the future generation hardware will be better. Our algorithms are exercised by the stringent MS WHQL tests for mipmap filtering and trilinear, and we pass all these tests. These tests do not look for exact bit matches but have a reasonable tolerance for algorithmic and numeric variance.
Q (crushinator): Is this algorithm implemented in hardware? Who's analysing the texture maps - is it just the driver doing that, or is it the chip?
A (Andy/Raja): The image analysis is performed by the driver to choose the appropriate hardware algorithm. This allows us to continually improve the quality and performance with future drivers.

Q (Bouncing Zabaglione Brothers): If it's so good, why has it remained hidden from the public and not been marketed as "ATI SmartFilter" or somesuch? Surely if it's as good as you say (better quality, faster speed), ATI marketing should be crowing about it? One of the issues here is that it *looks* like ATI is trying to hide things, even if what you have is a genuine improvement for the customer.
A (Andy/Raja): The engineering team at ATI is constantly improving our drivers by finding ways to take better advantage of the hardware. These improvements happen across all the Catalyst releases. We might have missed an opportunity to attach a marketing buzzword to this optimization!
Q (Hanners79): Can you give a more detailed explanation as to why the use of coloured mipmaps shows the use of full trilinear, which doesn't correspond to what seems to occur in a normal, real-world situation?
A (Andy/Raja): Coloured mipmaps naturally show full trilinear as our image quality analysis reveals that there could be visible differences in the image. It should be noted that trilinear was originally invented to smooth transitions between mip-levels, and in the original definition mip-levels should be filtered versions of each other, as coloured mip-levels clearly are not. Despite this, we understand that people often make use of hardware for purposes beyond that originally envisioned, so we try to make sure that everything always functions exactly as expected.
Q (Cthellis): From previous comments, there has been mention of this technique being used for a while (~12 months?) in the RV300 series of chips, but mostly with the R420. Can you tell us which cards, which Catalyst versions, and/or which games exist where we can see similar tendencies?
A (Andy/Raja): We have had new filtering algorithms in place since Cat 3.4 or so. Note that the image quality has improved over various driver updates since. Also, as noted earlier, the X800 provides better controls than earlier parts. It will be hard to find an exact match with our earlier hardware and drivers.
Q (mabru): Don't you think such a tradeoff is inconceivable in a $500 graphics card?
A (Andy/Raja): We think that what we do is expected of all our cards, in particular the more expensive ones. Our users want the best looking results and the highest quality results. They want us to go and scratch our heads and come up with new ways to improve things. Users of ATI cards from last year want us to come out with new drivers that improve their performance and maintain the image quality. We have dedicated engineering teams that work hard to improve things. It's an ongoing effort, exploring new algorithms, finding new ways to improve the end user experience, which is what all this is about. And we are listening too; if you don't like what we offer, let us know and we will strive to improve things.
Q (anonymouscoward): Image quality is a relative term. The real question is, "does the claimed 'trilinear filtering' produce a byte-for-byte replica of 'true trilinear filtering'?" Whether or not the image quality is "the same" or "essentially the same" is irrelevant to this question.
A (Andy/Raja): Byte for byte compared to what? "True trilinear" is an approximation of what would be the correct filtering, a blending between two versions, one which is too blurry and one too sharp. An improved filter is not byte for byte identical to anything other than itself, but that doesn't mean it isn't a better approximation.
Q (wild_neo): do you think that you still can compare the benchmarks with other brands, even if you use that different approach (non-equivalent technique)?
A (Andy/Raja): We've answered this, and yes, we feel we can compare ourselves to any brand, as we believe our quality and performance are higher. Perhaps at times we should be upset about people comparing us to lower quality implementations :)
Q (Toaster): whats the patent number and filing date of this algo?
A (Andy/Raja): This is in the patent pending process right now. So we will not put out the actual patent information at this time. Once approved, anyone can go read the patent.
Q (chris): What performance boost does this give you, anyway?
A (Andy/Raja): It's a very mild optimization at the default levels, so of the order of a few percent. But this is a neat algorithm - we encourage you to take the texture slider all the way towards "performance" - we don't think you'll be able to see any image change until the very end, and the performance increases are significant.
A (Andy/Raja): Thanks for your time. We appreciate your persistence through the technical difficulties. To ensure that you can all read all the answers, we will post the transcript of the session at www.ATI.com/online/chat.
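
To put the technical claims in that transcript in more concrete terms: the driver-side decision they describe (analyse the mip chain, use the faster path only when the lower levels look like filtered versions of the upper ones, otherwise fall back to legacy trilinear, and skip analysis entirely for dynamically uploaded textures) would look roughly like the sketch below. This is purely a guess at the logic based on their answers, not ATI's actual code; the box-filter comparison and its tolerance are invented for illustration.

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

struct MipLevel {
    int width, height;
    std::vector<std::uint8_t> texels;   // RGBA8, width * height * 4 bytes
};

enum class FilterPath { OptimizedTrilinear, LegacyTrilinear };

// Hypothetical check: does the smaller level look like a box-filtered (2x2
// averaged) version of the larger one, within some tolerance? ATI only says
// they "take advantage of properties of texture maps"; this particular test
// and its tolerance are guesses.
bool looksBoxFiltered(const MipLevel& hi, const MipLevel& lo, int tolerance = 8) {
    if (lo.width != hi.width / 2 || lo.height != hi.height / 2) return false;
    for (int y = 0; y < lo.height; ++y) {
        for (int x = 0; x < lo.width; ++x) {
            for (int c = 0; c < 4; ++c) {
                int sum = 0;   // average the matching 2x2 block of the larger level
                for (int dy = 0; dy < 2; ++dy)
                    for (int dx = 0; dx < 2; ++dx)
                        sum += hi.texels[((2 * y + dy) * hi.width + (2 * x + dx)) * 4 + c];
                int expected = sum / 4;
                int actual   = lo.texels[(y * lo.width + x) * 4 + c];
                if (std::abs(expected - actual) > tolerance) return false;
            }
        }
    }
    return true;
}

// Per the chat: dynamically uploaded textures are not analysed at all, and any
// chain that fails the analysis (e.g. coloured mipmaps) gets legacy trilinear.
// 'chain' is assumed to be ordered from the largest level down to the smallest.
FilterPath chooseFilterPath(const std::vector<MipLevel>& chain, bool dynamicallyUploaded) {
    if (dynamicallyUploaded || chain.size() < 2) return FilterPath::LegacyTrilinear;
    for (std::size_t i = 0; i + 1 < chain.size(); ++i)
        if (!looksBoxFiltered(chain[i], chain[i + 1])) return FilterPath::LegacyTrilinear;
    return FilterPath::OptimizedTrilinear;
}
```

That would also explain the coloured-mipmap behaviour people keep pointing at: hand-coloured levels are nothing like filtered copies of each other, so a check along these lines would always send them down the legacy trilinear path.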
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ChkSix
Where did they say they were switching AA and AF modes? Both NVIDIA and ATI use adaptive AA (ie, MSAA as opposed to SSAA), and NVIDIA uses adaptive AF as well with the 6800-series cards (though not with the FX-series).

Andy/Raja Would you say that our AF is not "full" AF? After all, we've been using an adaptive method for this since the R200. If you select 16x in the control panel, you may get 8x, 4x, or 2x depending on how steeply angled the surface is. Doing 16x AF on a wall you're viewing straight on would look exactly the same as no AF, but require 16x more texture samples (hence a speed decrease). Why would it make any sense to do this? This is exactly the same idea we're using for trilinear filtering. (From ATi's own 'open chat' dialogue)

Nvidia uses this approach as well? I wasn't aware of that. Thank you for the information. (I am being serious, not facetious)

Yes, this is what NVIDIA is doing on the NV40.

Now, AF is a case where a smart algorithm *can* (in theory) produce exactly the same results with less work. The level of AF required to reach the maximum attainable quality depends on the texture size, the exact filtering algorithm, and the angle and distance at which the texture is presented. If you try to filter more than that, you get no further improvement in IQ (and it hurts performance). Now, AFAIK, what NVIDIA and ATI do is a compromise -- they base the AF level solely on the angle of the textured surface (and not any of those other factors, like texture resolution) -- so there is probably some amount of IQ loss. But, in theory, if you used the exact level of AF that was required for each pixel instead of the full 16x (or 8x, or whatever), you would have exactly the same IQ but better performance (except in situations that required the full amount of AF to be applied everywhere; then you'd have identical IQ and identical performance). If you had an algorithm that did this (or came very, very close almost all the time), it would be foolish to not use it, and it might even make sense to force it to always be used instead of a "naive" AF algorithm.
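
To make that concrete, here is a simplified sketch of the general idea (my own illustration, not either vendor's actual logic; the function names and the power-of-two stepping are just assumptions): clamp the AF level to what the pixel's texel footprint actually needs, so a wall viewed head-on gets effectively no AF even with 16x selected.

```cpp
#include <algorithm>
#include <cmath>

// Estimate the anisotropy of a pixel's texel footprint from the screen-space
// texture-coordinate derivatives (the same quantities GPUs already compute
// for mip selection). A ratio near 1 means a head-on surface; a floor
// receding toward the horizon has a large ratio.
float footprintAnisotropy(float dudx, float dvdx, float dudy, float dvdy) {
    float lenX  = std::sqrt(dudx * dudx + dvdx * dvdx);
    float lenY  = std::sqrt(dudy * dudy + dvdy * dvdy);
    float major = std::max(lenX, lenY);
    float minor = std::max(std::min(lenX, lenY), 1e-6f);   // avoid divide-by-zero
    return major / minor;
}

// Clamp the AF level to what the footprint actually needs: with 16x selected,
// a surface viewed straight on (ratio ~1) gets 1x, moderately angled surfaces
// get 2x/4x/8x, and only steep surfaces get the full 16x. The power-of-two
// stepping mirrors the levels exposed in the control panel.
int adaptiveAfLevel(int requestedLevel, float anisotropy) {
    int needed = 1;
    while (needed < requestedLevel && static_cast<float>(needed) < anisotropy)
        needed *= 2;
    return needed;   // never more than the user asked for
}
```

The compromise I described above is that the shipping hardware apparently estimates this from the surface angle rather than computing the exact per-pixel footprint, which is where the (small) potential IQ loss comes from.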

Now, I still want an option in the drivers to disable ATI's trilinear optimizations, if only to see how and where their algorithm differs from "full" trilinear filtering. But, as a programmer, I can see why they might do something like this.
 

hysperion

Senior member
May 12, 2004
837
0
0
Nvidia uses SSAA when it's at 8x....read the reviews and you'll see that....it's a big performance hit though.....if ATI had said they were doing this up front, it never would have caused a controversy, but here's what they really did.....

-Take the excellent architecture of the 9700
-shrink the die size to fit more transistors and slightly bump the clockspeed
-use gddr3 memory
-use an adaptive algorithm to boost performance further, an algorithm that should be enabled on all 9800 series cards but isn't because it would cut into X800 sales.....maybe someone will figure out how to enable it and make the 9700/9800 an uber card again even now.....
-claim it's OK because IQ doesn't suffer, but not bother telling anybody....again, what's to stop Nvidia from implementing the same mode on their cards?

Conclusion:
Adaptive algorithms like the one ATI has released are fine. Kudos to them for developing it...Next time, let us know before you let every reviewer hand you the performance crown based on a lie, yes a lie....you are lying when you say something is trilinear when it is only trilinear half the time....If I sell a 'new 911 Turbo' for full retail (it's really a regular 911 Carrera with a body kit and a turbocharger added) to a buyer who didn't know, was I lying?......I mean it's just as good and just as fast.....

Has Nvidia done the same in the past? Yes. Do two wrongs make a right? No...
 

TStep

Platinum Member
Feb 16, 2003
2,460
10
81
It's this simple: ATI and Nvidia are in business to fool you into believing you are participating in a live 3D environment. It is all illusion, trickery, deception of our senses of vision and hearing. It is accomplished via tradeoffs between framerates and various filtering techniques.

I am not well versed in the technical aspects of how it's done from code on through to the monitor, nor do I care. I am an end user who wants the best compromise between framerates and filtering techniques. Nvidia and ATI each choose the compromise that they think will fool you, the end user, into believing the illusion.

Why such great concern about having a tab to enable full, not adaptive, filtering for benchmarking? It seems reasonable that the framerates should play second fiddle to the image quality. If the duck has been painted to look like a sheep (9600 series) and has fooled everyone until now ---- kudos, it's the illusion that counts.

The benchmarking for frame rates should come only after equivalent image quality settings are found, not necessarily equivalent tabs checked off in a control panel. If Company A's colors are all washed out in comparison to Company B's, yet Company A does everything else better, has all the appropriate control settings and filtering techniques that Joe public demands, and does it at a greater frame rate, is Company A's card better??? I know I wouldn't buy it.

If ATI's adaptive filtering gives equivalent or better IQ than Nvidia's IQ, who's at fault? ATI?? And why, because you can't benchmark based on tabs checked off in a control panel with complete disregard for equivalent image quality? If one company's 2xAA is equivalent in IQ to another's 4xAA, should they not be benched at those settings, rather than both at 2xAA or 4xAA?

I currently own both brands, have in the past, and probably will in the future. My criterion of choice is IQ first (no matter how it's done), then framerate. Both cards need to be on the market with a round of driver optimization before I decide. The company that can fool me the best will be my choice. May the best 'cheater' win.
 

ChkSix

Member
May 5, 2004
192
0
0
If ATI's adaptive filtering gives equivalent or better IQ than Nvidia's IQ, who's at fault? ATI?? And why, because you can't benchmark based on tabs checked off in a control panel with complete disregard for equivalent image quality? If one company's 2xAA is equivalent in IQ to another's 4xAA, should they not be benched at those settings, rather than both at 2xAA or 4xAA?

So if I paid $100,000 for a Porsche 911 Turbo and in reality I got a Carrera 4 with a body kit on it that performs equally to the Turbo for the $100,000...didn't I just get ripped off, because even though the performance is similar across the board, I got a car that really isn't what it looks to be, except to the trained eye, for the same amount of money? I mean, granted, if I paid less for the souped-up Carrera, then hell yeah I got a deal, but paying equal amounts, I wouldn't call it a fair deal at all.

And also, if I was racing the quarter mile and two cars met at the finish line at the same time, even though one had a 500ft lead advantage, wouldn't many argue that because they didn't start at the same line, the race could not be judged fairly?

If one card is running at 1280x960 @ 2aa/4Anio and producing the same results IQ-wise as the other card at 1280x960 @ 4aa/8anio (which isn't even remotely the case), and the FPS advantage is in favor of the former because of lesser work, how can anyone think it is a fair judgement at all? The only way to judge is to base a conclusion after all tests have been run equally and uniformly on both platforms, where IQ and FPS can be judged using the same scale, not one scale for one party and another scale for the other. Otherwise it is the same thing as doing independent tests separately and then trying to compare the end results against each other. No one would compare a 6800 OpenGL benchmark result to an X800 D3D benchmark with the same settings, because it isn't the same test or end result.

IQ is nice, but FPS rules the gaming world. Without it, a crystal clear image producing 5fps will do nothing for anybody and won't sell well at all, I can guarantee. And if both are producing similar IQ, yet one isn't doing half the time what the other is doing full time, and gaining an upper hand by making it 'look' just as fast as the harder worker...then that is more efficient only because it is cutting corners, and not because it is getting there by doing the same thing.

It's the same as me saying to you, "wanna race to the corner?" Here are your sneakers, and I'll take the scooter. Fair? Hardly! More efficient? Absolutely! Am I better than you because I took the scooter? Anyone who thinks so needs to see a therapist, since we didn't both do the same thing to get to the corner, unless the test was conducted solely for efficiency purposes - where I would undeniably be the clear victor. Testing and comparisons, as well as conclusions about these cards, aren't conducted out of efficiency, but out of doing exactly the same thing and producing the best IQ and FPS in the process.

Efficiency is something completely different from doing the same exact thing and arriving at the finish line first.
 

ChkSix

Member
May 5, 2004
192
0
0
Who is ignorant? And who is biased?

And what does this link prove, considering it was compared to the NV17? That the company admitted in its own chat today that the algorithm uses optimized "trilinear" filtering that was never disclosed to reviewers or end users, and that the methods are hard to spot? That with this mixed moding they gain an upper hand FPS-wise, and as genius and efficient as it may be, it was still "subjectively" and "purposely" left out of official documents as well as kept from the ears of the reviewers? Please enlighten me, since your initial response says so little, if anything at all.

Can you try to be more precise and leave out the immature fanboy based remarks next time, if this was directed at me? If you choose to go down that path, especially when you haven't responded at all in this thread, except to say "bump", then you're not worth a formulated response or opinion from me at all.

And by the way, I own both cards. Must mean that I am an oxymoron-based biased fanboy, no? Check my webshot link below under 'my computer' if you still feel like I lean 'one way'. (If it was meant towards me)

Again, if your comment wasn't directed at me per se, then just ignore this, because it holds no water at all. But if it was, my response stands, and won't be explained in detail any further.

It's just a little hard to judge who you are talking to in particular, and I am getting defensive because it is right under my post and I'm automatically assuming you mean me.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
I like how the fanboys have changed the subject completely to whether or not the IQ is justifiable, when the point is that the benchmarks they hold so dear have been completely called into question. Yes, the optimization would be nice, and most people would probably use this adaptive trilinear algorithm on their cards, but right now what we really want is true, full trilinear to re-run the tests with. Frankly, you can tell they are scared, because up until now most places are 50/50 on who the winner is in this war: Nv has features (that may or may not be useful), ATi has the performance lead, albeit marginal at best. However, make all these cards run actual trilinear and we may very well see the lead go to Nv, at which point only the diehard fans will want X800s.

Oh dear, to be a fanboy for the losing team (just wait a little while and they'll win something...the Nv fanboys survived the FX tragedy, you can deal with this). This win/lose cycle is inevitable: the GF4 series was great, the ATi competition didn't hold up...but then Nv gets lazy, ATi works hard, and the next round goes without question to ATi...so then they basically re-release the 9800pro with nothing but a speed upgrade while Nv releases a more powerful, feature-rich card...and so on. ATi's next offering will probably be amazing, but for now, accept that they just might be fudging the benchies to save face on what seems more and more to be a so-so product when compared to the current competition.

As for the algorithm itself, why would full trilinear only kick in for coloured mipmaps? Because that's the only time you would ever really notice it...it seems less adaptive, more bait-and-switch. And the chat only makes it even more shady...lots of teams, must have lost that memo...OK, well, all your other great, beneficial features always get loads of press hype with all the big words to make them seem like the best thing ever...why not even a mention of this?
 

hysperion

Senior member
May 12, 2004
837
0
0
Just so you know, chksix, I was being sarcastic about the 911 idea....Yes, if you pay what a 911 Turbo costs for a naturally aspirated 911 with a body kit and an added turbo, you are getting ripped off.....

And I own a 9800XT, and had a 9800 Pro before that, so to call me biased is ignorant (to anyone who's read my posts and thinks that, not anyone in particular)....If you want a picture of me holding the 9800XT, I'd be happy to post it. I'm going to buy the 6800 Ultra for the same reason I switched from the GF4 to the ATI 9800 series: because I'm going to buy a superior product that doesn't need to cheat to win. I couldn't care less who manufactured it.....
 

ChkSix

Member
May 5, 2004
192
0
0
I never called you biased, bro. Not in the least. I never thought that about you. My response was meant for Ackmed, if he was directing his comments at me (which I assumed because his post was right below mine).

I never liked that stance; it only arises when someone has his footing planted firmly in one direction and tries hopelessly to discount the facts through primitive name-calling and branding, which results in an immature flame war. That hasn't been the case here so far, up until Ackmed's response, which might have been meant for me or someone else. In either case, it shows a complete disrespect on his end for other people's input, branding them as something they may not be simply for holding their own opinion.

Throughout this entire thread, I have yet to see someone siding one way blindly without having an intelligent response. It should stay like that. And again, nothing I said above was ever meant for you. However I did use your Porsche analogy because it makes complete sense and I thought it was an extremely good one.

You would never get a response like that from me without provocation. I never label someone a fanboy unless they respond in a way that is irritating and uneducated at best. And even then, I usually only respond once and ignore them the rest of the way through.

EDIT: I just saw that you said not anyone in particular (LOL). But my response stays. I think you're intelligent and articulate very well, and if anyone calls you a fanboy, just consider the source and all the wasted energy of having to debate with them over it. In these cases, it is usually Darwinian principles that prevail over the long run - thankfully.


:beer:
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: Ackmed
You're obviously ignorant about what is going on.

http://www.beyond3d.com/forum/viewtopic.php?t=12587

Open your eyes, and stop trying to be so biased.

Who here can honestly claim that they can tell a significant enough difference between the screenies? I sure as hell can't; the case is closed as far as I'm concerned.
I don't care if ATI or nVidia put a troll on their video card that hand draws every frame to simulate trilinear filtering; if the IQ is the same in all the games, then I'm happy.
 

ChkSix

Member
May 5, 2004
192
0
0
I agree, Raynor, to an extent. I think what upsets most people (including me) is the sneaky way they went about it. I didn't like it when Nvidia did it - in fact I was furious, because I had thermal-epoxied a Swiftech waterblock to my GPU, hence no return - and I don't like it now that ATi is doing it.

Just don't lie to gain profit and fool the consumer when it is only a matter of time before it is uncovered, making you look 10 times worse. Honesty and business are difficult words to write on the same page, I know, but fusing them together is a practice that good businesses should be working hard at.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: sxr7171
Originally posted by: Rollo
Originally posted by: BFG10K
If this is true then ATi is definitely cheating. However I can't see how they could possibly detect coloured mip-maps in the first place.



What what what?! BFG, no defense of ATI's "superior" AF?!?!!?

[Rollo drops to the floor, out cold]


LOL Well, I guess the 8,782,415 posts I've read about "that damn nVidia brilinear" just became irrelevant. What will fanATIcs beat into the ground now?

How does it become irrelevant when you have just seen that he applies the same standards to both companies? Believe it or not, a lot of people supporting ATI last generation were doing it because they actually had the superior product then. If it turns out that nVidia has the superior product this time (and it really is starting to look like that) then they will turn around and support Nvidia. Only fanboys stick with one company even when they suck. That Nvidia bilinear sucked, just like ATI's bilinear sucks. You seem to be still supporting nVidia's past bull$hit. You sound like a fanboy.


BFG doesn't apply the same standards to both companies. Now he says adaptive AF is fine, no problem. When he only knew nVidia was doing it, it was a "cheat". See my quotes/links if you don't believe me; he said it, not me.
Now he says he only cared when it was "application specific", but that's not what he has been saying. I can dig up more quotes.
 

MichaelZ

Senior member
Oct 12, 2003
871
0
76
Rollo is hardly a fanboy. His past arsenal included a 9700 Pro, a 9800 Pro, and the 5800 Ultra. Judging by that pattern of cards, he looks to be more open-minded than some of the real fanboys around here who can't help but bash any NV product, especially the big lemon that was the 5800 Ultra. :disgust:

"OMGHAX! who would buy 5800Ultra and sell his 9800Pro! ATI is da bomb and I think they are so much better than NV coz other ppl say so." < seen a million of these clueless idiots trying to diss that 5800U a month or so back. That was real interesting wasn't it... :disgust: