ATI cheating in AF?

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
hmmm, another addition to The Days of our Graphics

/pulls up chair and :beer:

let the flame war begin
 

ChkSix

Member
May 5, 2004
192
0
0
Yeah I saw this on another forum. Quite an interesting find, especially for ATi fans who sincerely believe Nvidia is the only one who optimizes. :frown:

Bring the beer, I got the popcorn.
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Originally posted by: VIAN
I knew it. ATI is a genius. LOL.


hey, ATI has been copying everything Nvidia has done in previous years...

so now they get to copy another part. In ATI's defense:

good artists copy,
great artists steal
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Originally posted by: Schadenfroh
Originally posted by: VIAN
I knew it. ATI is a genius. LOL.


hey, ATI has been copying everything Nvidia has done in previous years...

so now they get to copy another part. In ATI's defense:

good artists copy,
great artists steal


Nvidia is still the master.
If you can't beat them, cheat!

I think Nvidia has the superior product this cycle.
 

ShinX

Senior member
Dec 1, 2003
300
0
0
Everyone optimizes; otherwise stuff would run like complete crap on their hardware, e.g. Tomb Raider on the FX.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Originally posted by: Schadenfroh
Originally posted by: Naustica
So Matrox is the only honest company left. :)


nope, they got caught a few years ago, guess you will have to go with Cirrus Logic


I'd rather go with Intel Extreme Graphics.
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Originally posted by: Naustica
Originally posted by: Schadenfroh
Originally posted by: Naustica
So Matrox is the only honest company left. :)


nope, they got caught a few years ago, guess you will have to go with Cirrus Logic


I'd rather go with Intel Extreme Graphics.

Did not think of them; VIA's incarnation of S3 might be safe too.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Originally posted by: VIAN
I'd rather go with Intel Extreme Graphics.
Yeah, Intel, the company that rips off so many people with their SUPER FAST Celerons.


No one is forcing you to buy it. Blame the uneducated consumers and dishonest store salesmen.

A Celeron in a laptop is decent.
 

ChkSix

Member
May 5, 2004
192
0
0
From what I have read so far on Beyond3D, it looks like ATi is forcing bilinear much like Nvidia did with the NV3x (5950). According to some posters there (and I agree), the tests and benchmarks done so far comparing the X800 and 6800 are junk: the X800 was taking much less of a performance hit (again, from what I have read so far on Beyond3D) in AA/AF tests, keeping its results neck and neck with Nvidia (which was using full trilinear) across the board. I think the above poster is right; this time around, Nvidia has the superior product all around.
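
For anyone who hasn't followed the filtering debate: trilinear blends the two nearest mipmap levels while bilinear reads only one, so silently dropping to bilinear roughly halves the texture work while looking nearly the same in a static screenshot. A rough sketch of the difference (made-up numbers in Python; obviously not ATi's driver code):

# Rough sketch of bilinear vs. trilinear cost; made-up texel values,
# not anybody's actual driver code.

def bilinear_sample(mip_color):
    # One filtered read from a single mipmap level (4 texel taps in hardware).
    return mip_color

def trilinear_sample(mip_a_color, mip_b_color, frac):
    # Trilinear: bilinear-sample the two nearest mip levels and blend,
    # roughly doubling the texel bandwidth of plain bilinear.
    return (bilinear_sample(mip_a_color) * (1.0 - frac)
            + bilinear_sample(mip_b_color) * frac)

# A pixel whose level of detail falls 30% of the way between mip 1 and mip 2:
print(trilinear_sample(0.8, 0.4, 0.3))  # 0.68 -> smooth transition between levels
print(bilinear_sample(0.8))             # 0.8  -> cheaper, but mip boundaries can show

That halving is exactly the kind of saving that shows up as a smaller AA/AF hit in benchmarks.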

They are even stating (again, Beyond3D) that the 9800 series doesn't do this.

Cheating may be cheating, and everyone does it. But if the cheating you're doing now puts you ten steps back in AA/AF compared to your last video card, then that's not genius at all... it's pure stupidity, and extremely deceitful to your client base and the public in general.

Here is another good find. Notice that the X800s shipped to reviewers were all clocked at different speeds for the same card. Now that is something that says, at least to me, "desperation". I won't even mention the overclocking tools that are missing when the Cat 4.5s are loaded with the X800, yet clearly visible in the display settings with the 9800XT. Well, you can read it all for yourselves in the article below.

http://www.pcper.com/article.php?aid=40&type=expert

(Look at the last page for the several different frequencies ATi sent the X800 out to reviewers with.)
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ChkSix
From what I have read so far on Beyond3D, it looks like ATi is forcing bilinear much like Nvidia did with the NV3x (5950). According to some posters there (and I agree), the tests and benchmarks done so far comparing the X800 and 6800 are junk: the X800 was taking much less of a performance hit (again, from what I have read so far on Beyond3D) in AA/AF tests, keeping its results neck and neck with Nvidia (which was using full trilinear) across the board. I think the above poster is right; this time around, Nvidia has the superior product all around.

They are even stating (again, Beyond3D) that the 9800 series doesn't do this.

Cheating may be cheating, and everyone does it. But if the cheating you're doing now puts you ten steps back in AA/AF compared to your last video card, then that's not genius at all... it's pure stupidity, and extremely deceitful to your client base and the public in general.

Can we at least wait until this issue becomes clearer before we start declaring who's the 'winner' of the graphics card wars this week?

I read 8 pages of this on Beyond3D, and I haven't seen any definitive proof yet that this is really a cheat by ATI. Several posters over there suggested that it could be due to optimizations related to auto-generated mipmaps (as opposed to ones that are passed in along with the textures), or something related to how ATI compresses or filters solid-colored textures. I want to figure out what's going on here, but the issue is far from settled.

Here is another good find. Notice that the X800s shipped to reviewers were all clocked at different speeds for the same card. Now that is something that says, at least to me, "desperation". I won't even mention the overclocking tools that are missing when the Cat 4.5s are loaded with the X800, yet clearly visible in the display settings with the 9800XT. Well, you can read it all for yourselves in the article below.

http://www.pcper.com/article.php?aid=40&type=expert

(Look at the last page for the several different frequencies ATi sent the X800 out to reviewers with.)

Um... or it could be that they're prototype cards and drivers. Give 'em a chance with retail boards and non-beta drivers before we start calling ATI "desperate".
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Looks an awful lot like cheating to me. :beer: I guess we will have to let this play out and let people smarter than me figure out all the future implications. Best-case scenario, ATI will have to dump the X800 really cheap and I will get one.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
If this is true, then ATi is definitely cheating. However, I can't see how they could possibly detect coloured mipmaps in the first place.
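
Unless the driver checks whether each mip level is just a downsampled copy of the level above it: real content usually is, while coloured test mipmaps are not, so the optimization would switch itself off exactly when reviewers look at filtering quality. Pure speculation on my part; a made-up heuristic sketched in Python, not ATi's actual code:

def downsample(level):
    # 2x2 box filter: average each 2x2 block of the higher-res level.
    h, w = len(level) // 2, len(level[0]) // 2
    return [[(level[2*y][2*x] + level[2*y][2*x+1] +
              level[2*y+1][2*x] + level[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

def looks_autogenerated(mip_chain, tolerance=0.02):
    # Hand-coloured mip levels won't match a box-filtered downsample of
    # the level above, so a driver keying off this test could quietly
    # use the fast path only on "normal" textures.
    for upper, lower in zip(mip_chain, mip_chain[1:]):
        expected = downsample(upper)
        for row_e, row_l in zip(expected, lower):
            if any(abs(e - l) > tolerance for e, l in zip(row_e, row_l)):
                return False
    return True

base = [[0.0, 1.0], [1.0, 0.0]]
print(looks_autogenerated([base, downsample(base)]))  # True  -> optimized filtering
print(looks_autogenerated([base, [[9.9]]]))           # False -> full trilinear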
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: ChkSix
From what I have read so far on Beyond3D, it looks like ATi is forcing bilinear much like Nvidia did with the NV3x (5950), which according to some posters on there (and I agree) the tests and benchmarks done so far regarding the X800 and 6800 are junk. X800 was taking much less of a performance hit ( again from what I have read so far on beyond 3D) in AA/AF tests, keeping their results neck and neck with Nvidia (who was using full trilinear) across the board. I think the above poster is right, this time around, Nvidia has the superior product all around.

They are even stating that (again Beyond 3D) that images on the 9800 series doesn't do this.

Well, the difference between NV and ATI is that NV didn't hide it; ATI is trying to.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
I think the perfect example can be found at NVNews, where one guy brings up the point of ATI only losing 2.6 FPS from 2xAF all the way up to 16xAF. Why have a setting at all if you are only gonna lose 2.6 FPS? There SHOULD be a performance hit SOMEWHERE!
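
To be fair, AF on these cards is adaptive: the slider only sets a cap, and pixels that face the camera never take the maximum number of taps, so most of the screen costs the same at 2x and 16x. A back-of-the-envelope sketch (made-up numbers in Python, nobody's actual hardware):

import math

def af_taps(anisotropy, max_af):
    # Roughly one probe per unit of anisotropy, clamped to the
    # user-selected maximum; illustrative model, not real hardware.
    return min(max_af, max(1, math.ceil(anisotropy)))

# Hypothetical frame: 90% of pixels nearly face-on, plus a few
# grazing-angle floor/wall pixels that actually need many taps.
scene = [1.2] * 90 + [8.0] * 8 + [16.0] * 2

for setting in (2, 4, 8, 16):
    total = sum(af_taps(a, setting) for a in scene)
    print(f"{setting}xAF -> {total} taps per 100 pixels")

Even granting that, a flat 2.6 FPS spread from 2x all the way to 16x still looks suspiciously small to me.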