Tom's Hardware puts up article on ATI's filtering game


hysperion

Senior member
May 12, 2004
837
0
0
Tom's article didn't say anything that hadn't been said already, so claiming it's biased because you saw an NV ad is crap. Read their review of the cards; they gave the crown to ATI. The general consensus is usually that Tom's likes Intel. I don't see why there's an argument. The point isn't whether the card produces the same IQ or not; I want a test without optimizations to see who has the more capable hardware, period. Why? Because if Nvidia has higher performance without any optimizations, don't you think their driver programmers can write a similar adaptive algorithm? They've written better drivers for how long now?
 

MaddSquirrel

Junior Member
Apr 21, 2004
4
0
0
I am a fairly new member of this forum, but I have been waiting for a year to buy a new vid card. I just don't think it is right for everyone to jump on Nvidia's butt.
1. Nvidia has 32-bit floating point precision and ATI only has 24.
2. Nvidia and ATI both have driver optimizations.
3. Nvidia offers true trilinear filtering (as an option) and ATI currently doesn't.
4. Nvidia supports Pixel Shader 3.0, ATI does not.
Looks like Nvidia wins on points 1, 3, and 4. So stop whining because ATI got their ass busted. Just look at the facts. ATI does have slightly better IQ, but Nvidia does do things correctly. Do not say that 24-bit precision is okay, because it is not in compliance with DirectX 9; DirectX 9 is supposed to have 32-bit. Nvidia may not have followed all the other specifications, but it did follow that one. Pixel Shader 3.0 is currently useless (except in a few games), but Nvidia does have it, along with true trilinear filtering and 32-bit pixel shader precision. So, ATI assholes, just say that Nvidia did a good job and ATI still has better image quality. ATI has always had better IQ, but ATI used to be Canadian goose crap too. I currently own an Nvidia card, but am planning on buying an ATI card. ATI finally has better drivers, in my opinion. So basically, give credit to Nvidia for what they have done better than ATI.
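On point 1, the practical difference is mantissa width: DX9-era FP24 carries a 16-bit mantissa (assuming the usual s1e7m16 layout) versus FP32's 23 bits, so FP24 results round off a couple of decimal digits sooner. A minimal sketch of that gap; the rounding helper below is illustrative, not how either GPU actually computes:

```python
import math

def round_to_mantissa(x: float, mantissa_bits: int) -> float:
    """Round x to the nearest value representable with the given mantissa width."""
    if x == 0.0:
        return 0.0
    exp = math.floor(math.log2(abs(x)))
    step = 2.0 ** (exp - mantissa_bits)  # spacing of representable values near x
    return round(x / step) * step

x = 4.0 / 3.0  # an arbitrary shader-style value
print("FP24 (16-bit mantissa) error:", abs(x - round_to_mantissa(x, 16)))  # ~5e-06
print("FP32 (23-bit mantissa) error:", abs(x - round_to_mantissa(x, 23)))  # ~4e-08
```

Whether that extra precision matters in a single pass of pixel shading in 2004-era games is a separate argument from whether it is on the spec sheet.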
 

BugsBunny1078

Banned
Jan 11, 2004
910
0
0
Originally posted by: OfficerDoofey
Originally posted by: Aelius
Whenever Tom's posts something, take it with a grain of salt.

That site is one of the biggest sellouts; so much of what he says or doesn't say is utter crap.

i must admit i thought it was kinda funny how i was reading an article effectively 'bagging' ATI, and in the corner was a nice shiny nVidia ad with quotes from reviewers (THG for one) saying it's the greatest thing since sliced bread..

not the right place for it, i don't think lol :p
Well if you were nVidia you would pay extra to get that ad on that page.
 

sandorski

No Lifer
Oct 10, 1999
70,697
6,257
126
Originally posted by: BugsBunny1078
Originally posted by: sandorski
If it can't be seen (with your eyes), what's the problem?
Read the article. It is well explained, and I learned a few things about this texture filtering from it too.
No other article on this has gotten this knowledge into my head as well as this one.

Read it. It's funny how actual in-game screenshots are not compared.
 

OfficerDoofey

Member
May 26, 2004
112
0
0
Originally posted by: hysperion
Tom's article didn't say anything that hadn't been said already, so claiming it's biased because you saw an NV ad is crap. Read their review of the cards; they gave the crown to ATI. The general consensus is usually that Tom's likes Intel. I don't see why there's an argument. The point isn't whether the card produces the same IQ or not; I want a test without optimizations to see who has the more capable hardware, period. Why? Because if Nvidia has higher performance without any optimizations, don't you think their driver programmers can write a similar adaptive algorithm? They've written better drivers for how long now?

WOOOO dude, calm down... get back down off your horse... i was not claiming bias... i was making a silly point. it was not meant to hurt/offend/annoy anyone... it's just a silly joking-around point... relax a little
 

OfficerDoofey

Member
May 26, 2004
112
0
0
Originally posted by: BugsBunny1078
Originally posted by: OfficerDoofey
Originally posted by: Aelius
Whenever Tom's posts something, take it with a grain of salt.

That site is one of the biggest sellouts; so much of what he says or doesn't say is utter crap.

i must admit i thought it was kinda funny how i was reading an article effectively 'bagging' ATI, and in the corner was a nice shiny nVidia ad with quotes from reviewers (THG for one) saying it's the greatest thing since sliced bread..

not the right place for it, i don't think lol :p
Well if you were nVidia you would pay extra to get that ad on that page.

Agreed.. i bet they did.. but it's all sorta done in a tongue-in-cheek kinda way... that was all i was saying..
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
True 32-bit precision and true trilinear filtering are a handicap when the competition runs 24-bit and adaptive trilinear. Had ATI not gone out of their way to taint the benchmarks, no harm, no foul. NV's brilinear looks pretty good now, so there was no reason to claim NV's method reduces image quality to the point where it's an unfair advantage. Both should be tested using brilinear, or both using full trilinear; the current benchmarks are tainted. Remove those benches and look at the others, and the 6800 looks to be the better card this time around. I'm surprised no one is rerunning the benchmarks using brilinear on both.
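For anyone fuzzy on the terms: full trilinear blends between the two nearest mip levels at every pixel, while "brilinear"-style adaptive filtering snaps to a single mip level over most of the range and only blends in a narrow band around the transition, which is cheaper. A rough sketch of the idea; the band width and names here are made up for illustration, not either vendor's actual algorithm:

```python
def trilinear_weight(lod: float) -> float:
    """Full trilinear: always blend between the two nearest mip levels."""
    return lod % 1.0  # fractional LOD = blend factor toward the next mip

def brilinear_weight(lod: float, band: float = 0.25) -> float:
    """Adaptive ('brilinear'): bilinear except near a mip boundary."""
    frac = lod % 1.0
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0             # pure bilinear on the nearer mip level
    if frac > hi:
        return 1.0             # pure bilinear on the farther mip level
    return (frac - lo) / band  # short blend ramp across the boundary

for lod in (2.1, 2.5, 2.9):
    print(lod, trilinear_weight(lod), brilinear_weight(lod))
```

Note that a band of 1.0 recovers full trilinear, while a band near zero gives almost pure bilinear with hard mip transitions; the whole argument in this thread is about where on that slider each vendor sits and whether they admitted it.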
 

Luthien

Golden Member
Feb 1, 2004
1,721
0
0
My take: ATI lied on purpose, using subterfuge to keep nVidia from doing exactly what ATI was doing all along. ATI painted nVidia as a cheater for using and working on adaptive trilinear methods to improve benchmarks, and demanded that the two be compared in benches with both running full trilinear. Of course, we now know nVidia was the only one keeping that bargain, while ATI quietly used its own secret optimizations. ATI was afraid nVidia would reap the same benefits ATI was reaping if nVidia pursued its own adaptive trilinear strategy, so, seeking to improve its own sales and blacken nVidia's reputation at the same time, ATI kept its adaptive trilinear strategy secret. This is why, IMO, ATI should be condemned as scumbags and lauded for being so perfectly duplicitous.

I posted this earlier but wanted to expound on it with the above:

"Well, OMG ATI outsmarted nVidia for two freaking years, LOL!!! ATI should be hired by Canada as a Canadian secret police they bushwhacked nVidia so well. Amazing isn?t it. I was buying nVidia 6800 whatever and now I feel even better about it. I think nVidia should sue ATI for fraudulent marketing in an attempt to put nVidia out of business. ATI accuses nVidia of the very thing ATI is doing but ATI manages to hide it far better and do it better to boot. Tragic and funny at the same time and FREAKING AMAZING!!!"



I posted all that a month ago!
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
you forget that it's not the Canadians, but the AMERICANS at ArtX (which ATI bought), who are the reason ATI's products are no longer craptacular
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: rbV5
Originally posted by: Megatomic
Despite the fact that the article was written by THG, they presented ATI's faults properly. If you don't see the problem, then you probably didn't read the entire article. ATI did two things wrong:

1. They didn't disclose the "optimization" to the press when they sent review samples/drivers out.

2. They recommended that the press cripple the competition's products when conducting side-by-side comparisons, trying to make it look like an apples-to-apples comparison when in fact it was nothing of the sort.

That is shady marketing at its shadiest.

Shadiest? Please, they're not killing people by failing to disclose medical side effects here. In the end, IMO, they should have touted it as a great new feature and dealt with it up front, back when the 9600 cards launched.
It's shady marketing. S - H - A - D - Y.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Why do people keep saying IQ isn't degraded?

http://www.ixbt.com/video2/nv40-rx800-3.shtml
http://www.computerbase.de/artikel/hardware/grafikkarten/radeon_x800_texturen/
http://www.pcper.com/article.php?aid=40&type=expert&pid=4
http://www.digit-life.com/articles2/gffx/nv40-rx800-3.html
http://www.ixbt.com/video2/images/rx800-2/r420-pir.jpg (bottom left): Digit-Life claims they see no difference, but look here, it's pointed out clearly
http://www.nvnews.net/vbulletin/showthread.php?t=29315 <--- download the HTML documents, they are mouseovers of Digit-Life's screenshots. Then go to post #12, where MikeC even circles the AF problem and the IQ degradation

here is Beyond3D and their AF test (directly from ATI, supposedly): http://www.beyond3d.com/forum/viewtopic.php?t=12594 ...yet I figured it out and explained it here: http://www.nvnews.net/vbulletin/showthread.php?t=29019&page=2&pp=40&highlight=inch (post #63)

and here is another theory, just speculation, not fact... why are ATI's screenshots ALWAYS darker? I dunno if it's a bug or not, but tell me this: what color of pixel is easiest to render? Black. I'm just speculating with a theory
 

OfficerDoofey

Member
May 26, 2004
112
0
0
it's kinda like when you buy a car for only $19,990

then find out it's actually $19,990 + stamp duty + on-road costs + rust protection + the car salesman's kids' college fund, equalling $24,990..


wait... actually it's nothing like that lol :confused:
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I have to agree with Shamrock... Honestly, prior to reading Tom's article, I didn't fully understand what the optimizations could mean. However, once you know what to look for, it is pretty easy to spot.

Take these two screenies for example:

R420

NV40

Open them up in two browser windows and compare the left corner, just at the top right side of the box with the face in it. On the R420 screenshot you can clearly see a transition between the mip maps, while on the NV40 screenshot the transition is smooth. Once you see this, you can actually trace the entire line across the image, even under the shadow cast by the character. Look at the image long enough, and the bottom left of the R420 image begins to look like the edge of a table that is perpendicular to the top surface. Obviously, you won't be staring at this screenshot during gameplay, but I am curious whether the line between the mip maps moves as you walk forward.

Judge for yourself...

EDIT: hit refresh on your browser if the links give you errors.
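If flipping between two browser windows is a pain, one way to make a mip-map boundary pop out is to diff the two screenshots per pixel and amplify the result. A quick sketch using Pillow and NumPy; r420.png and nv40.png are hypothetical local copies of the two shots, assumed to be the same resolution, and since the cards also differ in other ways, the diff will show more than just the banding:

```python
import numpy as np
from PIL import Image

# Hypothetical filenames for locally saved copies of the two screenshots.
a = np.asarray(Image.open("r420.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("nv40.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b).sum(axis=2)       # per-pixel difference across channels
boosted = np.clip(diff * 8, 0, 255)    # amplify so faint banding becomes visible
Image.fromarray(boosted.astype(np.uint8)).save("diff.png")
```

A sharp line of bright pixels running across the ground texture in diff.png would be exactly the mip transition being described above.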
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
The article incorrectly stated that nVidia doesn't reduce quality on texture stages, when in fact that is exactly what brilinear does.

Why do people keep saying IQ isn't degraded?
Because it isn't.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
I can see now you aren't LOOKING at the screenshots, are you... you are blindly stating that IQ isn't degraded...

I am now convinced you really do work for ATI

oh wait, I am now awaiting "those screenshots are screwed up, the reviewers didn't know what they were doing"

IQ IS degraded... substantially.
 

sandorski

No Lifer
Oct 10, 1999
70,697
6,257
126
Originally posted by: nitromullet
I have to agree with Shamrock... Honestly, prior to reading Tom's article, I didn't fully understand what the optimizations could mean. However, once you know what to look for, it is pretty easy to spot.

Take these two screenies for example:

R420

NV40

Open them up in two browser windows and compare the left corner, just at the top right side of the box with the face in it. On the R420 screenshot you can clearly see a transition between the mip maps, while on the NV40 screenshot the transition is smooth. Once you see this, you can actually trace the entire line across the image, even under the shadow cast by the character. Look at the image long enough, and the bottom left of the R420 image begins to look like the edge of a table that is perpendicular to the top surface. Obviously, you won't be staring at this screenshot during gameplay, but I am curious whether the line between the mip maps moves as you walk forward.

Judge for yourself...

EDIT: hit refresh on your browser if the links give you errors.

What? I don't see anything. Besides, comparing one screenshot from each card means nothing. Picture quality between video cards/chips is a known quantity. To prove a "degradation" you need two screenshots from the card in question: one using the optimization, the other not.
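That on/off comparison is easy to make objective once someone captures both shots: compute a summary metric such as PSNR between the optimized and full-trilinear frames from the same card. A minimal sketch, with hypothetical filenames and same-resolution captures assumed:

```python
import numpy as np
from PIL import Image

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit images; higher means closer."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

# Hypothetical captures of the same frame with the optimization forced on and off.
on = np.asarray(Image.open("radeon_optimized.png").convert("RGB"))
off = np.asarray(Image.open("radeon_trilinear.png").convert("RGB"))
print(f"PSNR: {psnr(on, off):.1f} dB")
```

A very high PSNR would support the "you can't see it" camp; a low one would support Shamrock. Either way it is the same card against itself, which is the comparison this thread keeps not making.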
 

Luthien

Golden Member
Feb 1, 2004
1,721
0
0
Shamrock, I see what you're talking about, and it is very clear IMO, and my eyes are not so hot compared to some folks'. However, it just shows NV with better picture quality. It may also show the difference between brilinear and true trilinear; to be exact about ATI, we still need to compare ATI with it on and off. That said, I think ATI is absolutely cheating, as everyone is starting to understand. If you read my post higher up, you will understand why I got a great kick out of ATI's excuse that they didn't reveal their brilinear method because they are patenting it. OMG, more subterfuge and lies and purely hypocritical nonsense. ATI should have just owned up and told the truth; this is going to cost them. nVidia, if it isn't caught cheating too and gets its cards out ASAP in volume (which isn't looking likely), could definitely spank ATI this time around. ATI is fighting all this with the $435 X800 XT, which again is a very smart move. All I can say is that nVidia had better get its product to market fast, or all this is going to make little difference. They need to ride the wave of negativity against ATI while it is hot, and they are slow to market.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: MaddSquirrel
I am a fairly new member of this forum, but I have been waiting for a year to buy a new vid card. I just don't think it is right for everyone to jump on Nvidia's butt.
1. Nvidia has 32-bit floating point precision and ATI only has 24.
2. Nvidia and ATI both have driver optimizations.
3. Nvidia offers true trilinear filtering (as an option) and ATI currently doesn't.
4. Nvidia supports Pixel Shader 3.0, ATI does not.
Looks like Nvidia wins on points 1, 3, and 4. So stop whining because ATI got their ass busted. Just look at the facts. ATI does have slightly better IQ, but Nvidia does do things correctly. Do not say that 24-bit precision is okay, because it is not in compliance with DirectX 9; DirectX 9 is supposed to have 32-bit. Nvidia may not have followed all the other specifications, but it did follow that one. Pixel Shader 3.0 is currently useless (except in a few games), but Nvidia does have it, along with true trilinear filtering and 32-bit pixel shader precision. So, ATI assholes, just say that Nvidia did a good job and ATI still has better image quality. ATI has always had better IQ, but ATI used to be Canadian goose crap too. I currently own an Nvidia card, but am planning on buying an ATI card. ATI finally has better drivers, in my opinion. So basically, give credit to Nvidia for what they have done better than ATI.

You need to learn some manners. If you can't talk to people without childish name-calling, you won't be taken seriously at all.

I did notice that you mentioned NV has PS 3.0 but neglected to mention that ATi has 3Dc. Funny, that.

Originally posted by: Shamrock


IQ IS degraded... substantially.

Not in games it's not. Comparing AF from one card to AF from another card isn't the best way to test.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
then what the heck is Pirates of the Caribbean? and Call of Duty? Photoshop plugins? that screenshot I showed you from Digit-Life is NOT from 3DMock or from some trumped-up benchmark... it's Pirates of the Caribbean!
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
It will be hilarious if ATI lets you turn it off in the next set of drivers. :D
 

webmal

Banned
Dec 31, 2003
144
0
0
Originally posted by: fsstrike

If it looks like trilinear, feels like trilinear, and has good FPS, what's the problem?

If it looks like a pussy and feels like a pussy but is actually an anus, then it's OK? :confused:
 

sandorski

No Lifer
Oct 10, 1999
70,697
6,257
126
Originally posted by: Shamrock
http://www.nvnews.net/images/news/200405/boundary.png

you can't see that? you know, that thin line that separates the AF from the non-AF?

Ok, I see what you are getting at now, but how do we know that it isn't also that way with the Radeon in full Trilinear if there is no comparison? It could be a bug with ATI cards and that particular game for all we know.

BTW, if you look at the plants at a distance, IMO, the graphics quality of the Radeon shot seems better than that of the nVidia shot. The nVidia shot looks blurrier.
 

imported_Aelius

Golden Member
Apr 25, 2004
1,988
0
0
Those two shots show that nVidia has a blurry image both up close and farther away, while the ATI card has lines but isn't blurry from front to back.

Not sure why. It sure as heck doesn't prove anything.

It just leaves more questions.