ATI's Radeon X800 texture filtering game at Tech Report


pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: nemesismk2

Imagine if ATI sold cars, they would advertise power steering but it would only work some of the time! :D


LOL :) - THAT'S A FUNNY ANALOGY - although not quite the same; I'm pretty sure that having no power steering in a car does NOT increase the car's performance, but quite the contrary :)
 

pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: Ackmed

The two optimizations are not even in the same ballpark. NV's degrades IQ, whereas ATi's doesn't. It's obvious some of you don't know the difference between brilinear and adaptive trilinear... you really shouldn't be putting your two cents' worth in.

Sorry if this sounds dumb, because I'm not sure I know the difference. So there's TRILINEAR filtering, which produces the very best IQ but with the greatest hit to performance, and then there's BILINEAR filtering, which doesn't produce as good IQ - although I'm not sure how much 'worse' - but with much less of a hit to performance.

>>>> this next one confuses me... ADAPTIVE TRILINEAR... I'm assuming that's a method of filtering that approximates TRILINEAR, with an IQ that in most scenarios is very near or equal to that of 'true' TRILINEAR, but without as much of a hit to performance... isn't this ADAPTIVE TRILINEAR known as BRILINEAR filtering?

Ive seen all 3 terms being thrown about;

TRILINEAR
BILINEAR
BRILINEAR

Are BI and BRI two distinctly different things? Are some people accidentally using the two terms interchangeably, and incorrectly? :confused:
 

pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: MechxWarrior

I personally do NOT blame ATi for this. I blame their PR/Marketing team for hiding this fact. It causes NO visual issues, its no problem for me.

I have to agree. Having said that, given all the hype surrounding the release of the new generation of cards, all the speculation about who would pull out the better product this time, and the inevitable scrutiny of their technologies, maybe ATi figured that their "Smart AF", had they publicly announced it as some new and wonderful feature, might be seen by some as a sign that ATi had to 'cut corners' to keep up with nVidia this time round. Conversely, it could have been heralded as an amazing and cool feature, but I'm sure it would have had just as many people debunking it as simply a 'cheat'.

>>> and this of course would have been from day #1. As someone said in a previous thread, upon release of a new product, you want to make the "most noise" when "the most people are listening". Had there been talk of ATi 'cutting corners' upon release of the X800, it might not have worked in their favour.

Now, however, a little while after all the excitement and hype has died down and, most importantly, with the benchmarks having come in, it probably won't affect their sales/credibility too much that there are 'rumours' of ATi cutting corners >>> and let's face it... these are just rumours, on a small scale... yes, they are blowing up big time on the boards on the net, but your average Joe will never know >>> he's seen the release material - he's made up his mind :)
 

pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: ZobarStyl


I personally just want to see both NV and ATi focus harder on the hardware instead of stooping to these lousy optimizations that negate what they accomplished by building such good chips.

An ugly side-effect of capitalism and western-world philosophy. It's ironic that while such ideals promote the competition that drives the advancement of technologies and products, at the same time the desire for profit and 'market share' leads companies to resort to tricks that in many ways undermine, as you quite rightly said, the quality of their products.

It's the old, time-tested human psyche that turns a good idea bad. I've always been a firm believer that the good lord had only one design flaw when making man, which was to give us a brain far too advanced for our physiology to keep pace with. :)

>>>> hence the world wars, Iraq and this darn ATi AF cheating - it's all the same :)
 

T9D

Diamond Member
Dec 1, 2001
5,320
6
0
I'm buying nvidia for sure this time.

The 6800 cards have PS3.0 and also don't have to use these cheats. I want to see the performance hit ATI takes if they take away their cheat, or think of how much better Nvidia's cards would test if they had the same cheat. So the raw power is in Nvidia's corner. Just wait until they improve their drivers even more, too.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
but to be honest all of the existing evidence makes this little algorithm look bad
I think you're confusing algorithm and PR again. I've yet to see sufficient proof that trylinear offers worse IQ. Though I'm expecting it to, the question is, "To what extent?"

Imagine if ATI sold cars, they would advertise power steering but it would only work some of the time!
It's a funny analogy in that some cars do come with variable power-steering. IOW, it's a bad analogy.
 

Curley

Senior member
Oct 30, 1999
368
3
76
Work Smarter not Harder.

Nvidia optimized its drivers to gain benchmark scores that the end user would see no benefit from.

Nvidia claimed DirectX 9 support, but only in small print (partial 2.0 shader support).

Nvidia's paper launch and PR material said you needed a 480-watt power supply and two molex connectors to run the card.

Whoops, too many complaints: now you only need a 350-watt power supply and one molex connector, and you will only see artifacts under certain circumstances (HardOCP review with one molex, 350-watt PS).

Nvidia again requires a PCI slot to install and run.

Unfortunately for me, I will buy a 6800 Ultra because of its onboard encoder, which will significantly improve video editing with Premiere Pro 1.5.

OK, ATI has improved its performance without degrading its image quality (but lied about how it did it).

Lowered the power consumption from its previous high-end card.

Uses only one slot, to accommodate Small Form Factor Shuttle XPCs.

Significantly lower priced than Nvidia's high-end solutions.

And the card was released the day of the paper launch.

This is what I have read over the last three weeks.

What is my point? I don't know but this is food for thought.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
We've gone from using Trilinear to Bilinear; can that be described as anything near to progress?
As ATI stated: trilinear is the act of blending between the mip-maps, so we still have trilinear, just that it's optimized.

In the article that spawned this thread there is a picture of an ATi slide that says reducing trilinear to bilinear is an unacceptable filtering trick, and I'm inclined to agree with ATi on that one.
We do not know all the details yet. Unfortunate as it may sound, it does seem like they stuck a knife through themselves, but not all the details have emerged - and until then I will just be keeping my eye on ATI.

why did no one bother to tell us, and why when we asked the devs that question as well as asking what this algorithm does, were they struggling to answer? One line answers to complex questions?
A perplexing question. The response about what trilinear is made sense to me, but this question does hold its ground. Maybe they thought that we would find it unacceptable.

Then they said the reason they didn't put in a 'always trilinear' option is to have a more simple layout in their settings...they'll put in a full slider bar for this algorithm but not a single lousy checkbox?
Agreed - that answer was pretty stupid. Just hide it under an advanced-options button, simple as that.

>>>> this next one confuses me... ADAPTIVE TRILINEAR... im assuming thats a method of filtering that approximates to TRILINEAR, with an IQ that in most scenarios is very near or equal to that of 'true' TRILINEAR but without as much of a hit to performance.... isnt this ADAPTIVE TRILINEAR known as BRILINEAR filtering?
What brilinear is, is pretty much trilinear with shorter transitions. In Nvidia's case there seems to be some IQ loss, but I can't confirm that, as I have never seen it. But in ATI's case, adaptive trilinear, it uses the so-called brilinear in places where IQ would not be compromised.

Trilinear Filtering; (====) = trilinear transition

|-----====-----====-----====-----

Brilinear Filtering; (==) = half trilinear, half bilinear transition

|-------==--------==---------==-----

Bilinear Filtering: | = bilinear transition

|--------|-----------|------------|-----
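The transition diagrams above can also be read as blend-weight curves. A toy sketch (not any vendor's actual algorithm; the `band` width is an arbitrary illustration knob) of how much weight the next mip level gets as a function of the fractional LOD between two levels:

```python
def blend_weight(frac, mode, band=0.4):
    """Weight of the *next* mip level, given the fractional LOD
    (0.0 = fully on the near level, 1.0 = fully on the far level)."""
    if mode == "bilinear":       # hard switch at the mip boundary
        return 0.0 if frac < 0.5 else 1.0
    if mode == "trilinear":      # smooth blend across the whole range
        return frac
    if mode == "brilinear":      # blend only inside a narrow band
        lo = 0.5 - band / 2
        return min(max((frac - lo) / band, 0.0), 1.0)

for frac in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(frac, [round(blend_weight(frac, m), 2)
                 for m in ("bilinear", "trilinear", "brilinear")])
```

The bilinear curve is a step, trilinear is a ramp over the whole range, and brilinear squeezes the same ramp into a narrow band around the boundary - matching the short `==` segments in the diagram.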

Nvidia again requires a pci slot to install and run.
It is recommended that you leave your PCI slot open anyway so that you don't suffocate your GPU, so that argument is dead.

Unfortunately for me, I will buy a 6800Ultra because of it's onboard encoder that will significantly improve video editing with Premiere Pro 1.5.
You have two other options, so don't feel too sorry for yourself.

Significantly lower priced than the Nvidia high end solutions.
How so?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
As ATI stated: trilinear is the act of blending between the mip-maps, so we still have trilinear, just that it's optimized.

lmao.. so what is bilinear then?
 

Curley

Senior member
Oct 30, 1999
368
3
76
6800 Ultra cost: $545.00, with the possibility of a new 500-watt power supply.

Radeon X800 XT Platinum price: $470.00.

Nvidia and BFG Technologies disagree on the 350-watt PS and one molex connector.

The PCI slot I was referring to was for Small Form Factor LAN boxes like the Shuttle XPC.

Also, what are my two other options on the video encoding chip? Canopus Xplode software specifically recommends Nvidia's new encoding chip.

Your points are well taken, and we could split hairs for days on these cards. In all honesty, I'd like to have one of each: ATI X800 XT Platinum and Nvidia 6800 Ultra.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
lmao.. so what is bilinear then?
According to this, which SHOWS THE DIFFERENCE BETWEEN trilinear, brilinear, and bilinear: brilinear only does half the blend that trilinear does. Look at the linked page and the next page for a nicer look. Brilinear does fit the technique - it looks like half bilinear and half trilinear when applied to mip-maps; it is harder to see when looking at colored maps.
 

Curley

Senior member
Oct 30, 1999
368
3
76
Here is a good article, and I am wondering what the final product will be.

"Of course this sat sour with me to some extent as it left me thinking that NVIDIA is sending us samples that in no way shape or form will represent retail products. Of course, if this is the case, then there are no reasons to preview their products."


In defense of my reference, I really hope they do not change the hardware encoder chip, as this is truly revolutionary for digital video editing.

Although Nvidia built their last few cards just for gaming, the 6800 Ultra will be, for once, a card with more depth and features than ATI's.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Your two other options would be the 6800 GT and the base model; most likely they will have the encoder.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Yeah, I'm inclined to agree with the guy who said that the two-slot problem is kind of moot... I've never had to put anything in the PCI slot below the vid card. If anything, the only thing that should go there is one of those thin blowers that sucks the air off the card for extra cooling.
 

pookie69

Senior member
Apr 16, 2004
305
0
0
Originally posted by: ZobarStyl
Yeah, I'm inclined to agree with the guy who said that the two-slot problem is kind of moot... I've never had to put anything in the PCI slot below the vid card. If anything, the only thing that should go there is one of those thin blowers that sucks the air off the card for extra cooling.

Agreed. It only makes sense.

;)
 

Curley

Senior member
Oct 30, 1999
368
3
76
Originally posted by: VIAN
Your two other options would be the 6800 GT and the base model; most likely they will have the encoder.

Thanks VIAN, I was thinking only the top of the line would have the chip. Now I'm excited again!!!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
You are right...ATi's adaptive trilinear is always working against you.
No it isn't. It is providing a speed gain without reducing IQ.

lmao.. so what is bilinear then?
Bilinear samples 4 texels from each mip-map, and their average colour produces the final texel. Thus bilinear can smooth the X and Y axes, but it can't affect the Z axis because each mip-map is still separately filtered.

Trilinear picks up 4 texels from the closest mip-map and then another 4 from the next closest. In addition to performing bilinear on both mip-maps it also creates a weighted average between them, thus smoothing the Z transitions.

Adaptive trilinear still samples between the mip-maps like normal trilinear but it might only use 6 or 7 samples if it can get away with not reducing IQ by doing so.

Brilinear usually does trilinear normally, but after a certain point in the transition it reverts to bilinear filtering. Because it doesn't use any intelligence, and because it does use bilinear filtering, it can and does impact IQ.
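The descriptions above can be sketched in a few lines of Python. This is a toy model, not ATI's or NVIDIA's actual hardware path: textures are 2-D lists of grey values, `lod` stands in for the precomputed level-of-detail, and `band` is a made-up knob for how narrow brilinear's blend region is.

```python
def bilinear(mip, u, v):
    """Average the 4 texels around (u, v) in one mip level,
    weighted by the fractional position (bilinear interpolation)."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = mip[y0][x0] * (1 - fx) + mip[y0][x1] * fx
    bot = mip[y1][x0] * (1 - fx) + mip[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear(mips, u, v, lod):
    """Bilinear on the two nearest mip levels, then a weighted
    average between them - this is what smooths the Z transition."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - int(lod)
    return (bilinear(mips[lo], u, v) * (1 - frac)
            + bilinear(mips[hi], u, v) * frac)

def brilinear(mips, u, v, lod, band=0.4):
    """Blend only inside a narrow band around the transition midpoint;
    outside it, the result is plain bilinear from one level."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - int(lod)
    t = min(max((frac - (0.5 - band / 2)) / band, 0.0), 1.0)
    return (bilinear(mips[lo], u, v) * (1 - t)
            + bilinear(mips[hi], u, v) * t)

# Two tiny mip levels: level 0 all black, level 1 all white.
mips = [[[0.0, 0.0], [0.0, 0.0]], [[1.0]]]
print(trilinear(mips, 0.5, 0.5, 0.25))  # partial blend between levels
print(brilinear(mips, 0.5, 0.5, 0.25))  # outside the band: pure level 0
```

It also shows where the speed comes from: outside the band only one mip level contributes, so a driver can skip the second set of 4 samples entirely, while full trilinear always pays for both.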
 

Luthien

Golden Member
Feb 1, 2004
1,721
0
0
My take: ATI lied on purpose, using subterfuge, in order to keep Nvidia from doing exactly what ATI was doing all along. ATI accused Nvidia of cheating because Nvidia was using and working on adaptive trilinear methods to improve benchmarks, and demanded that Nvidia be compared to ATI in benches with both using full trilinear - which, of course, we now know Nvidia was the only one keeping to, while ATI used its secret optimizations. ATI was afraid Nvidia would reap the same benefits ATI was reaping if Nvidia pursued its own adaptive trilinear strategy, so, seeking to improve its own sales and at the same time blacken Nvidia's reputation, ATI kept its adaptive trilinear strategy secret. This is why, IMO, ATI should be condemned as scumbags and lauded for being so perfectly duplicitous.

I posted this before but wanted to expound on it with the above:

"Well, OMG, ATI outsmarted Nvidia for two freaking years, LOL!!! ATI should be hired by Canada as a Canadian secret police, they bushwhacked Nvidia so well. Amazing, isn't it. I was buying an Nvidia 6800 whatever, and now I feel even better about it. I think Nvidia should sue ATI for fraudulent marketing in an attempt to put Nvidia out of business. ATI accuses Nvidia of the very thing ATI is doing, but ATI manages to hide it far better and do it better to boot. Tragic and funny at the same time. FREAKING AMAZING!!!"
"Well, OMG ATI outsmarted Nvidea for two freaking years, LOL!!! ATI should be hired by Canada as a Canadian secret police they bushwacked Nvidea so well. Amazing isnt it. I was buying Nvidea 6800 whatever and now I feel even better about it. I think Nvidea should sue ATI for fraudulent marketing in an attempt to put Nvidea out of business. ATI accuses Nvidea of the very thing ATI is doing but ATI manages to hide it far better and do it better to boot. Tragic and funny at the same time. FREAKING AMAZING!!!"