
Proof: ATI's X800 Tri is much better than NV's 6800 Bri.

BTW I think it is very apparent both brilinear filtering methods produce almost exactly the same iq.
Lol.:roll:


They did this obviously because they knew if the 6800s were allowed to run in a mode much more like the X800s they would lose.
Right. Not like the XT doesn't trash the 6800 lineup already for a cheaper price.
 
Originally posted by: GeneralGrievous
BTW I think it is very apparent both brilinear filtering methods produce almost exactly the same iq.
Lol.:roll:


They did this obviously because they knew if the 6800s were allowed to run in a mode much more like the X800s they would lose.
Right. Not like the XT doesn't trash the 6800 lineup already for a cheaper price.

A cheaper price? XTs are moving for $570+
 
No, they are honoring the prices. And even put them back up. You will get it for that price, just not until mid/end July.
 
omg i cannot believe someone is posting about this again. There is barely any difference between this "garbage". Please raise your hand if you think that stopping in a game, looking down, taking a screenshot, zooming in, and analyzing the differences between pixels -- and finding one or two things wrong -- automatically makes something garbage with bad IQ.

Why do people try to start these arguments? Why can't we have an educated discussion instead of one like this?

As for this round of graphics cards, I believe Nvidia has won. ATI has indeed come out with another round of very efficient chips that are excellent. However, Nvidia, though its cards might need a lot of power, has won this round. They are a bit ahead of ATI right now in terms of benchmarks; I would estimate around a 65:35 win ratio. This is by a small margin, but Nvidia has room to improve with new drivers since it is a new architecture.

Maybe you see things differently, but here's what I made of the situation.

-Kevin
 
Originally posted by: ChkSix

Blastman: You're dead WRONG. Your first assumption claiming ATi's Adaptive Trilinear is 'Full Trillinear' shows that you're misinformed. No offense honestly, but I suggest you do a little research before you claim that something is one thing when in reality it is a completely different thing.
No, I'm not misinformed. And you're the one who's wrong here. I (amazingly) read through every thread at Beyond3d on the topic and looked at virtually every image that was posted. I participated in several of the discussions there when the story was breaking. I spent at least a couple of hours looking at the mipmaps in the xbit article with MS Photo Editor. So I think I have a pretty good idea of the quality level of ATI's adaptive filtering. There wasn't one confirmed case of someone spotting mipmaps on the X800 filtering -- nada, none, zippo, zero, didn't happen -- unless I missed it. They all turned out to be either borked settings (bilinear) or something else. If you have a link to a specific confirmed case of mipmaps on the X800 -- please point me to it.

Originally posted by: ChkSix

Next, IQ degradation with adaptive trilinear algorithm on the R420:
http://www.beyond3d.com/forum/viewtopic.php?t=12587
Did you actually read any of that thread, ChkSix? It doesn't seem so if you are trying to prove ATI's adaptive-Tri degrades IQ. If anything, the conclusion in that thread was that the R420 was better.

Originally posted by: ChkSix

This one is a kicker if you really believe ATi loses no IQ with its new filtering techniques.
http://www.beyond3d.com/forum/viewtopic.php?t=12578
Again, read the thread. A few posts down, the answer (quote): "after looking at the shots i can tell you that something is fishy with the r420 pics as they show pure bilinear filtering -> user error or driver bug" (end quote).


And what did THG conclude? THG:

"...ATi deserves credit for the fact that the image quality of the cards is not visibly compromised by this filtering; at least no example has yet been seen of this..."

The initial shot in that THG article supposedly showing ATI's "lowered" filtering is ridiculous. They obtained the shot from some other site and didn't post any details -- card, drivers, settings, tools, etc. -- so it can't be checked to see what it really means. I wonder why that is? Blowing up some minute bit differences using some super-contrast tool on a mipmap in the lab doesn't prove one will be able to see differences in games when the adaptive-Tri is really in action. They concluded at the end of the article there was no evidence it could be spotted anyway. Another site forced the adaptive-Tri on a colored mipmap to see a difference, which is just plain stupid. The adaptive-Tri isn't used on colored mipmaps. There is a lot of misinformation being spread on this topic, and it needs to be sorted through carefully.
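For anyone unfamiliar with what these "super-contrast" lab tools actually do, here is a minimal sketch (all names hypothetical, not any specific site's tool): they subtract two screenshots pixel by pixel and multiply the deltas by a large gain, so differences far too small to see in-game become glaring blobs. This is exactly why such images prove little about visible IQ.

```python
# Hypothetical illustration of a "super-contrast" screenshot diff:
# amplify tiny per-pixel differences so they become visible.

def amplified_diff(img_a, img_b, gain=32):
    """Return a per-pixel difference image, with each delta
    multiplied by `gain` and clamped to the 0-255 range."""
    result = []
    for row_a, row_b in zip(img_a, img_b):
        result.append([min(255, abs(a - b) * gain)
                       for a, b in zip(row_a, row_b)])
    return result

# Two tiny grayscale "screenshots" differing by at most 2 levels --
# invisible to the eye, obvious after a 32x gain.
shot_a = [[100, 101, 102], [100, 100, 100]]
shot_b = [[100, 103, 102], [101, 100, 100]]
print(amplified_diff(shot_a, shot_b))  # [[0, 64, 0], [32, 0, 0]]
```

A delta of 2 gray levels out of 255 becomes 64 after amplification -- which is the whole trick, and also why an amplified diff says nothing about what a player would notice at speed.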

Is there one strictly defined method for smoothing out mipmap transitions that counts as Trilinear? No, there is not. Unless you can make a case that ATI's new Adaptive-Trilinear is not doing as good a job at blending mipmaps as Full-Tri on other cards like the 6800 and 9800, the case is closed. The X800 is doing Full-Trilinear or Pure-Trilinear or whatever you want to label it. Everything I've seen to date proves ATI's public statement at the Tech Report that the filtering on the X800 is equal to Full-Tri.

(quote) "...Our testing, at least, has indicated that the texture filtering quality of the X800 series was always the same or better than the Radeon 9800 series, which does not use this adaptive technique..." (end quote)

Some of these sites like FiringSquad and Guru3D are just brain-dead on the issue. They're simply too lazy to do the proper research, so they say both NV and ATI have optimizations -- so we should compare the two. It's plain stupidity. They assume both optimizations are equal when it's clear from the info in my OP that they are not: NV reduces filtering quality, ATI doesn't. One needs to compare IQ -- not simply the fact that they both have some sort of optimization.

People are acting as if we need the old Trilinear just for the sake of having Trilinear. It's nonsense. What do you care what they call it or how it is achieved? The whole point is to make mipmap boundaries disappear -- and as already pointed out, the evidence shows ATI's adaptive-Tri does this as well as the "old" method of Trilinear on the 9800.
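For readers who haven't followed the "brilinear" debate: the difference between full trilinear and the reduced schemes can be sketched roughly like this (function names hypothetical; neither vendor has published their exact algorithm, so this is only a conceptual illustration). Full trilinear blends two adjacent mip levels linearly across the whole transition; the reduced "brilinear"-style schemes only blend inside a narrower band around the boundary and otherwise snap to a single mip level.

```python
# Conceptual sketch only -- not ATI's or NVIDIA's actual algorithm.

def trilinear_blend(sample_lo, sample_hi, lod_frac):
    """Full trilinear: linear blend between two adjacent mip levels
    across the entire transition (lod_frac in [0, 1])."""
    return sample_lo * (1.0 - lod_frac) + sample_hi * lod_frac

def reduced_blend(sample_lo, sample_hi, lod_frac, band=0.5):
    """"Brilinear"-style: blend only inside a band centered on the
    mip boundary; outside it, snap to one mip level (cheaper, but
    can make mip transitions visible if the band is too narrow)."""
    lo_edge = 0.5 - band / 2
    hi_edge = 0.5 + band / 2
    if lod_frac <= lo_edge:
        return sample_lo
    if lod_frac >= hi_edge:
        return sample_hi
    t = (lod_frac - lo_edge) / (hi_edge - lo_edge)
    return sample_lo * (1.0 - t) + sample_hi * t

# At the exact mip boundary both methods agree...
print(trilinear_blend(0.0, 1.0, 0.5), reduced_blend(0.0, 1.0, 0.5))
# ...but near the edges of the transition the reduced scheme snaps
# to a single mip level instead of blending.
print(trilinear_blend(0.0, 1.0, 0.1), reduced_blend(0.0, 1.0, 0.1))
```

The argument in this thread is essentially about how narrow that band can get before mipmap boundaries become visible in motion -- and whether an adaptive scheme that widens or narrows it per-texture loses anything detectable.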
 
Originally posted by: ChkSix
Originally posted by: SickBeast
Originally posted by: ikickpigeons
Originally posted by: ChkSix
I saw that article. The only place that is true is in painkiller...no difference anywhere else or in any other game whatsoever.

In my own opinion, the X800 line is garbage and has nothing going for it down the road. I would have bought an X800pro, but seeing that with early drivers the 6800GT stomps it, the GT will be in my second comp with the Ultra in the main.

Nuff said.

what he said

Sorry to call you both out, guys, but if you're calling a piece of hardware that performs nearly IDENTICALLY to the comparable nVidia stuff GARBAGE then you are both obviously fanboys. The X800's actually perform BETTER than the 6800's in many situations.

I agree that the 6800 series is overall better, but it's such a close call that to call X800 "garbage" shows bias and a lack of objectivity.

I guess you missed where I said "my own opinion", which I am more than entitled to. I used that term specifically so that no one would think I am trying to force my own point of view onto anyone else.

I was objective early on when the cards were initially released, but as more news becomes available (a lot of it bad, from different forums and from owners who actually bought the R420), I think forming an opinion from it does not show bias; in fact, it is the intelligent thing to do.

OMG you consider calling a card GARBAGE when it performs the SAME as the product offered by the only competitor OBJECTIVE??? Give me a break. You didn't post a single piece of evidence suggesting that the X800 is "garbage". You need to head on over to dictionary.com and look up the word "subjective". There will probably be a large portrait of you staring you in the face. You are seething with personal bias and SUBJECTIVITY.

Opinion or not, you need to admit that you are biased and to call your comments objective is laughable.
 
ATi fanbois in a corner. ouch. how could we let this happen to ourselves? 🙁 I guess we should hope ATi develops better products next time around. 🙁
 
Originally posted by: CaiNaM
Originally posted by: Bar81
I choose not to respond to your or aghmed's kind. You don't have anything particularly intelligent to add. We're done.

once again, when anything reasonable is put in front of you, you ignore it and continue regurgitating the same, sad rhetoric. here's hoping you can keep your word; i for one am glad you're done 🙂

CaiNaM, you are one of the few respectable, non-biased, objective people left in the video forum. On top of that, you seem very knowledgeable. I have had my own issues with Bar81 in the past. He is one of the most belligerent and disrespectful people you will come across on these boards, and believe me when I say that -- I've been a member of these boards for a long time. On top of that, he rarely has anything intelligent to add to the discussion. It's just flames on top of flames if he happens to disagree with you.

You would do well to reciprocate what he is doing to you (not the flames; I'm talking about the ignoring bit).
 
Originally posted by: Gamingphreek
As for this round of graphics cards i believe Nvidia has won.

won before they've delivered their product in any quantity whatsoever? i think that's a bit premature...
 
Originally posted by: Blastman
No, I'm not misinformed. And you're the one who's wrong here. I (amazingly) read through every thread at Beyond3d on the topic and looked at virtually every image that was posted. I participated in several of the discussions there when the story was breaking. I spent at least a couple of hours looking at the mipmaps in the xbit article with MS Photo Editor. So I think I have a pretty good idea of the quality level of ATI's adaptive filtering. There wasn't one confirmed case of someone spotting mipmaps on the X800 filtering -- nada, none, zippo, zero, didn't happen -- unless I missed it. They all turned out to be either borked settings (bilinear) or something else. If you have a link to a specific confirmed case of mipmaps on the X800 -- please point me to it.

http://www.darkvengeance.net/test/DAoC_2004-06-05.bmp

pretty easy to spot, even from a "still" screenshot. it's even more pronounced when moving.
 
Originally posted by: g3pro
ATi fanbois in a corner. ouch. how could we let this happen to ourselves? 🙁 I guess we should hope ATi develops better products next time around. 🙁

heh.. and on the other side, you have the nv fanbois in a feeding frenzy.. tho it's not much more than dozens of sharks fighting over a small scrap of bacon, lol...

frankly, i don't care much for 'fanbois' regardless of whose camp they side with. it degrades useful discussions into mindless flamewars.

the reality is, with the recent 'unofficial' driver release (56.72 is still the whql), the (still unreleased) GT marginally wins a few, loses a few, and shows a significant advantage in one title (in the limited games benchmarked): CoD, which has resulted in all the nv 'fanbois' coming out of hiding and proclaiming their card the "winner", when in fact there is no such thing. simply put, nv has a great product, and arguably shows a slight advantage... but ati has a strong offering as well.

both sides have their driver issues, such as far cry with nv and splinter cell with ati, and both sides have optimizations for which, depending on the particular situation, arguments can be made as to the slight superiority of each under unique circumstances.

i doubt either side is thru with performance improvements via improved drivers in the future (ati's new memory controller is only around 70% driver efficiency with current drivers, and they are rewriting the OGL side). new features such as sm3 or 3Dc have not yet shown their mettle, and may never in this generation (tho i still see sm3 a "trump card" over 3Dc, but that's just my speculation/opinion).

while i can certainly understand preferring one product over the other, to say one is significantly better than the other is completely unrealistic - both are very competitive products, and neither is a bad choice; it's more personal taste than actual differences...
 
Originally posted by: Blastman
Originally posted by: CaiNaM
http://www.darkvengeance.net/test/DAoC_2004-06-05.bmp

pretty easy to spot, even from a "still" screenshot. it's even more pronounced when moving.
Sorry, that IS NOT ATI's adaptive-Tri. NO way. Nada. NOT. Borked settings, user error, or the game is not doing Trilinear.


Jesus christ dude, get off of the crusade already.
It IS the card no matter how much you wanna fight it... it's the damn adaptive crap... he's not a moron or a new user. I'll bet my bottom dollar he can replicate that EVERY TIME on the 420 vs. his older GeForce card.
Now how the hell is it that the GeForce doesn't do it yet the 420 does?

CaiNaM... just for the hell of it, format then reinstall everything for each card... that way this guy will shut the hell up and stop claiming you don't know WTF you are doing and/or saying.
 
Originally posted by: g3pro
ATi fanbois in a corner. ouch. how could we let this happen to ourselves? 🙁 I guess we should hope ATi develops better products next time around. 🙁

Then we have Nvidiot hypocrites attacking the same ideals they defended last round, and even more hypocrites sitting back and pretending they are better than everybody else when they flame/attack/make stupid comments just as much as anyone, getting all worked up and then commenting on how everybody but them is obsessed with this stuff and relentlessly posting.
 
Originally posted by: Blastman
the game is not doing Trilinear.

IMHO

i've had other games (SWG mainly, though) do the same thing on occasion. not always, and without changing settings; even if i close and restart, it *might* not do that again, on both a 9800 Pro and an FX5600. i offer no explanation for this, and please don't take my comments as a flame.

Regards,
Y.A.F.
:beer:
 
Sorry, that IS NOT ATI's adaptive-Tri. NO way. Nada. NOT. Borked settings, user error, or the game is not doing Trilinear.

The burden of proof is on you. We'll be waiting.

p.s. And for all we know, above Painkiller shots may have resulted from "borked settings, user error". Something to think about 🙂
 