
What is firingsquad smoking???

Originally posted by: otispunkmeyer
I like the way they benchmark with AA and AF in pretty much every test... they must know ATI have better AA and AF than NVIDIA... but I have to say NV are getting better, they at least win a few of the tests.

But I like to see raw numbers, since I don't play with either AA or AF.

Fair enough, but...

Why buy a $350-$500 video card and not use AA/AF? You'd be way better off getting a 9800 Pro or 5900XT.

It is understandable that they cannot cover every user's needs, so they try to fit the most common denominator. I think in this case it is fair to assume that the future does involve using AA/AF; otherwise, what is the point of such high-end cards and advances in visuals? One benefit of the high-end market is the perk of being able to use these features, hence the elevated price of these products. In this case, is it unfair for them to benchmark the cards with AA/AF even if ATI is slightly faster? I don't think it is.

Of course, in terms of "raw" performance, either company provides products that are fast enough, making it absolutely irrelevant which card you purchase if "raw" performance fits your needs.
 
Originally posted by: RussianSensation

Fair enough, but...

Why buy a $350-$500 video card and not use AA/AF? You'd be way better off getting a 9800 Pro or 5900XT.

It is understandable that they cannot cover every user's needs, so they try to fit the most common denominator. I think in this case it is fair to assume that the future does involve using AA/AF; otherwise, what is the point of such high-end cards and advances in visuals? One benefit of the high-end market is the perk of being able to use these features, hence the elevated price of these products. In this case, is it unfair for them to benchmark the cards with AA/AF even if ATI is slightly faster? I don't think it is.

Of course, in terms of "raw" performance, either company provides products that are fast enough, making it absolutely irrelevant which card you purchase if "raw" performance fits your needs.

Part of the demonstration of a card's power is its ability to juggle AA, AF, and high resolutions.
 
So now we can see that the X800 XT only lost TWICE in all the gaming benchmarks in this review, so how can you possibly compare the performance of the X800 XT to the 6800 GT when the X800 XT is generally faster than the 6800 Ultra? I think that is what Gururu is trying to point out. This helps to justify the high price of the X800 XT on the market, well above the $450 MSRP at most places online. It also goes to show just how sh*t the 6800 Ultra is if it sells for $500, especially since a 6800 GT can be bought for around $380. The sweet spot for high end is still a 6800 GT overclocked to Ultra speeds, but for ultimate performance the X800 XT and X800 XT PE still rule. So what's new about these observations? Most importantly, since the benchmarks for this generation are so close, perhaps after discounting Far Cry and Doom 3, the 6800 GT, X800 Pro, X800 XT, X800 XT PE, and 6800 Ultra should all pretty much offer an equal level of gaming experience in real life. It's not like you are going to say, "Oh f*ck... my card only does 60 FPS, that's 10 FPS slower than yours... I hate my video card... it's so slow!"
Hmm, I disagree with this somewhat. If you compare the X800 Pro to the XT PE, you'll find that the latter can be up to 35% faster or so. I hardly consider that a negligible difference.

Even if you only look at 16-pipe cards, the XT PE offers a solid 20% improvement in many games. In UT2k4, that ends up meaning the XT can run a full resolution step higher while delivering the same framerate as the GT. Doom 3, of course, is the exception.

IMO anything above 20-25% is a substantial difference, as that is the gap between 40 and 50 fps. That 100 vs 70 fps argument becomes relevant in a year when it becomes 50 vs 35.
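Here is a minimal sketch of that percent math (the fps figures are just the illustrative numbers from this thread, not new benchmark results):

def percent_faster(fast_fps, slow_fps):
    # How much faster fast_fps is than slow_fps, in percent.
    return (fast_fps - slow_fps) / slow_fps * 100.0

print(percent_faster(50, 40))   # 25.0  -> the 40 vs 50 fps gap
print(percent_faster(100, 70))  # ~42.9 -> today's "raw" lead
print(percent_faster(50, 35))   # ~42.9 -> the same lead once framerates halve

The point is that halving both framerates leaves the relative gap untouched, which is why a lead that looks academic at 100 vs 70 fps starts to matter once next year's games drag both cards down to 50 vs 35.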

The higher end cards over 6800 GT might not be worth the price premium, but calling them equal is wrong.

Just my $.02.
 