A different shootout: ATI vs. NVIDIA (open minds!)


Munky

Diamond Member
Feb 5, 2005
Originally posted by: Wreckage
Originally posted by: munky
I want benchmarks made by people who know how to make a valid benchmark.
Well, monkee, until ATI starts releasing major benchmarks you will never be happy.

You got a quote of me saying that? Or are you just trolling as usual?
 

Alaa

Senior member
Apr 26, 2005
Originally posted by: munky
Originally posted by: Alaa
Originally posted by: munky
Originally posted by: Alaa
Originally posted by: munky
I'm gonna make a chart that includes these sites (they've got just a little more technical knowledge than the others): ;)
http://www.computerbase.de/artikel/hard...900_cf-edition/1/#abschnitt_einleitung
http://www.hardware.fr/articles/605-1/r...xtx-x1900-xt-x1900-crossfire-test.html

Unless someone else wants to do it...

Maybe more ATI-biased? I think so. You love them, so they're biased.

No, ya newb. They're the kind of sites that have a much higher reputation among people who actually know something about the hardware.

You're just another fanboy. Do you gain anything from recommending ATI or using ATI cards? You just pay, and you want other people to pay too. Stupid.

LOL :laugh:
Hey, fanboi... since you're such a wiseguy, why don't you explain to us why one site shows NV winning in AOE3 outright, while two other sites show it winning without HDR and losing with HDR?
I only commented on you as a fanboy! I didn't talk about games... MUNKY THE BIG FANBOY
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Alaa
I only commented on you as a fanboy! I didn't talk about games... MUNKY THE BIG FANBOY

Answer the question, fanboi.
 

Elfear

Diamond Member
May 30, 2004
Originally posted by: Rollo
Originally posted by: Elfear
Interesting chart, although I don't know why benchmarks without AA are given as much emphasis as benchmarks with it. I mean really, who pays $500 for a card and doesn't use AA? That's like doing an exotic car shootout and then giving gas mileage as much emphasis as track times. :disgust:

Maybe as a baseline? Or for the online guys who want the 100fps minimums?


Good point, although most of the hardcore gamers I know just use a mid-grade card and fly with it. Of all the LAN parties I've ever been to, a 9800 Pro was considered high-end for most of the guys, even the hardcore headshotting-you-with-a-pistol-from-300-yards-while-I-jump-through-the-air kinda guys. I remember one was quite excited about receiving his new 6800NU this last December. Not saying that's bad at all, just stating what my experience has been with the hardcore guys.

IMO the overwhelming majority of people who buy $500 cards want to play with AA and AF, so those benchmarks should receive greater emphasis than the ones without. If card A won five benchmarks without AA/AF and card B won five benchmarks with AA/AF, which card would most people (willing to pay $500 for a card) buy?
 

Pete

Diamond Member
Oct 10, 1999
Wreckage, munky, beggarking, alaa, and everyone else name-calling and slinging insults: cease-fire, please? Let's try going without "monkey," "retard," "prick," "fanboy," and so on for a few days. If you're going to trump someone with logic or facts or links, go for it, but keep your venom and vitriol on your side of the keyboard. (Feel free to sneer, sigh, LOL, or roll your actual eyes at your monitor as much as you want.)

DeathReborn, munky is right. Forcing every site to use the same benchmarks will only ensure ATI and NV will spend most of their time optimizing just for those few benches, probably ignoring smaller titles. It's best if each site chooses its own range of titles, preferably not only the big sellers or those with handy benchmark modes.

munky, I compiled #s from a few more sites and posted them here (second to last post, as of now). I guess either no one saw or no one cares until you've got omg pics. I've since added Rage3D's #s, and I'll post a new summary (and pics!!1) in a bit.
 

Pete

Diamond Member
Oct 10, 1999
Originally posted by: Ronin
Originally posted by: munky
But that's the general trend in ATi's high end cards, and something to be considered for anyone planning to buy one of these cards.
Edited for truth.
Well, recent truth, maybe. We've seen the X800XT and the 7800GTX stay in their predecessors' power envelopes b/c they're both basically the same tech on a smaller process. The 6800U was a more seriously revamped core on the same old process, true, but it also dropped core speed and used newer RAM. And you of all ppl should know the GTX-512 puts a hurtin' on a PSU. :)

I don't think it'd be a stretch to think R600 and G80 will consume yet more power if they use even faster RAM and add more transistors to offset a smaller process. And if they're unified shader architectures, it might be reasonable to expect more of the core to be in use when gaming, so they may draw more power per transistor per clock simply b/c more transistors are active. Well, reasonable to this total EE nub, anyway. :p

Xbit just published a timely article: Power Consumption of Modern Graphics Cards 2. About all I can gather is that power draw starts climbing far faster than linearly past a certain clock boundary. You can see that most easily with X1800XL to X1800XT and 6600 DDR2 to 6600GT: double the power draw for much smaller clock bumps. (Though you have to think something's up with the 6600GT if it draws about as much power as a 6800GS, which has a larger core and twice the RAM.)

Yeah, we see the XTX drawing a lot more than the GTX-512. ATI should be happy that the similarly-clocked X1900XT and X1800XT each draw basically the same power. Considering R580 is ~380M transistors and G70 is ~300M, R580 doesn't look too bad drawing 30% more power for ~27% more transistors. Of course, R580 is 90nm and G70 is 110nm, so that's not an entirely even comparison--this may bode well for a G71 that gains 80M+ transistors but drops to 90nm (low-k?).
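
For anyone who wants to check that math, here's a back-of-the-envelope sketch in Python. Only the ~30% power gap and the transistor counts come from the numbers above; the baseline wattage is an arbitrary placeholder, not a measurement:

```python
# Rough power-per-transistor comparison for R580 vs. G70.
# Transistor counts and the ~30% power gap are from the discussion
# above; the baseline wattage is a placeholder, not measured data.
r580_transistors = 380e6  # ~380M transistors (90nm)
g70_transistors = 300e6   # ~300M transistors (110nm)

g70_power = 100.0              # arbitrary placeholder baseline
r580_power = g70_power * 1.30  # ~30% higher draw, per the Xbit numbers

transistor_ratio = r580_transistors / g70_transistors  # ~1.27
power_ratio = r580_power / g70_power                   # 1.30

print(f"Transistors: +{transistor_ratio - 1:.0%}")  # +27%
print(f"Power:       +{power_ratio - 1:.0%}")       # +30%
print(f"Power per transistor vs. G70: {power_ratio / transistor_ratio:.2f}x")  # 1.03x
```

Normalized per transistor, the two come out nearly even despite the different processes--which is the sense in which R580 "doesn't look too bad."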
 

Pete

Diamond Member
Oct 10, 1999
OK, for the trifecta....

I've compiled some 16x12 0x0 and 4x16 #s from sites MikeC didn't include but that I thought merited equal attention. I'm also going to split the #s a bunch of different ways b/c I'm stupid (masochistic) like that. I'll accept the fact that we're comparing two cards that I don't consider reasonable purchases for various reasons, just b/c I don't want to redo MikeC's work. (So I'm not a total masochist--sue me. ;))

MikeC's chart takes #s from AT, ET, FS, HH, TR, and Xbit, and gives the X1900XTX a win-loss-draw total of 43-38-2.

My chart takes #s from ComputerBase.de, Digit-Life.com (aka iXBT), Hardware.fr (aka BeHardware.com), Hexus.net, Rage3D, and ZDNet.de. Yeah, a few foreign sites, but they've been around a while and seem solid. These sites give the XTX a record of 49-44-3, so still pretty close.

Combining both, the XTX is 92-82-5. Here's a summary.

I'm going to wander off the beaten path, so feel free to skip this italicized bit.

It might be more enlightening to work with the #s a bit. For instance, I've always thought calling 33 vs. 32fps a win was a bit of a stretch. So I got a bit more liberal with the conditions for a draw. Let's see how the records change.

If I define a draw as:

<5% difference, XTX is 79-62-38.
<10% difference, XTX is 60-50-69.
<15% difference, XTX is 50-45-84.

The XTX still wins more tests than the GTX-512, and its win percentage declines only slightly. So, no real change. I think pushing a "draw" up to even 15% is fair, as I'm not sure you'll easily notice 30 vs. 34.5fps, or even 100 vs 115fps. But that's just me.
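
If anyone wants to play with the cutoffs themselves, here's a minimal sketch of the reclassification in Python. The fps pairs are made up for illustration--they're not the actual benchmark results; only the method matches what I did above:

```python
# Re-tally per-benchmark results as XTX win/loss/draw, treating any
# gap at or below a chosen percentage as a draw.

def tally(results, cutoff):
    """results: (xtx_fps, gtx_fps) pairs; cutoff: 0.05 means a <=5% gap is a draw."""
    wins = losses = draws = 0
    for xtx, gtx in results:
        gap = abs(xtx - gtx) / min(xtx, gtx)  # gap relative to the slower card
        if gap <= cutoff:
            draws += 1
        elif xtx > gtx:
            wins += 1
        else:
            losses += 1
    return wins, losses, draws

# Made-up fps pairs, NOT the real benchmark numbers.
sample = [(33, 32), (60, 52), (45, 51), (100, 115), (34.5, 30)]
for cutoff in (0.05, 0.10, 0.15):
    w, l, d = tally(sample, cutoff)
    print(f"draw if within {cutoff:.0%}: XTX {w}-{l}-{d}")
```

The only real judgment call is measuring the gap against the slower card, which is what makes 30 vs. 34.5fps a 15% difference.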


But do we really consider 16x12 without AA a valid test for these $650+ cards? Let's focus on two more enthusiast-oriented settings, HDR and AA+AF.

HDR is tricky, as there are only five games with explicit HDR modes, some of them don't run with the same precision on each card (which can result in visible differences), and some of these "HDR" modes don't appear to be anything more than a bloom effect (which blinds you, more than anything). In the five "HDR" games--AoE3, FC, HL2:LC, SC:CT, and SS2--the XTX's record at 16x12 0x0 is 6-2-0. Again, not really an even comparison, but it's there.

AA+AF is a bit easier to compare, as both cards should default to similar IQ benefits (ignoring HQ AF and TAA/AAA). So, looking at 16x12 w/ 4xAA & 8/16xAF #s, the XTX is 28-22-1.

Again, NV is still the convincing OpenGL champ. Tho ATI closed the gap in Doom 3 and Quake 4, NV still rocks in IL-2, Pacific Fighters, and Riddick.

I'm still not convinced XTX vs. GTX-512 is a worthwhile comparison, though, as the XTX is overpriced compared to the XT, and the GTX-512 as a single-card solution is both hard to find and overpriced relative to its performance vs. the XT(X). If we look down a notch, it's also not quite fair to compare an X1900XT to a 7800GTX b/c of different street prices ($490 vs. $445) and RAM sizes (512 vs. 256MB)--but the X1900XT wins just about every benchmark in this comparison, so it's easier to justify its price. Comparing an X1900XTX to a 7800 GTX-512 seems more an academic exercise than anything else, but at least the GTX-512 is a better hint of what's to come with the 7900. Obviously it's not a great hint, especially if the 7900 will sport significant increases in both core clock and # of pipes, but it's better than extrapolating from a GTX-256.

Aaaanyway, a few more weeks til 7900. Can't wait to see if NV's gonna do some "housecleaning" with ATI. :)
 

Shamrock

Golden Member
Oct 11, 1999
On a related note... if anyone bothered to read the forum thread, Mike is also gonna do a Crossfire vs. SLI chart soon (maybe this weekend?).
 

Munky

Diamond Member
Feb 5, 2005
Hey, nice work on the chart, Pete. One correction I'd make is on the AOE3 benches from ComputerBase - I believe they had HDR enabled in that test.
 

Alaa

Senior member
Apr 26, 2005
Originally posted by: munky
Answer the question, fanboi.
Why should I answer it? Am I the developer of the AOE3 part on NVIDIA cards? Just believe it... you're a fanboy, and you would recommend ATI cards even if NVIDIA's cards were released 6 months ahead of ATI's. Look at you, you're so happy that ATI got its cards out 1 month ahead. FANBOY :D Please don't quote me again, that's boring.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Alaa
Originally posted by: munky
Answer the question, fanboi.
Why should I answer it? Am I the developer of the AOE3 part on NVIDIA cards? Just believe it... you're a fanboy, and you would recommend ATI cards even if NVIDIA's cards were released 6 months ahead of ATI's. Look at you, you're so happy that ATI got its cards out 1 month ahead. FANBOY :D Please don't quote me again, that's boring.

Come back when you know the answer...
 

Alaa

Senior member
Apr 26, 2005
Originally posted by: munky
Come back when you know the answer...
I'm not coming back for fanboys, so stop quoting... I PRAY FOR THE 7900 TO BE LESS THAN 32 PIPES :D:D
 

Pete

Diamond Member
Oct 10, 1999
Originally posted by: munky
Hey, nice work on the chart, Pete. One correction I'd make is on the AOE3 benches from ComputerBase - I believe they had HDR enabled in that test.
Thanks, and arrrrrrrrgh. I really don't feel like redoing everything for that one #, so I'll just leave it in and hope no one else notices. He-he ... eh.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Pete
Thanks, and arrrrrrrrgh. I really don't feel like redoing everything for that one #, so I'll just leave it in and hope no one else notices. He-he ... eh.

Nothing a little photochop can't fix.

:beer:
 

Zebo

Elite Member
Jul 29, 2001
What I'm most perturbed by is how different AnandTech is from the other sites on FEAR and Splinter Cell.

Xbit Labs did the most thorough review, as usual.
 

mindless1

Diamond Member
Aug 11, 2001
Originally posted by: Pete
Xbit just published a timely article: Power Consumption of Modern Graphics Cards 2. About all I can gather is that power draw starts climbing far faster than linearly past a certain clock boundary. You can see that most easily with X1800XL to X1800XT and 6600 DDR2 to 6600GT: double the power draw for much smaller clock bumps. (Though you have to think something's up with the 6600GT if it draws about as much power as a 6800GS, which has a larger core and twice the RAM.)

This is commonly the case when the GPU core voltage is increased to retain stability.
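
To put rough numbers on why a voltage bump hurts so much: dynamic CMOS power scales roughly as P ≈ C·V²·f, so power grows with the square of voltage. A quick illustrative sketch in Python (the scaling percentages are made up, not measured from any card):

```python
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f, so a clock
# bump that also needs a voltage bump costs disproportionate power.
# The scaling factors below are illustrative, not measured values.

def relative_power(freq_scale, volt_scale):
    """Power relative to baseline for given frequency/voltage scaling."""
    return freq_scale * volt_scale ** 2

# 10% higher clock at the same voltage: ~10% more power.
print(f"{relative_power(1.10, 1.00):.2f}x")  # 1.10x

# Same 10% clock bump, but needing 20% more core voltage for stability:
print(f"{relative_power(1.10, 1.20):.2f}x")  # 1.58x
```

That squared voltage term is how a "much smaller clock bump" can land near double the power draw.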
 

Pete

Diamond Member
Oct 10, 1999
Yep, I've noticed. It's interesting, if I may extrapolate half-assedly, that NV seems to hit the process sweet spot closer to 400MHz since NV30 (130nm), whereas ATI's is closer to 500MHz since R420 (130nm low-k), before power draw shoots sky-high. I mean that, very roughly, NV's recent "volume" high-end parts tend to be around 400MHz (5800, 5900XT, 6800GT, 7800GT/X), whereas ATI's tend to be around 500MHz (X800XT, X1800XL, AIW X1900[XL]). Low-k in action?