G70 Vs. R520 All things being equal

Page 3 - AnandTech Forums

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I thought the article was very interesting. Not exactly sure what it all means, but it's interesting.
 

jam3

Member
Apr 9, 2003
90
0
0
How is it an error when at the moment it seems to give generally better performance? Even if it is only a moderate improvement?

Power usage, plain and simple. The X1800XT uses approximately 50 more watts of power than the 7800GTX. And if you put them both on the same coolers I would bet there would be a more noticeable gap in heat as well (would like to see some cooling tests). This is very reminiscent of AMD vs. Intel: Nvidia, like AMD, runs lower MHz with more pipes, less power, and less heat; ATI, like Intel, runs fewer pipes at higher MHz with more power and more heat.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: jam3
How is it an error when at the moment it seems to give generally better performance? Even if it is only a moderate improvement?

Power usage, plain and simple. The X1800XT uses approximately 50 more watts of power than the 7800GTX. And if you put them both on the same coolers I would bet there would be a more noticeable gap in heat as well (would like to see some cooling tests). This is very reminiscent of AMD vs. Intel: Nvidia, like AMD, runs lower MHz with more pipes, less power, and less heat; ATI, like Intel, runs fewer pipes at higher MHz with more power and more heat.


From what I hear, the power usage increase is due to the memory. Anyone with XLs care to share their power consumption compared to the GT?
 

jam3

Member
Apr 9, 2003
90
0
0
Oh, and I see a lot of people saying that this type of test is not useful in making a purchasing decision, which is just completely wrong, although this particular article is not as extensive as I wish it would have been.

This is basically establishing the baseline; once some serious overclocking tests are done, you can use it as a comparison for the overclock results to determine price/watt, price/performance, and heat/cooling.
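As a rough sketch of the kind of baseline math suggested above, here is how price/performance and performance/watt ratios could be tabulated. The fps, wattage, and price figures below are illustrative placeholders, not measurements from this thread:

```python
# Sketch of baseline perf/watt and perf/dollar comparisons.
# All numbers here are illustrative placeholders, not real benchmarks.
cards = {
    "7800GTX": {"fps": 84, "watts": 110, "price": 499},
    "X1800XT": {"fps": 57, "watts": 160, "price": 549},
}

for name, c in cards.items():
    perf_per_watt = c["fps"] / c["watts"]      # fps per watt drawn
    perf_per_dollar = c["fps"] / c["price"]    # fps per dollar spent
    print(f"{name}: {perf_per_watt:.3f} fps/W, {perf_per_dollar:.3f} fps/$")
```

Once real overclocked numbers exist, dropping them into a table like this makes the "baseline vs. overclock" comparison the poster describes straightforward.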
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Matt2
Originally posted by: munky
Originally posted by: Matt2
Originally posted by: munky
Originally posted by: DLeRium
Originally posted by: LTC8K6
So, what will ATI have if they get 20 or 24 "pixel processors" going considering they are doing pretty well with just 16?

They're not going pretty well with 16. NV can just clock higher and then they would be going pretty well. This bench shows NV has the upper hand because of higher efficiency. Many of us, including me, were fooled into thinking the R520 would be more efficient with only 16 pipes, but in reality performance is more responsive to clock speed, so the R520 doesn't make up ground with more efficient pipes, but rather with just faster clocked stuff....

I disagree. If anything, it's the R520 that has more efficient "pipes". If you multiply the number of pipes by the clock frequency, both cards have roughly 10 gigatexels per second of fillrate, with the GTX having a bit more.

http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=14
If you look at the above shadermark benches, you can see that each card wins some and loses some. But then look at the second chart, showing flow control (aka dynamic branching) performance, and you can definitely see which card is more efficient running the hyped SM3 feature.
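The pipes-times-clock arithmetic in the quote above can be checked quickly; the pipe counts and core clocks below are the stock figures for each card as discussed in the thread:

```python
def fillrate_gtexels(pipes: int, core_mhz: int) -> float:
    """Peak texel fillrate in gigatexels/s: pipes * core clock (MHz) / 1000."""
    return pipes * core_mhz / 1000.0

# Stock configurations: X1800XT runs 16 pipes at 625 MHz,
# 7800GTX runs 24 pipes at 430 MHz.
r520 = fillrate_gtexels(16, 625)
g70 = fillrate_gtexels(24, 430)
print(f"R520: {r520:.2f} GT/s, G70: {g70:.2f} GT/s")
```

Both land right around 10 GT/s, with the GTX slightly ahead, which matches the claim.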

I think you're wrong, unless I am misunderstanding you.

With the help of rivatuner we were able to ensure the card had 8 pipelines disabled while leaving the 8 vertex units intact.

With the same 450/1000 clock and BOTH cards using 16 pipes, the G70 clearly beat the R520 in the games that were benched. That's not to say it will happen in all games, but even in CS:VST, an engine that favors ATI, the G70 beat the R520... and beat it pretty soundly, no 1% victory here. Even in 3DMark, where the R520 beats the GTX in every comparison I have seen, it loses to the G70 by almost 500 3DMarks.

It seems pretty clear to me that when pipes and clocks are the same, Nvidia does have the faster design.

I meant comparing the performance and efficiency of the cards as they are, without downclocking and disabling pipes. The whole topic of video card efficiency is pretty pointless, since you're comparing different games coded in different ways running on different hardware that favors some methods over others. What I was referring to is the specific case of running shaders with flow control.


Hey, with that system of yours, why even bother with this thread? Shouldn't you be helping John Carmack code Doom 20?

You should be a comedian...
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: Cooler
Ok They under clocked that card :confused:. Why not just OC GTX to ATI speed maybee because they cant without extream cooling. Thus showing the strong point of r520 extream clock speeds.



Are you using "extream" spelling???
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
When you compare the two cards "apples to apples" the ATi cards don't seem very efficient. If NVIDIA does have some headroom with their clock speeds they could release a higher clocked 512 part and just about dominate.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I don't see what the big deal is really guys.

Clock for clock, pipe for pipe, vertex unit for vertex unit.

This test, however brief, shows which GPU does more, or less, in a single clock when all variables that can be equalized are equalized.

G70: 450 core, 1000MHz mem, 16 pipes (2 quads disabled), 8 vertex units.

R520: 455 core, 1000MHz mem, 16 pipes, 8 vertex units.

2 games tested: Doom3 and CS:Source (Most efficient GPU in these 2 games: G70)

Synthetic Bench: 3DMark05 (G70 by 400 points)

Synthetic Bench: ShaderMark 2.1 (G70- 6717) (R520- 6686) G70 very slightly ahead.

I would definitely love to see a whole mess of games tested like this, and I hope Ronin can provide this like he plans to.

So, anyone can argue that these are totally different architectures, and they would be right. But that is exactly why this test is a good indication of whose architecture is more efficient. All things equal (clocks, pipes, units), there is nothing left to differentiate except the architecture. Out of these tests (and we would need a great many more), the G70 is the more efficient. It seems G70's pipes and vertex units are just a tad more extreme than ATI's. I wanted this type of review for the longest time and I'm glad someone who had the resources took the time to do it. Albeit very briefly.
Maybe we can email this reviewer to get more games tested like HL2, FarCry, SC:CT, you name it.

Sorry to interrupt with logic in the middle of all the blasting. Please continue. ;)
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
I find the overclocking potential of these cards insane though... anyone see what Macci did with them? He beat a 7800gtx SLI rig with a single x1800xt with his custom cooling... Some people are rumoring over 10k with air in 3dmark on the xt's... that's nuts
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
I don't think Doom3 shows which has a more efficient core, since Doom3 is such a wash for Nvidia cards. Sure, it shows how good NV cards are at heavy stencil shadow loads, but in terms of efficiency, those numbers are in line with full clockspeed comparisons as well.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: crazySOB297
I find the overclocking potential of these cards insane though... anyone see what Macci did with them? He beat a 7800gtx SLI rig with a single x1800xt with his custom cooling... Some people are rumoring over 10k with air in 3dmark on the xt's... that's nuts


Yes and we all know how important 3dmark05 is in determining real world gaming performance...
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: crazySOB297
I find the overclocking potential of these cards insane though... anyone see what Macci did with them? He beat a 7800gtx SLI rig with a single x1800xt with his custom cooling... Some people are rumoring over 10k with air in 3dmark on the xt's... that's nuts

Both cards can OC just as well.

The 7800GTX, when OC'd, achieved nearly the same as SLI'd GTXs as well.

-Kevin
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: jiffylube1024
I don't think Doom3 shows which has a more efficient core, since Doom3 is such a wash for Nvidia cards. Sure, it shows how good NV cards are at heavy stencil shadow loads, but in terms of efficiency, those numbers are in line with full clockspeed comparisons as well.

Well, I did mention that more game testing was needed. But for the sake of your comment, remove Doom3 completely from the test. And I am not quite sure what you mean by "those numbers are in line with full clockspeed comparisons as well." What exactly do you mean?

 

IeraseU

Senior member
Aug 25, 2004
778
0
71
The test doesn't even show AA or AF filtering. IMO this is where ATI cards excel; they seem to take less of a hit under those conditions.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: IeraseU
The test doesn't even show AA or AF filtering. IMO this is where ATI cards excel; they seem to take less of a hit under those conditions.

On the contrary, they all have AA + AF enabled.

-Kevin

Edit: Err both games tested have AA + AF enabled
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: IeraseU
The test doesn't even show AA or AF filtering. IMO this is where ATI cards excel; they seem to take less of a hit under those conditions.

Doom3: 1600x1200 4AA 8AF high
R520: 34 (23fps hit)
G70: 48 (36 fps hit)

Doom3: 1600x1200 0AA 8AF high
R520: 57
G70: 84


Counter Strike Source: 1600x1200 4AA 16AF high
R520: 91 (24fps hit)
G70: 99 (32fps hit)

Counter Strike Source: 1600x1200 0AA 8AF high
R520: 115
G70: 131

Yes, the R520 definitely takes less of a hit when AA is enabled. But they did in fact use AA and AF, even though you said they did not. Why did you say that?
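The relative size of the AA hit in those numbers is easy to work out as a percentage drop (fps figures taken from the benchmarks quoted just above):

```python
def aa_hit_pct(no_aa_fps: float, aa_fps: float) -> float:
    """Percentage of framerate lost when AA is enabled."""
    return (no_aa_fps - aa_fps) / no_aa_fps * 100.0

# Doom3 1600x1200: R520 57 -> 34 fps, G70 84 -> 48 fps
print(f"Doom3 R520: {aa_hit_pct(57, 34):.1f}%, G70: {aa_hit_pct(84, 48):.1f}%")
# CS:Source 1600x1200: R520 115 -> 91 fps, G70 131 -> 99 fps
print(f"CS:S  R520: {aa_hit_pct(115, 91):.1f}%, G70: {aa_hit_pct(131, 99):.1f}%")
```

Even in relative terms the R520's hit is smaller: roughly 40% vs. 43% in Doom3 and 21% vs. 24% in CS:Source.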

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: keysplayr2003
Originally posted by: jiffylube1024
I don't think Doom3 shows which has a more efficient core, since Doom3 is such a wash for Nvidia cards. Sure, it shows how good NV cards are at heavy stencil shadow loads, but in terms of efficiency, those numbers are in line with full clockspeed comparisons as well.

Well, I did mention that more game testing was needed. But for the sake of your comment, remove Doom3 completely from the test. And I am not quite sure what you mean by "those numbers are in line with full clockspeed comparisons as well." What exactly do you mean?

I just meant that the difference in framerate between the two cards in Doom3 was pretty similar to the X1800XT running at 625 MHz and the 7800GTX running @ 24 pipes.

I do agree that in order to make this comparison make any sense, a lot more tests would need to be run.

Originally posted by: IeraseU
The test doesn't even show AA or AF filtering. IMO this is where ATI cards excel; they seem to take less of a hit under those conditions.

^ This would no longer be the case, as the ATI card has been stripped of a third of its memory bandwidth (running at 1 GHz vs 1.5 GHz on stock X1800XTs), while the Nvidia card has lost only about 17% of its memory bandwidth, dropping from 1.2 GHz to 1 GHz.

So, while I do admit that a full gamut of tests at these speeds would be interesting for speculation, aside from an e-penis comparison within an e-penis comparison ("my 7800GTX is faster than your X1800XT, plus it's more efficient clock-per-clock!"), I think the test is, on the whole, trivial. Interesting - yes. But trivial nonetheless.
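The bandwidth handicap described above can be quantified directly from the memory clocks quoted in the thread (stock vs. the 1 GHz test clock, bus width unchanged):

```python
def bw_loss_pct(stock_mhz: float, test_mhz: float) -> float:
    """Percent of memory bandwidth lost by downclocking (same bus width)."""
    return (stock_mhz - test_mhz) / stock_mhz * 100.0

print(f"X1800XT: {bw_loss_pct(1500, 1000):.1f}% lost (1.5 GHz -> 1.0 GHz)")
print(f"7800GTX: {bw_loss_pct(1200, 1000):.1f}% lost (1.2 GHz -> 1.0 GHz)")
```

It works out to about 33% for the X1800XT against roughly 17% for the GTX, so the equalized memory clock does penalize the ATI card about twice as hard.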
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003
Originally posted by: jiffylube1024
I don't think Doom3 shows which has a more efficient core since Doom3 is such as wash for Nvidia cards. Sure, it shows how good NV cards are at heavy stencil shadow loads, but in terms of efficiency, those numbers are in line with full clockspeed comparisions as well.

Well, I did mention that more game testing was needed. But for the sake of your comment, remove Doom3 completely from the test. And I am not quite sure what you mean by "those numbers are in line with fulll clockspeed comparisons as well.". What exactly do you mean?

I just meant that the difference in framerate between the two cards in Doom3 was pretty similar to the X1800XT running at 625 MHz and the 7800GTX running @ 24 pipes.

I do agree that in order to make this comparison make any sense, a lot more tests would need to be run.

Originally posted by: IeraseU
The test doesn't even show AA or AF filtering. IMO this is where ATI cards excel; they seem to take less of a hit under those conditions.

^ This would no longer be the case, as the ATI card has been stripped of a third of its memory bandwidth (running at 1 GHz vs 1.5 GHz on stock X1800XTs), while the Nvidia card has lost only about 17% of its memory bandwidth, dropping from 1.2 GHz to 1 GHz.

So, while I do admit that a full gamut of tests at these speeds would be interesting for speculation, aside from an e-penis comparison within an e-penis comparison ("my 7800GTX is faster than your X1800XT, plus it's more efficient clock-per-clock!"), I think the test is, on the whole, trivial. Interesting - yes. But trivial nonetheless.

Interesting, very much so. These are the things I enjoy discussing. Not fighting over.

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: jiffylube1024
So, while I do admit that a full gamut of tests at these speeds would be interesting for speculation, aside from an e-penis comparison within an e-penis comparison ("my 7800GTX is faster than your X1800XT, plus it's more efficient clock-per-clock!"), I think the test is, on the whole, trivial. Interesting - yes. But trivial nonetheless.
Unless of course NVIDIA does come out with a higher clocked card. The 512 GTX is already rumored to have better memory. So this comparison could give us a hint as to how well it will perform.

 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Originally posted by: crazySOB297
I find the overclocking potential of these cards insane though... anyone see what Macci did with them? He beat a 7800gtx SLI rig with a single x1800xt with his custom cooling... Some people are rumoring over 10k with air in 3dmark on the xt's... that's nuts


The XL may be a good reason why we don't judge a whole line on a few possibly "hand-picked cherries" that the reviewers often get... This has happened for years in the CPU market....

I would wait to see more results before I assume all XTs will OC the same....

Remember, Macci was running ice cooling and had an FX OC'd to 3.6 GHz... Most won't touch his numbers...
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
3,784
3,101
146
How stupid can I be, with my first post on this forum?
(post meant for 7800GTX vs. X1800XT)
Is there any way to delete a post?


Crazy.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Wreckage
Originally posted by: jiffylube1024
So, while I do admit that a full gamut of tests at these speeds would be interesting for speculation, aside from an e-penis comparison within an e-penis comparison ("my 7800GTX is faster than your X1800XT, plus it's more efficient clock-per-clock!"), I think the test is, on the whole, trivial. Interesting - yes. But trivial nonetheless.
Unless of course NVIDIA does come out with a higher clocked card. The 512 GTX is already rumored to have better memory. So this comparison could give us a hint as to how well it will perform.

This is supposed to come out November 1st. And it could have the same memory used on the X1800XT. So, tack $100.00 onto this card over the 256MB GTXs. Crap. :D

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: XabanakFanatik
How stupid can I be, with my first post on this forum?
(post meant for 7800GTX vs. X1800XT)
Is there any way to delete a post?


Crazy.


Don't beat yourself up over it. No we can't delete posts, only edit. And welcome to the forums!
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Wreckage
Originally posted by: jiffylube1024
So, while I do admit that a full gamut of tests at these speeds would be interesting for speculation, aside from an e-penis comparison within an e-penis comparison ("my 7800GTX is faster than your X1800XT, plus it's more efficient clock-per-clock!"), I think the test is, on the whole, trivial. Interesting - yes. But trivial nonetheless.
Unless of course NVIDIA does come out with a higher clocked card. The 512 GTX is already rumored to have better memory. So this comparison could give us a hint as to how well it will perform.


Nvidia's higher-clocked card will still compete against an X1800XT running at 625/1500, not 450/1000.

If you want to know how Nvidia's higher-clocked card will perform, peruse various overclocking threads for a good indication.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I wonder if the extra 512MB of memory will help the G70 with its AA/AF stamina.

EDIT: I meant "extra 256"