X1900XTX Benchmarks


imported_dwalton

Junior Member
Aug 12, 2005
19
0
0
I don't trust the scores. Everyone has assumed, including me, that the X1900 would be roughly a higher-piped version of the 1600XT (4x the pipes and shaders). The problem is that the 1600XT posts abnormally high 3DMark05 scores. Look at the reviews on the web: a 1600XT will outscore a 6800GS, 6800GT, and X800XL, and come within 200-500 points of a 6800 Ultra and an X850XT in 3DMark05. However, in the real world it can barely outperform a vanilla 6800.

I always assumed that the inflated 3DMark05 scores were the result of the additional shaders per pipe, and that the 580s would show the same phenomenon: a monster 3DMark05 score (~12000-13000), but real-world game scores with a smaller performance delta versus anything else (excluding the 7900, which would be a toss-up).
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: dwalton
I don't trust the scores. Everyone has assumed, including me, that the X1900 would be roughly a higher-piped version of the 1600XT (4x the pipes and shaders). The problem is that the 1600XT posts abnormally high 3DMark05 scores. Look at the reviews on the web: a 1600XT will outscore a 6800GS, 6800GT, and X800XL, and come within 200-500 points of a 6800 Ultra and an X850XT in 3DMark05. However, in the real world it can barely outperform a vanilla 6800.

I always assumed that the inflated 3DMark05 scores were the result of the additional shaders per pipe, and that the 580s would show the same phenomenon: a monster 3DMark05 score (~12000-13000), but real-world game scores with a smaller performance delta versus anything else (excluding the 7900, which would be a toss-up).

Interesting post. It does seem likely that if the R580 were to beat the crap out of every other card, it would be in 3DMark and not necessarily in actual games.
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
Originally posted by: Looney
Who freaking cares about ATI (and I used to be such a huge ATI fan) when the card won't even be available in the retail channel for months? By that time, there's going to be another 'new' card being released. The X1800s are just finally out there, and now there's the X1900?

Wow, horrible assumption. The X1800 was late because it ran into unexpected problems. You should expect new/innovative technology to be delayed, particularly Crossfire. New tech needs to mature, so no bashing Crossfire just yet. R580 is supposed to be on schedule and in mass production as we speak. Or so I've read on these forums and elsewhere =) No company makes it their business to disappoint their customers... actually, I take that back :eek:
 

DrZoidberg

Member
Jul 10, 2005
171
0
0
I think people are expecting too much from refresh cards. This is not a new generation.

I still don't believe nVidia will bring out a 700MHz, 32-pipeline card. Maybe for G80. Why would they need to clock it so high if it had 32 pipes? They could leave the clock at 550MHz and it would still destroy the 7800GTX 256MB. Or maybe the clock is 700MHz but on a 24-pipe card, which is why they needed such high clocks.

Don't forget, for 2-3 months before the R520 launch the Inquirer kept saying the R520 was a 24-32 pipeline card. The hype was that it was going to kill the 7800 series, kinda like what the Radeon 9800 did to the GeForce FX series. It never turned out that way.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
But I think they're right this time around. It's no secret that the R580 will have 48 pixel shaders - this was mentioned in leaked ATI slides about half a year ago. NV needs a 32-pipe card to stay competitive. These refresh models are definitely not the typical refresh cards we've seen in the past.
 

hemmy

Member
Jun 19, 2005
191
0
0
NVIDIA will increase 7800GTX 512 availability, ATI will launch the R580, then NVIDIA will launch the G71.

Next-gen arrives mid-2006.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
If 3DM05 is indeed vertex setup limited, as some have said, then G71 may show greater gains because it's supposed to a) include more VS units and b) be clocked much higher, whereas R580 is rumored to a) include more VS units but b) be clocked about the same as R520. So we might need to look at other benchmarks to explore each card's strengths and weaknesses.

To relate to dwalton's argument, RV530 (X1600) had an unusually high VS count (relative to other cards in its class, just like X700), which may explain its unusually (again, relatively) high 3DM05 score. Then again, it also packed 12 PS pipes @ 590MHz, so vertex setup might not be the only reason.
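To put rough numbers on that scaling argument, here's a minimal back-of-envelope sketch. In a setup-limited benchmark, vertex throughput should scale roughly with VS units times core clock; the unit counts and clocks below are placeholder assumptions, not confirmed specs for any of these parts.

```python
# Rough sketch: if 3DMark05 is vertex-setup limited, its score should track
# (VS units x core clock). All numbers below are ASSUMED placeholders,
# not confirmed specs.

def vertex_throughput(vs_units, core_mhz):
    """Relative vertex rate, assuming roughly one vertex per VS unit per clock."""
    return vs_units * core_mhz

g70  = vertex_throughput(vs_units=8, core_mhz=430)   # 7800 GTX (assumed)
g71  = vertex_throughput(vs_units=8, core_mhz=700)   # rumored much higher clock
r520 = vertex_throughput(vs_units=8, core_mhz=625)   # X1800 XT (assumed)
r580 = vertex_throughput(vs_units=8, core_mhz=650)   # rumored similar clock to R520

print(f"G71 vs G70:   {g71 / g70:.2f}x vertex rate")   # big, clock-driven gain
print(f"R580 vs R520: {r580 / r520:.2f}x vertex rate") # nearly flat
```

Under those assumptions the G71 gets a large vertex-rate bump purely from clock while the R580 barely moves, which is exactly how a setup-limited benchmark could flatter one card and not the other.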
 

imported_dwalton

Junior Member
Aug 12, 2005
19
0
0
Originally posted by: munky
Originally posted by: dwalton
I don't trust the scores. Everyone has assumed, including me, that the X1900 would be roughly a higher-piped version of the 1600XT (4x the pipes and shaders). The problem is that the 1600XT posts abnormally high 3DMark05 scores. Look at the reviews on the web: a 1600XT will outscore a 6800GS, 6800GT, and X800XL, and come within 200-500 points of a 6800 Ultra and an X850XT in 3DMark05. However, in the real world it can barely outperform a vanilla 6800.

I always assumed that the inflated 3DMark05 scores were the result of the additional shaders per pipe, and that the 580s would show the same phenomenon: a monster 3DMark05 score (~12000-13000), but real-world game scores with a smaller performance delta versus anything else (excluding the 7900, which would be a toss-up).

Interesting post. It does seem likely that if the R580 were to beat the crap out of every other card, it would be in 3DMark and not necessarily in actual games.

Exactly. Based on the 3DMark05 info from Hexus, and on how the R580 is supposedly just an extension of the 530, an 11k 1900 would get smashed by a 13k 7900. It would be the worst drubbing since the 5800 vs. 9700 fiasco.

I highly doubt that ATI would shortchange the 1800 by releasing an underperforming 1900 three months later, unless ATI is trying to commit suicide. However, this is all null and void if the 580 lacks the 530's ability to produce borked 3DMark05 scores.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Matt2
Originally posted by: schtuga
Originally posted by: Matt2
Originally posted by: SickBeast
Something looks borked with their numbers, or else ATi borked their card somehow.

Something looks borked with a bunch of their scores.

I posted this in the other thread too, but why does the 7900 score read "~13,000"???


I thought the reason for the question marks is that it is the only one listed without an MSRP. All the others have the score followed by the price.


I was talking about the "~". Approximately 13,000??

What does that mean? They were dreaming of 13,000? :)

13,000 is the PRICE of the Ultra GTX :p
:Q

in dollars
:shocked:


:D
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: DrZoidberg
I think people are expecting too much from refresh cards. This is not a new generation.

I still don't believe nVidia will bring out a 700MHz, 32-pipeline card. Maybe for G80. Why would they need to clock it so high if it had 32 pipes? They could leave the clock at 550MHz and it would still destroy the 7800GTX 256MB. Or maybe the clock is 700MHz but on a 24-pipe card, which is why they needed such high clocks.

Don't forget, for 2-3 months before the R520 launch the Inquirer kept saying the R520 was a 24-32 pipeline card. The hype was that it was going to kill the 7800 series, kinda like what the Radeon 9800 did to the GeForce FX series. It never turned out that way.

If the G71 turns out to have 32 pipelines, would you still consider it a refresh card? I consider both the R580 and G71 to be totally new "cores" with too many changes to call them "refresh" cards, if they turn out to be what we're speculating. But I don't know the strict definition of "refresh," or whether there even is one, or why I'm taking the time to type this, as it really doesn't matter one way or the other!

Edit- I still think they are not refreshes.:D
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: Rollo
Meh. I want benchmarks, I don't play that 3DMark game.

Agreed. Gaming benchmarks and in-game stress tests on new games will tell the real story. I want performance and features for my money, not theories.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
One thing about why the memory is slower than on the G71 card (or on the rumoured specs of these cards, anyway): I think it's the Ring Bus architecture for the memory, since it handles memory traffic more efficiently and gets more out of a lower clock.

Much like the Intel vs. AMD comparisons a few years ago about work done per clock cycle.
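To put illustrative numbers on that efficiency-versus-raw-clock idea, here's a minimal sketch. Peak bandwidth is just bus width times effective memory clock, and the point is that a controller that delivers a higher fraction of its peak can match a faster-clocked but less efficient one. The bus widths, clocks, and efficiency factors below are made-up assumptions, not real specs for either card.

```python
# Sketch of "efficiency vs. raw clock" for memory bandwidth. All numbers are
# illustrative assumptions, not real specs or measurements.

def delivered_bandwidth_gbs(bus_width_bits, mem_clock_mhz, efficiency):
    """Delivered bandwidth in GB/s, assuming DDR (two transfers per clock)."""
    peak = (bus_width_bits / 8) * (mem_clock_mhz * 2) / 1000  # GB/s at 100% utilization
    return peak * efficiency

# Lower memory clock but better utilization of the bus...
card_a = delivered_bandwidth_gbs(256, 775, efficiency=0.80)
# ...versus a higher clock with poorer utilization.
card_b = delivered_bandwidth_gbs(256, 900, efficiency=0.68)

print(f"card A: {card_a:.1f} GB/s delivered")  # ~39.7 GB/s
print(f"card B: {card_b:.1f} GB/s delivered")  # ~39.2 GB/s
```

Under those made-up numbers the slower-clocked memory ends up delivering about the same usable bandwidth, which is the "more work per clock" comparison being made here.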
 

imported_dwalton

Junior Member
Aug 12, 2005
19
0
0
Originally posted by: the Chase
Originally posted by: DrZoidberg
I think people are expecting too much from refresh cards. This is not a new generation.

I still don't believe nVidia will bring out a 700MHz, 32-pipeline card. Maybe for G80. Why would they need to clock it so high if it had 32 pipes? They could leave the clock at 550MHz and it would still destroy the 7800GTX 256MB. Or maybe the clock is 700MHz but on a 24-pipe card, which is why they needed such high clocks.

Don't forget, for 2-3 months before the R520 launch the Inquirer kept saying the R520 was a 24-32 pipeline card. The hype was that it was going to kill the 7800 series, kinda like what the Radeon 9800 did to the GeForce FX series. It never turned out that way.

If the G71 turns out to have 32 pipelines, would you still consider it a refresh card? I consider both the R580 and G71 to be totally new "cores" with too many changes to call them "refresh" cards, if they turn out to be what we're speculating. But I don't know the strict definition of "refresh," or whether there even is one, or why I'm taking the time to type this, as it really doesn't matter one way or the other!

Edit- I still think they are not refreshes.:D

Why is anyone expecting anything logical from Nvidia and ATI?

Nvidia bucked the trend of refreshing high-end cards during the 6800 generation. Then they released a new board with a faster G70, with more and faster memory, and simply threw "512" at the end of the name, with limited availability. If the 6800 Ultra/GT vs. the vanilla 6800, or the 7800GTX vs. the 7800GT, differ in the number of pipes, core speed, and memory speed, then why would anyone believe that a "7900" designation would simply be a refresh card with a bumped core?

ATI went from the 9700/9800 to the 800 (16-?-?-?)/850 (16-?-?-?) to the 1800 (16-1-1-1)/1900 (16-1-3-1). Not as bad as Nvidia, but not easy to follow unless you keep up with GPU tech.

ATI, Nvidia... logical or rational? They've basically both gone insane trying to one-up each other.
 

DrZoidberg

Member
Jul 10, 2005
171
0
0
No, it is technically a refresh, because ATI isn't calling the R580 an X2800XT and Nvidia isn't calling the G71 an 8800GT. If it were a new core they would follow their naming schemes from the past 5 years.

The R580 will have 48 shaders, but there are only 1-2 shader-limited games today, like FEAR. Most other games won't use all 48 shader units at the same time, therefore I don't believe Nvidia needs to release a 32-pipe, 700MHz core. If it were a 32-pipe, 700MHz core, the price would be $800, which is out of most people's budget.
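A toy model helps show where that argument cuts both ways: a frame ends up limited by whichever stage is the bottleneck, so a 48-shader design only pulls ahead in shader-limited games. All rates and workloads below are made-up illustrative numbers, not specs or measurements of any real card or game.

```python
# Toy model of the 48-shader vs. 32-pipe argument. A frame is limited by
# whichever stage (shader math or texturing) is the bottleneck. All numbers
# are made-up illustrative values, not real specs or measurements.

def frame_time(shader_ops, texture_ops, shader_rate, texture_rate):
    """Time for one frame when the slower of the two stages sets the pace."""
    return max(shader_ops / shader_rate, texture_ops / texture_rate)

# Hypothetical cards: lots of shader ALUs vs. more full pipes.
shader_heavy_card = dict(shader_rate=48 * 650, texture_rate=16 * 650)
pipe_heavy_card   = dict(shader_rate=32 * 700, texture_rate=32 * 700)

# A shader-limited game (lots of per-pixel math) vs. a texture-limited one.
workloads = {
    "shader-limited game":  dict(shader_ops=5_000_000, texture_ops=1_000_000),
    "texture-limited game": dict(shader_ops=1_000_000, texture_ops=4_000_000),
}

for name, work in workloads.items():
    a = frame_time(**work, **shader_heavy_card)
    b = frame_time(**work, **pipe_heavy_card)
    print(f"{name}: shader-heavy card {'wins' if a < b else 'loses'}")
```

Under those assumptions the shader-heavy design wins the shader-limited case and loses the texture-limited one, which is roughly the FEAR-versus-everything-else split being described.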
 

imported_dwalton

Junior Member
Aug 12, 2005
19
0
0
Originally posted by: DrZoidberg
No, it is technically a refresh, because ATI isn't calling the R580 an X2800XT and Nvidia isn't calling the G71 an 8800GT. If it were a new core they would follow their naming schemes from the past 5 years.

The R580 will have 48 shaders, but there are only 1-2 shader-limited games today, like FEAR. Most other games won't use all 48 shader units at the same time, therefore I don't believe Nvidia needs to release a 32-pipe, 700MHz core. If it were a 32-pipe, 700MHz core, the price would be $800, which is out of most people's budget.

Technically? There is nothing technical about this. Nvidia and ATI don't have any international guidelines to follow. Do you consider the 7800GT a refresh of the GTX, or the 6600GT a refresh of the 6800 Ultra? Probably not; those cards were released after their generation's high end. They both differ in the number of pipes, core speed, and memory speed vs. their high-end relatives. If you can go down, there is nothing to say you can't go up.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: Stoneburner
I hope everybody remembers these are refresh products.

Speaking of refreshes, the 9x00 series from ATI lasted about 18 months; the 9800XT was top dog for a year. I thought ATI and Nvidia were going to slow down their refresh cycles... but they are going faster than ever. Anyway, my X1800XL gets about 9000 3DMarks and can probably go higher if I put a Zalman on it. Not impressed, considering that 3DMark05 typically shows the biggest gains.

Assuming the scores are correct.

Forget the Zalman & AC Silencer, put water/phase change on there, lol. I'm gonna wait and see what games & software are around at the time of the "refreshes" to take advantage of them. For me that will involve Oblivion and the alpha & beta tests of Warhammer Online.
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Originally posted by: DrZoidberg
No, it is technically a refresh, because ATI isn't calling the R580 an X2800XT and Nvidia isn't calling the G71 an 8800GT. If it were a new core they would follow their naming schemes from the past 5 years.

The R580 will have 48 shaders, but there are only 1-2 shader-limited games today, like FEAR. Most other games won't use all 48 shader units at the same time, therefore I don't believe Nvidia needs to release a 32-pipe, 700MHz core. If it were a 32-pipe, 700MHz core, the price would be $800, which is out of most people's budget.

Well, I see your point with the naming scheme... But there's also the G70 core and related cards and now the G71 core and related cards, and the R520 core/cards and R580 core/cards. But yeah, these are certainly not entirely new architectures like the unified G80 and R600 will be.
 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Originally posted by: mokert
Will this card need a Beefy PSU?



Probably. What's your definition of 'beefy'? How big a PSU you'll need in the end will depend on your overall system setup.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
Originally posted by: the Chase
If the G71 turns out to have 32 pipelines, would you still consider it a refresh card? I consider both the R580 and G71 to be totally new "cores" with too many changes to call them "refresh" cards, if they turn out to be what we're speculating. But I don't know the strict definition of "refresh," or whether there even is one, or why I'm taking the time to type this, as it really doesn't matter one way or the other!

Edit- I still think they are not refreshes.:D

I would consider it a refresh because the cards do not add major new features compared to the previous lineup. Every new line offers increased performance; whether that comes from an increase in pipelines or in clock speed does not matter. Major new features are the only way you can tell one generation from another. Anything that increases playability (speed) but does not add major new features is a refresh. Using that:
gen 1: Riva 128, 128ZX ---> first commercially successful 3D chips from Nvidia
gen 2: Riva TNT, TNT2 ---> the TNT added 32-bit color, the TNT2 made it playable
gen 3: GeForce, GeForce2 ---> added T&L, the GF2 increased frame rates
gen 4: GeForce3, GeForce4 ---> added programmability to T&L (SM1.x)
gen 5: GeForce FX 5800, FX 5900 ---> the FX added SM2, the 5900 was actually playable
gen 6: GeForce 6800, GeForce 7800, G71 ---> the 6800 added SM3; AFAIK neither the 7800 nor the G71 adds any major new features, they just up the frame rates.

And for ATI (not going all the way back to the early Rage cards because I don't know crap about them):
gen 1: Radeon, Radeon 7500 ---> added T&L (really ATI's first truly competitive generation; I guess the prior Rage was pretty good toward the end of its life, but the MAXX concept never really took off)
gen 2: Radeon 8500 ---> added programmability (SM1) (the 8500 was repackaged as the 9000 and 9200, but that made it slower)
gen 3: Radeon 9700, 9800, X800, X850 ---> SM2
gen 4: Radeon X1800, X1900 ---> SM3
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Is "the Power of 3... times the shader performance" a major new feature?

Sorry, had to say it. And now it need not be said ever again. (Yes, it was that lame.) Carry on.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Can anyone check whether the PureVideo on the 7 series is different from the one on the 6 series?
Gaming phreek said it was PureVideo 2. Is this true?