AT's X1900 review


Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: M0RPH
Originally posted by: keysplayr2003

Tell that to CoD2 where the CPU means next to nada. 16x12 is not obtainable with a single card using all features and High AA and AF in CoD2.

X1900XTX in CoD2 at 1600x1200 HQAF and 4xAA = min 22 max 50 avg 33.6 according to the best playable settings at H.

FYI, those framerates ARE NOT playable in CoD2, and I don't care how insensitive someone is to framerates. You play CoD2 at an avg framerate of 33.6 and you will most certainly die a lot. It is simply not high enough. Settings will have to be reduced to increase the framerate. So, yes, you DO need an XTX or even a GTX even at 1600x1200 in CoD2 (best seller, UBER popular), and you will have to turn down the settings to get it playable there.

You can argue that this is only one game, but it was one of the most-purchased titles in the last quarter of 2005. Even my brother-in-law, who is not much into gaming, owns CoD2.

I disagree. That's playable in my book. I played the demo with similar framerates and had no trouble. I imagine there are lots of people out there that play these types of FPS games at similar framerates and get by just fine. Not everyone likes to spend hundreds of dollars on video cards.

CoD2 is a poorly coded graphics engine. The visuals are not that much better than others that require far less graphics power. When someone goes out and spends 900 bucks on a SLI GTX setup just to play this game, what they're doing is paying for the laziness and incompetence of the game developers. Bloated games like CoD2 and FEAR are the reason Nvidia has managed to convince so many geeks that they need to buy two video cards from them instead of one. It's a great cash cow for Nvidia of course.

According to you "coding experts," every new game is poorly coded. I'm so sick of hearing FEAR is poorly coded, BF2 is poorly coded, CoD2 is poorly coded... They aren't poorly coded; they look great and run acceptably on the HARDWARE THEY WERE DESIGNED TO BE PLAYED ON AT THOSE SETTINGS.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: RobertR1
Originally posted by: beggerking
Originally posted by: MegaWorks
Originally posted by: beggerking
Originally posted by: MegaWorks
Originally posted by: apoppin
Originally posted by: Acanthus
The GTX512 was a limited edition... it will never be sold in mass quantities; G71 will replace it as the top-tier Nvidia card.

that's what "we" said :p

And - NOW - how do you know? The superfast RAM that was [supposedly] holding nVidia back is in good supply. ;)

Maybe the 512-GTX will be nvidia's "value" [highend] card. . . .

:D

Well the GTX 512 is a piece of trash if you want to compare it to this card.


Not really... this card is faster in most benchmarks, but it's not that much faster.
The performance advantage benchmark is an exaggeration... look again, read the real bench, and you will see it's not that much faster than the 7800 512.

This is a better card though.

How much is this card again? $700+

The 7800 512 is ~$550 at Dell now.

This card, the 1900xtx, is ~$600.

The x1900xt/x is a superior card that will only continue to get better and pull away harder over time with newer games. Get over it. How about a dell link showing the 7800GTX 512 in stock at $550?

Heh, I'll remind you to "get over it" when the G71 arrives. Lose the 'tude.

And beggerking, let's see that $550 link. We would have to be able to buy it separately, not as part of a Dell system. And it would, of course, have to be in stock. -Thanks.
 

tvdang7

Platinum Member
Jun 4, 2005
2,242
5
81
Originally posted by: M0RPH
Originally posted by: keysplayr2003

Tell that to CoD2 where the CPU means next to nada. 16x12 is not obtainable with a single card using all features and High AA and AF in CoD2.

X1900XTX in CoD2 at 1600x1200 HQAF and 4xAA = min 22 max 50 avg 33.6 according to the best playable settings at H.

FYI, those framerates ARE NOT playable in CoD2, and I don't care how insensitive someone is to framerates. You play CoD2 at an avg framerate of 33.6 and you will most certainly die a lot. It is simply not high enough. Settings will have to be reduced to increase the framerate. So, yes, you DO need an XTX or even a GTX even at 1600x1200 in CoD2 (best seller, UBER popular), and you will have to turn down the settings to get it playable there.

You can argue that this is only one game, but it was one of the most-purchased titles in the last quarter of 2005. Even my brother-in-law, who is not much into gaming, owns CoD2.

I disagree. That's playable in my book. I played the demo with similar framerates and had no trouble. I imagine there are lots of people out there that play these types of FPS games at similar framerates and get by just fine. Not everyone likes to spend hundreds of dollars on video cards.

CoD2 is a poorly coded graphics engine. The visuals are not that much better than others that require far less graphics power. When someone goes out and spends 900 bucks on a SLI GTX setup just to play this game, what they're doing is paying for the laziness and incompetence of the game developers. Bloated games like CoD2 and FEAR are the reason Nvidia has managed to convince so many geeks that they need to buy two video cards from them instead of one. It's a great cash cow for Nvidia of course.



CoD2 is a factor for me!! And I also believe 33 frames is not playable; I'm thinking 45 at least. My XL right now gets high frames, but occasionally when zoomed in or in close-up firefights I hit 30ish and it lags and slows down. That's at 10x7. 12x10 is unplayable half the time on a high-player-count server.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: keysplayr2003
Originally posted by: RobertR1
Originally posted by: beggerking
Originally posted by: MegaWorks
Originally posted by: beggerking
Originally posted by: MegaWorks
Originally posted by: apoppin
Originally posted by: Acanthus
The GTX512 was a limited edition... it will never be sold in mass quantities; G71 will replace it as the top-tier Nvidia card.

that's what "we" said :p

And - NOW - how do you know? The superfast RAM that was [supposedly] holding nVidia back is in good supply. ;)

Maybe the 512-GTX will be nvidia's "value" [highend] card. . . .

:D

Well the GTX 512 is a piece of trash if you want to compare it to this card.


Not really... this card is faster in most benchmarks, but it's not that much faster.
The performance advantage benchmark is an exaggeration... look again, read the real bench, and you will see it's not that much faster than the 7800 512.

This is a better card though.

How much is this card again? $700+

The 7800 512 is ~$550 at Dell now.

This card, the 1900xtx, is ~$600.

The x1900xt/x is a superior card that will only continue to get better and pull away harder over time with newer games. Get over it. How about a dell link showing the 7800GTX 512 in stock at $550?

Heh, I'll remind you to "get over it" when the G71 arrives. Lose the 'tude.

And beggerking, let's see that $550 link. We would have to be able to buy it separately, not as part of a Dell system. And it would, of course, have to be in stock. -Thanks.


What 'tude? He's been crapping/trolling in every x1900 review thread and got called out. His link is a few posts above, but I'll save you the scrolling:

http://www.fatwallet.com/t/18/572494/

"Ships: 6weeks+"

Pointless link.

 

sisq0kidd

Lifer
Apr 27, 2004
17,043
1
81
Wow, awesome card. For some reason I didn't like the review, once again. I think a little more color diversity in the graphs would make them easier to read :)
 

Kabob

Lifer
Sep 5, 2004
15,248
0
76
So, this is a 16 pixel pipe card? I thought the 7800 GTX was a 24 pipe card. Is this a step backwards or just different technology?
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: M0RPH
Originally posted by: keysplayr2003

Tell that to CoD2 where the CPU means next to nada. 16x12 is not obtainable with a single card using all features and High AA and AF in CoD2.

X1900XTX in CoD2 at 1600x1200 HQAF and 4xAA = min 22 max 50 avg 33.6 according to the best playable settings at H.

FYI, those framerates ARE NOT playable in CoD2, and I don't care how insensitive someone is to framerates. You play CoD2 at an avg framerate of 33.6 and you will most certainly die a lot. It is simply not high enough. Settings will have to be reduced to increase the framerate. So, yes, you DO need an XTX or even a GTX even at 1600x1200 in CoD2 (best seller, UBER popular), and you will have to turn down the settings to get it playable there.

You can argue that this is only one game, but it was one of the most-purchased titles in the last quarter of 2005. Even my brother-in-law, who is not much into gaming, owns CoD2.

I disagree. That's playable in my book. I played the demo with similar framerates and had no trouble. I imagine there are lots of people out there that play these types of FPS games at similar framerates and get by just fine. Not everyone likes to spend hundreds of dollars on video cards.

CoD2 is a poorly coded graphics engine. The visuals are not that much better than others that require far less graphics power. When someone goes out and spends 900 bucks on a SLI GTX setup just to play this game, what they're doing is paying for the laziness and incompetence of the game developers. Bloated games like CoD2 and FEAR are the reason Nvidia has managed to convince so many geeks that they need to buy two video cards from them instead of one. It's a great cash cow for Nvidia of course.

SSHHHHHHHhhhhh......the video card manufacturers and the hard drive manufacturers might catch wind of this and have to silence you.



The most curious results in AT's review are the instances where you only get a 2-3fps increase from Crossfire, and the fact that even though the XTX cards should be throttled to XT speeds in Crossfire, there are instances where XTX Crossfire outperforms XT Crossfire by a healthy margin. This suggests, at least to me, that ATI still needs to properly optimize its drivers. Then there are the instances where a single XT or XTX significantly outperforms the Crossfire configurations. It's not that the results from the cards are bad (for the most part)... I just expect much better from a dual video card solution.

Single card performance seems to be roughly where I'd expect it to be. A decent upgrade (for the most part) from the nearly non-existent 7800 GTX 512MB.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
It seems to me that the x1900 has the best number crunching with its 48 shaders, which the 512 GTX lacks, and that is evident at AA settings. The 512 GTX is a speed beast with its 24 pipes, which makes up some ground. ATI should have upped its pipes. G71 has to increase its shaders to match ATI's, and ATI has to do better with its OpenGL, as it doesn't win all the benchies!
At any rate, with a 12x10-limited LCD, none of these cards are for me!

Morph, still giving "constructive criticisms" where Nvidia is concerned; man, you have a big chip on your shoulder!
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: munky
Originally posted by: beggerking
Originally posted by: Diasper
The X1900XT is a great card which will show its power more in the future as next-gen games come out and as drivers mature (unlike the reviews using very immature 6.2 betas or hacked 5.13s, etc.), since driver maturity has previously been a source of very significant performance improvements.

Otherwise, I wonder if the move to 80nm would let ATI move to 20-24 pipelines, because that would be even more impressive - add faster RAM and a tweaked, higher-clocked core and you'd be getting some very nice numbers.

Agreed. I think ATI really needs to up the pipelines... 16 was yesterday's technology.
I believe even if they upped it to 20 it would perform much better.

What's so hard to understand about the r5xx cards? The term "pipe" has no relevance to this GPU architecture. You have 48 pixel shaders in groups of 12 on 4 quads, and then you have 16 separate texture units in groups of 4, also on 4 separate quads. Any quad of pixel shaders can work with any quad of texture units; there are no pipes. The only changes I expect from the 80nm r590 are higher clocks and possibly GDDR4 mem.

No relevance? Then take away the 16 pipes it does have and tell me how it performs.
You are jumping to the conclusion that "pipes" mean ABSOLUTELY nothing. That might fly with R600 and G80, but this R580 is something in between an R520 and an R600. We are not totally unified yet. The traditional pipe in fact DOES still mean something.

 

Fenixgoon

Lifer
Jun 30, 2003
33,059
12,458
136
Originally posted by: NeezyDeezy
uh as much as I think AT has the best reviews, isn't this one wrong at least on the specs? 48 pixel pipelines???

pixel shaders != pipelines....
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Bunch of whiners in this thread!

Awesome release from ATI! Smoking card! Great Review, except the graphs sucked. Wow, they were horrible.
 

Fenixgoon

Lifer
Jun 30, 2003
33,059
12,458
136
Originally posted by: M0RPH
Originally posted by: keysplayr2003

Tell that to CoD2 where the CPU means next to nada. 16x12 is not obtainable with a single card using all features and High AA and AF in CoD2.

X1900XTX in CoD2 at 1600x1200 HQAF and 4xAA = min 22 max 50 avg 33.6 according to the best playable settings at H.

FYI, those framerates ARE NOT playable in CoD2, and I don't care how insensitive someone is to framerates. You play CoD2 at an avg framerate of 33.6 and you will most certainly die a lot. It is simply not high enough. Settings will have to be reduced to increase the framerate. So, yes, you DO need an XTX or even a GTX even at 1600x1200 in CoD2 (best seller, UBER popular), and you will have to turn down the settings to get it playable there.

You can argue that this is only one game, but it was one of the most-purchased titles in the last quarter of 2005. Even my brother-in-law, who is not much into gaming, owns CoD2.

I disagree. That's playable in my book. I played the demo with similar framerates and had no trouble. I imagine there are lots of people out there that play these types of FPS games at similar framerates and get by just fine. Not everyone likes to spend hundreds of dollars on video cards.

CoD2 is a poorly coded graphics engine. The visuals are not that much better than others that require far less graphics power. When someone goes out and spends 900 bucks on a SLI GTX setup just to play this game, what they're doing is paying for the laziness and incompetence of the game developers. Bloated games like CoD2 and FEAR are the reason Nvidia has managed to convince so many geeks that they need to buy two video cards from them instead of one. It's a great cash cow for Nvidia of course.

I run 1280x1024 with everything up in CoD2 (no FSAA/AF) and it's quite playable with minor exceptions. I have a Sempron64 2800+ (1.6GHz) OCed to 2.0GHz and an X800 Pro. The 1900 should smoke my card!
 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Originally posted by: ArchAngel777
Bunch of whiners in this thread!

Awesome release from ATI! Smoking card! Great Review, except the graphs sucked. Wow, they were horrible.

Yeah! I had eye pain looking at these graphs.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: M0RPH
Originally posted by: keysplayr2003

Tell that to CoD2 where the CPU means next to nada. 16x12 is not obtainable with a single card using all features and High AA and AF in CoD2.

X1900XTX in CoD2 at 1600x1200 HQAF and 4xAA = min 22 max 50 avg 33.6 according to the best playable settings at H.

FYI, those framerates ARE NOT playable in CoD2, and I don't care how insensitive someone is to framerates. You play CoD2 at an avg framerate of 33.6 and you will most certainly die a lot. It is simply not high enough. Settings will have to be reduced to increase the framerate. So, yes, you DO need an XTX or even a GTX even at 1600x1200 in CoD2 (best seller, UBER popular), and you will have to turn down the settings to get it playable there.

You can argue that this is only one game, but it was one of the most-purchased titles in the last quarter of 2005. Even my brother-in-law, who is not much into gaming, owns CoD2.

I disagree. That's playable in my book. I played the demo with similar framerates and had no trouble. I imagine there are lots of people out there that play these types of FPS games at similar framerates and get by just fine. Not everyone likes to spend hundreds of dollars on video cards.

CoD2 is a poorly coded graphics engine. The visuals are not that much better than others that require far less graphics power. When someone goes out and spends 900 bucks on a SLI GTX setup just to play this game, what they're doing is paying for the laziness and incompetence of the game developers. Bloated games like CoD2 and FEAR are the reason Nvidia has managed to convince so many geeks that they need to buy two video cards from them instead of one. It's a great cash cow for Nvidia of course.

Did you play the single-player CoD2 demo or the multiplayer CoD2 demo? There is no comparison when you switch to multiplayer. Framerate is everything if you plan on ever winning a round. Anything less than a 45fps minimum (I agree with what the other guy said) and you can just shut the game off, because it's too frustrating.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: keysplayr2003
Originally posted by: munky
Originally posted by: beggerking
Originally posted by: Diasper
The X1900XT is a great card which will show its power more in the future as next-gen games come out and as drivers mature (unlike the reviews using very immature 6.2 betas or hacked 5.13s, etc.), since driver maturity has previously been a source of very significant performance improvements.

Otherwise, I wonder if the move to 80nm would let ATI move to 20-24 pipelines, because that would be even more impressive - add faster RAM and a tweaked, higher-clocked core and you'd be getting some very nice numbers.

Agreed. I think ATI really needs to up the pipelines... 16 was yesterday's technology.
I believe even if they upped it to 20 it would perform much better.

What's so hard to understand about the r5xx cards? The term "pipe" has no relevance to this GPU architecture. You have 48 pixel shaders in groups of 12 on 4 quads, and then you have 16 separate texture units in groups of 4, also on 4 separate quads. Any quad of pixel shaders can work with any quad of texture units; there are no pipes. The only changes I expect from the 80nm r590 are higher clocks and possibly GDDR4 mem.

No relevance? Then take away the 16 pipes it does have and tell me how it performs.
What 16 pipes? The pixel shaders? The TMUs? The ROPs? Which part of the traditional pipeline, now divided among 3 separate arrays of units that communicate mainly through the central dispatcher and the register array, should I take away?
You are jumping to the conclusion that "pipes" mean ABSOLUTELY nothing. That might fly with R600 and G80, but this R580 is something in between an R520 and an R600. We are not totally unified yet. The traditional pipe in fact DOES still mean something.

The traditional pipeline only means something on the 7800 series and older cards, which have a pixel shader and one or more texture units in each pipe, both working on the same data. In a traditional pipe you can't have the PS from one pipe sending data to or reading data from a TMU in another pipe, which is the whole reason they are divided into pipes. On the r5xx cards the PS are not tied to any particular TMU; when a PS is finished with its processing it stores the result in registers, and the central dispatcher can pick ANY available TMU quad to do the texturing on the stored fragment in the registers while the PS starts working on a completely new thread. This has nothing to do with unified shaders.
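To put that description in more concrete terms, here is a tiny Python sketch of the decoupled layout described above: four quads of 12 pixel shaders, four quads of 4 texture units, and a central dispatcher that can hand a finished shader batch to any free texture quad. It is a toy scheduling illustration with made-up names, not ATI's actual hardware or timing.

# Toy model of the decoupled R580 layout described above: 4 quads of 12 pixel
# shaders, 4 quads of 4 texture units, and a central dispatcher that hands a
# finished shader batch to ANY free texture quad. Illustration only.
from collections import deque

SHADER_QUADS = 4   # 4 x 12 = 48 pixel shader ALUs
TEXTURE_QUADS = 4  # 4 x 4  = 16 texture units

def dispatch(batches):
    """Pair shaded batches with whichever texture quad frees up first."""
    free_tmus = deque(range(TEXTURE_QUADS))
    schedule = []
    for batch_id in batches:
        shader_quad = batch_id % SHADER_QUADS      # round-robin shading
        tmu_quad = free_tmus.popleft()             # any free TMU quad will do
        schedule.append((batch_id, shader_quad, tmu_quad))
        free_tmus.append(tmu_quad)                 # TMU quad becomes free again
    return schedule

if __name__ == "__main__":
    for batch, sq, tq in dispatch(range(8)):
        print(f"batch {batch}: shaded on quad {sq}, textured on quad {tq}")

The point of the sketch is simply that no shader quad "owns" a texture quad, which is why counting "pipes" stops being meaningful on this design.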
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: keysplayr2003
Originally posted by: M0RPH
Originally posted by: keysplayr2003

Tell that to CoD2 where the CPU means next to nada. 16x12 is not obtainable with a single card using all features and High AA and AF in CoD2.

X1900XTX in CoD2 at 1600x1200 HQAF and 4xAA = min 22 max 50 avg 33.6 according to the best playable settings at H.

FYI, those framerates ARE NOT playable in CoD2, and I don't care how insensitive someone is to framerates. You play CoD2 at an avg framerate of 33.6 and you will most certainly die a lot. It is simply not high enough. Settings will have to be reduced to increase the framerate. So, yes, you DO need an XTX or even a GTX even at 1600x1200 in CoD2 (best seller, UBER popular), and you will have to turn down the settings to get it playable there.

You can argue that this is only one game, but it was one of the most-purchased titles in the last quarter of 2005. Even my brother-in-law, who is not much into gaming, owns CoD2.

I disagree. That's playable in my book. I played the demo with similar framerates and had no trouble. I imagine there are lots of people out there that play these types of FPS games at similar framerates and get by just fine. Not everyone likes to spend hundreds of dollars on video cards.

CoD2 is a poorly coded graphics engine. The visuals are not that much better than others that require far less graphics power. When someone goes out and spends 900 bucks on a SLI GTX setup just to play this game, what they're doing is paying for the laziness and incompetence of the game developers. Bloated games like CoD2 and FEAR are the reason Nvidia has managed to convince so many geeks that they need to buy two video cards from them instead of one. It's a great cash cow for Nvidia of course.

Did you play the single-player CoD2 demo or the multiplayer CoD2 demo? There is no comparison when you switch to multiplayer. Framerate is everything if you plan on ever winning a round. Anything less than a 45fps minimum (I agree with what the other guy said) and you can just shut the game off, because it's too frustrating.



I agree with this as well. I run with no AA/AF to keep it in the high 40s; otherwise I get wtfpwnd.
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: munky
Originally posted by: beggerking
Originally posted by: Diasper
The X1900XT is a great card which will show its power more in the future as next-gen games come out and as drivers mature (unlike the reviews using very immature 6.2 betas or hacked 5.13s, etc.), since driver maturity has previously been a source of very significant performance improvements.

Otherwise, I wonder if the move to 80nm would let ATI move to 20-24 pipelines, because that would be even more impressive - add faster RAM and a tweaked, higher-clocked core and you'd be getting some very nice numbers.

Agreed. I think ATI really needs to up the pipelines... 16 was yesterday's technology.
I believe even if they upped it to 20 it would perform much better.

What's so hard to understand about the r5xx cards? The term "pipe" has no relevance to this GPU architecture. You have 48 pixel shaders in groups of 12 on 4 quads, and then you have 16 separate texture units in groups of 4, also on 4 separate quads. Any quad of pixel shaders can work with any quad of texture units; there are no pipes. The only changes I expect from the 80nm r590 are higher clocks and possibly GDDR4 mem.

No relevance? Then take away the 16 pipes it does have and tell me how it performs.
What 16 pipes? The pixel shaders? The TMUs? The ROPs? Which part of the traditional pipeline, now divided among 3 separate arrays of units that communicate mainly through the central dispatcher and the register array, should I take away?
You are jumping to the conclusion that "pipes" mean ABSOLUTELY nothing. That might fly with R600 and G80, but this R580 is something in between an R520 and an R600. We are not totally unified yet. The traditional pipe in fact DOES still mean something.

The traditional pipeline only means something on the 7800 series and older cards, which have a pixel shader and one or more texture units in each pipe, both working on the same data. In a traditional pipe you can't have the PS from one pipe sending data to or reading data from a TMU in another pipe, which is the whole reason they are divided into pipes. On the r5xx cards the PS are not tied to any particular TMU; when a PS is finished with its processing it stores the result in registers, and the central dispatcher can pick ANY available TMU quad to do the texturing on the stored fragment in the registers while the PS starts working on a completely new thread. This has nothing to do with unified shaders.

I have seen you go over and over this point. People just don't seem to get it. I think what they are mumbling on about is that it still has only 16 texture units and would need more.
 

Conky

Lifer
May 9, 2001
10,709
0
0
As usual, another fine AT review. And what a great new card!

I love the back-and-forth battle between ATI and Nvidia because we all win in the end. The $749 7800GTX 512MB just died a horrifically violent, and at that price well-deserved, death. :laugh:

I can't wait to see Nvidia's answer, because they can't allow ATI to have the crown for too long. Competition rules! :thumbsup:
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Steelski
I have seen you go over and over this point. People just don't seem to get it. I think what they are mumbling on about is that it still has only 16 texture units and would need more.

That's about the only drawback I can see to these cards (well, other than the sticker price :p); the 3:1 shader:texture/ROP ratio seems to be too high for a lot of games. I suspect the shader units are just sitting idle a lot of the time in games like CoD2, and even more so in older games that don't use DX9 extensively or at all.

The exception to this -- and it may be a big one -- is FEAR. If more upcoming games go towards uber-heavy shader loads over lots of high-res multitexturing, R580 will keep pulling further ahead of G70, and even 32 shaders in G71 may not be enough to catch up unless the clocks are very high.
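As a rough illustration of that 3:1 point, here is a back-of-envelope Python calculation using the commonly quoted launch clocks and unit counts (approximate, and ignoring ALU width, co-issue, efficiency, and memory bandwidth entirely). It only shows where each chip's raw per-clock capacity is weighted, not real game performance.

# Crude units*clock comparison of where R580 and G70 spend their capacity.
# Clocks and unit counts are the commonly quoted launch figures (approximate).
cards = {
    "X1900 XTX (R580)":   {"clock_mhz": 650, "pixel_shaders": 48, "tmus": 16},
    "7800 GTX 512 (G70)": {"clock_mhz": 550, "pixel_shaders": 24, "tmus": 24},
}

for name, c in cards.items():
    shader_rate = c["pixel_shaders"] * c["clock_mhz"] / 1000.0  # shader units x GHz
    texel_rate = c["tmus"] * c["clock_mhz"] / 1000.0            # Gtexels/s (peak)
    print(f"{name}: ~{shader_rate:.1f} shader units*GHz, "
          f"~{texel_rate:.1f} Gtexels/s, ratio {shader_rate/texel_rate:.1f}:1")

Run it and the R580 comes out roughly 3:1 shader-heavy while the G70 is 1:1, which is exactly why texture-bound games don't show the X1900's full advantage.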
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: RobertR1
Originally posted by: beggerking
Originally posted by: RobertR1


The x1900xt/x is a superior card that will only continue to get better and pull away harder over time with newer games. Get over it. How about a dell link showing the 7800GTX 512 in stock at $550?


Text

"Usually Ships: 6+ Weeks"

Enjoy your card!


Perhaps you missed the point. I quoted the pricing for the 7800GTX 512 because you said it would be a piece of trash at $700.

My point is, even though the 1900xtx beats it, it is still not trash, because the price will come down.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: RobertR1
Originally posted by: keysplayr2003
Originally posted by: RobertR1
Originally posted by: beggerking
Originally posted by: MegaWorks
Originally posted by: beggerking
Originally posted by: MegaWorks
Originally posted by: apoppin
Originally posted by: Acanthus
The GTX512 was a limited edition... it will never be sold in mass quantities; G71 will replace it as the top-tier Nvidia card.

that's what "we" said :p

And - NOW - how do you know? The superfast RAM that was [supposedly] holding nVidia back is in good supply. ;)

Maybe the 512-GTX will be nvidia's "value" [highend] card. . . .

:D

Well the GTX 512 is a piece of trash if you want to compare it to this card.


Not really... this card is faster in most benchmarks, but it's not that much faster.
The performance advantage benchmark is an exaggeration... look again, read the real bench, and you will see it's not that much faster than the 7800 512.

This is a better card though.

How much is this card again? $700+

The 7800 512 is ~$550 at Dell now.

This card, the 1900xtx, is ~$600.

The x1900xt/x is a superior card that will only continue to get better and pull away harder over time with newer games. Get over it. How about a dell link showing the 7800GTX 512 in stock at $550?

Heh, I'll remind you to "get over it" when the G71 arrives. Lose the 'tude.

And beggerking, let's see that $550 link. We would have to be able to buy it separately, not as part of a Dell system. And it would, of course, have to be in stock. -Thanks.


What 'tude? He's been crapping/trolling in every x1900 review thread and got called out. His link is a few posts above, but I'll save you the scrolling:

http://www.fatwallet.com/t/18/572494/

"Ships: 6weeks+"

Pointless link.

It's more like YOU are the one crapping/trolling in every 7800GTX review thread.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: munky
Originally posted by: beggerking
Originally posted by: Diasper
The X1900XT is a great card which will show its power more in the future as next-gen games come out and as drivers mature (unlike the reviews using very immature 6.2 betas or hacked 5.13s, etc.), since driver maturity has previously been a source of very significant performance improvements.

Otherwise, I wonder if the move to 80nm would let ATI move to 20-24 pipelines, because that would be even more impressive - add faster RAM and a tweaked, higher-clocked core and you'd be getting some very nice numbers.

Agreed. I think ATI really needs to up the pipelines... 16 was yesterday's technology.
I believe even if they upped it to 20 it would perform much better.

What's so hard to understand about the r5xx cards? The term "pipe" has no relevance to this GPU architecture. You have 48 pixel shaders in groups of 12 on 4 quads, and then you have 16 separate texture units in groups of 4, also on 4 separate quads. Any quad of pixel shaders can work with any quad of texture units; there are no pipes. The only changes I expect from the 80nm r590 are higher clocks and possibly GDDR4 mem.

Let me reword: wouldn't it be faster if it had 20 or more texture units, then?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: beggerking
let me reword. won't it be faster if it had 20 or more texture units then?

Well, yes, but it would also have more transistors, which would drive up costs and heat output. And you eventually reach a point where more texturing won't help you either, and then you need more shaders...

R580 is built to play shader-heavy games (at which it should excel), and ATI is banking on that being the way developers go in the next year or two.

If you believe the rumors for G71 (32 ROPs/32 TMUs/32 shaders), it's going to have less pixel/vertex shading capability than R580, but more texturing capability and more ROPs. It would be better (on paper) for games with lighter shader loads and more complex geometry/texturing.
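Extending the same crude units-times-clock arithmetic to the rumored G71 configuration shows why the clocks matter so much. The 700MHz figure below is purely a placeholder assumption, not a known spec.

# Same rough arithmetic applied to the rumored G71 (32 ROPs/32 TMUs/32 shaders).
# The 700MHz clock is a hypothetical placeholder, not a known figure.
r580_shader = 48 * 650   # shader units * MHz
r580_texel  = 16 * 650   # texture units * MHz

g71_clock = 700          # assumed, purely for illustration
g71_shader = 32 * g71_clock
g71_texel  = 32 * g71_clock

print(f"R580 : shader {r580_shader/1000:.1f}, texel {r580_texel/1000:.1f}")
print(f"G71? : shader {g71_shader/1000:.1f}, texel {g71_texel/1000:.1f}")
print(f"G71 clock needed to match R580 shader capacity: {48 * 650 / 32:.0f} MHz")

On those assumptions the rumored G71 leads comfortably on texturing but would need roughly 975MHz to match R580's raw shader capacity, which is the "unless the clocks are very high" caveat above.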