So...what positives can we take from the HD2900XT launch?


Fenixgoon

Lifer
Jun 30, 2003
33,286
12,849
136
Based on all the benchmarks I've seen, R600 looks like it has LOTS of potential, but waiting at minimum for driver releases would be the wisest course of action, assuming you're dying to purchase one. I'm pinning my hopes on the 2600XT, because all my options are going to be in the midrange (sadly, ATI's 600-series midrange lines have always been terrible).
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Some DX10 benches here:
http://www.computerbase.de/artikel/hard..._xt/31/#abschnitt_call_of_juarez_d3d10
Doesn't seem to be any different from DX9 performance scaling.

Overall, the performance of R600 is too inconsistent for my liking. Sometimes it's close to the GTX, sometimes it's the same as a GTS, or slower. And the performance hit from AA is too high. It seems like ATI focused on technology and features but made too many compromises in performance.
 

newb54

Senior member
Dec 25, 2003
216
0
0
Originally posted by: v8envy
Other reasons the HD2900 is not suitable for many people: the noise/heat means it will NOT be useful for homebrew DVRs. Many of those are based on SFF cases. The added noise and heat of a HD2900 and new monster power supply makes the integrated HD audio playback features moot.

The statement that 'nothing in the $399 price bracket can touch it' is patently false. The 640MB 8800GTS, overclocked or not, trades blows with the HD2900 on a title-by-title basis. Today's pricing is $330 AR vs $409. Those of you living in the future may find the HD2900 a better value, but those of us here today are in a completely different boat.

Lastly, there is Linux support, or the complete lack of it. Only a factor for a minority of people (myself included), but it is a factor.

I'm glad this seems like the perfect solution to you, apop. I can't understand why, but hey -- it takes all kinds. The HD2900 would have to be about 10% less than a 640M 8800GTS for me to consider it at all.

Well said, v8envy; I was about to post the same thing. LOL at "nothing can touch it in the sub-$400 bracket." That's obviously not the case, as the 8800 GTS 640MB has pretty similar performance (slightly better or worse depending on your view) for $70 less. I hope the HD2900XT does receive a boost through more mature drivers and lower prices, but until then the 8800 GTS 640MB is a much better value.
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
I am honestly very disappointed with the 2900. It will probably best the GTS with better drivers, but come on!!! Six months and this is all they could pull off? I really hope their midrange competes better...I'll probably build an HTPC with a 2600 sometime later. I'm glad that all the cards have HDMI. However, I have to wonder if there will even BE an R700.

For anyone more technically minded than me: is there any specific hardware deficiency that is causing the poor performance? The AT article mentioned that they might be using shaders to do the AA work, rather than dedicated hardware.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: thilan29
I am honestly very disappointed with the 2900. It will probably best the GTS with better drivers, but come on!!! Six months and this is all they could pull off? I really hope their midrange competes better...I'll probably build an HTPC with a 2600 sometime later. I'm glad that all the cards have HDMI. However, I have to wonder if there will even BE an R700.

For anyone more technically minded than me: is there any specific hardware deficiency that is causing the poor performance? The AT article mentioned that they might be using shaders to do the AA work, rather than dedicated hardware.

Yes, several reviews I've read mention that most of the AA workload on R600 is done in the shaders. While this makes the AA truly programmable, the performance hit from enabling it is bigger.
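For what it's worth, "AA in the shaders" just means the resolve filter is a program rather than fixed-function hardware. A rough illustrative sketch of the idea (plain Python, with a hypothetical `resolve_pixel` helper; not actual R600 or driver code):

```python
# Illustrative sketch only: a shader-based AA resolve computes each
# output pixel with a programmable filter over its sub-samples,
# instead of using fixed-function resolve hardware in the ROPs.

def resolve_pixel(subsamples, weights=None):
    """Combine one pixel's AA sub-samples into a final color value."""
    if weights is None:
        # plain box filter: the standard MSAA resolve
        weights = [1.0 / len(subsamples)] * len(subsamples)
    return sum(s * w for s, w in zip(subsamples, weights))

# 4x MSAA pixel, one color channel for simplicity
samples = [0.2, 0.4, 0.4, 0.8]
box = resolve_pixel(samples)                         # box filter -> ~0.45
wide = resolve_pixel(samples, [0.1, 0.4, 0.4, 0.1])  # custom (CFAA-style) weighting
```

The upside is exactly what the reviews describe: any filter you like (e.g. CFAA's wider tent filters). The downside is that every resolved pixel now costs shader ALU time that a dedicated resolve unit would have handled for free.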
 

palindrome

Senior member
Jan 11, 2006
942
1
81
Originally posted by: BFG10K
Positives? CFAA is a lot better than most claim. It's smarter and better than Quincunx.

Having said that I'll likely avoid the card because of the high noise levels.

Sapphire Toxic is right up your alley then.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
There aren't many. If you're into HTPC, then the audio cable will help somewhat, but that's negated by the noise. They seem to overclock really well, if you're into that. Which I am. TF2 and EP2 free is nice, which makes the card $40 cheaper right off the bat for me.

And then the very limited DX10 benches look good. And there is no doubt in my mind that the card will get much faster with newer drivers. That, and NV's cards will drop in price with some competition.

If I were buying a card today, it wouldn't be an HD 2900XT. A few months from now? Things may change with DX10 games and new drivers. I'll probably pass altogether and wait for a refresh of both cards. DX10 games will be out, and drivers for each will be better. NV's drivers apparently are not very good in Vista, and ATI's are not mature enough for performance. I'm going to deploy for about a month on June 9th and be back early July, so there really is no point in buying a card now; they will be cheaper when I get back.
 

palindrome

Senior member
Jan 11, 2006
942
1
81
Originally posted by: swtethan
Originally posted by: apoppin
Originally posted by: swtethan
Originally posted by: apoppin
Originally posted by: swtethan
Originally posted by: apoppin
Originally posted by: AVP
I mean i know the scores are scattered but you are suggesting a pretty big leap in performance drayvn, I myself have thought and said somewhere that I am sure that these scores will get much better and even out more with maturity but I doubt this card will ever approach gtx speeds on the whole...

that will *never* happen

the HDXT does not compete with the GTX .. that would be a Sub$400 card vs a $600 one
--- an UNfair comparison

the *only* place the HDXT *may* "approach" GTX performance is in DX10 games

maybe .. we just don't know now

on the OTHER hand ... it DOES compete with the GTS - now - and will likely *surpass it* by an ever-widening margin as drivers mature


price/performance wise......




i'd rather save $70 for same performance, also there already had been a dx10 bench out


http://www.guru3d.com/article/Videocards/431/17/

lets try again with the MSRPs

gts= GeForce 8800GTS 640MB MSRP: $439.99
gtx= GeForce 8800GTX MSRP: $619.99
HDxt=$399

do you "get it" now? :p


What you don't get is...... how much you can buy them for NOW. My pricing chart is real-world pricing. MSRP is nowhere near real-world pricing at the moment; even on newegg the xt is $409
it came out TODAY .. no market forces are acting on it ... your GTX has been out for SIX months .. and you could NOT find one for even MSRP on 8800 launch day :p

and i am not buying one *now*

and check back tomorrow or next week when there will likely be HD-2900xt for sub-GTS prices

--the 640MB version - its real competitor .. not the 320MB "discounted" version


$329 IS the 640 version...... owned


http://www.newegg.com/Product/Product.aspx?Item=N82E16814133188

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130071



LOL

anything more you'd like to add?

I do not like the fact that you would need to buy a new power supply to overclock, either; my PSU already has an 8-pin taken up by my motherboard.

CPU 8-pin and PCI-E 8-pin are keyed differently. It's also likely that the 8900s will need an 8-pin as well.
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: palindrome
CPU 8-pin and PCI-E 8-pin are keyed differently. It's also likely that the 8900s will need an 8-pin as well.


actually in the review pics, I saw the 8-pin cord say "CPU ONLY"

so is it really different?
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: palindrome
CPU 8-pin and PCI-E 8-pin are keyed differently. It's also likely that the 8900s will need an 8-pin as well.

No, I highly doubt the 8900s will need an 8-pin. It's possible they might have it there, but they will not need it. The 8800 Ultra doesn't need it, and the 8900s will use less power due to an 80nm fabrication process.

I think that, with the HD 2900XT, GPU power consumption has reached its peak. From here we should see LESS power being used; the R600 will be refreshed on 65nm, and that will have much lower power usage.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Extelleron
No, I highly doubt the 8900s will need an 8-pin. The 8800 Ultra doesn't need it, and the 8900s will use less power due to an 80nm fabrication process.

I think that, with the HD 2900XT, GPU power consumption has reached its peak. From here we should see LESS power being used; the R600 will be refreshed on 65nm, and that will have much lower power usage.

That is, unless the prayers of ATI fans are answered by a miracle driver. Otherwise they're going to have to shrink it and boost the clocks an insane amount to compete with NVIDIA's next high-end card. Depending on how much they have to boost the clocks, that could negate a lot of the power savings.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Matt2
That is, unless the prayers of ATI fans are answered by a miracle driver. Otherwise they're going to have to shrink it and boost the clocks an insane amount to compete with NVIDIA's next high-end card. Depending on how much they have to boost the clocks, that could negate a lot of the power savings.

There's no need for a "miracle driver." The 8800 series cards have seen huge improvements over original launch performance, and the HD 2900XT drivers, from a performance standpoint at least, are just not as well developed yet.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Matt2

That is unless the prayers of ATI fans is not answered by a miracle driver. In which case they're going to have to shrink it and boost the clocks an insane amount to compete with nvidia's next high end card. Depending on how much they have to boost the clocks, it could negate a lot of the power savings.

You don't have to be an ATI fan. You'd have to be an idiot not to want better performance from the 2900XT, no matter who your loyalty is to.

As Extelleron pointed out, the 8800 cards got faster with newer drivers as well. It always happens with new technology. Everyone expects the 2900XT to get faster; it's just a question of how much, and how soon.

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Extelleron
There's no need for a "miracle driver." The 8800 series cards have seen huge improvements over original launch performance, and the HD 2900XT drivers, from a performance standpoint at least, are just not as well developed yet.

Do you have a link to some benchmarks that show these "huge" improvements?

I'm not talking about stability; I want to see exactly how big the performance increases were.
 

sandorski

No Lifer
Oct 10, 1999
70,783
6,341
126
Originally posted by: yacoub
AMD positively doesn't know how to make what the people want. They took all the time in the world and released a part that saps more power, makes more noise, and gives less FPS in the games people play. For that, they can lose more marketshare until they learn how to do it right.

How a card can have so many of the right hardware ingredients and lose to older products is beyond me. I guess I could ask NVidia about that too with their 8600 series having the same problem. ;)

Really, no one has a reason to celebrate this. I'm certainly more disappointed than anything. I wanted this card to be good, so that I could either buy it or see the GTS go down in price.

In AMD's defense: They really had little to do with the design behind this GPU.

In ATI's defense: This GPU just might pwn in the DX10 Realworld.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Matt2
Originally posted by: Extelleron
There's no need for a "miracle driver." The 8800 series cards have seen huge improvements over original launch performance, and the HD 2900XT drivers, from a performance standpoint at least, are just not as well developed yet.

Do you have a link to some benchmarks that show these "huge" improvements?

I'm not talking stability, I want to see exactly how big the performance increases were.

There are several reviews that comment on it. I don't recall which ones they were, though... here is one:

The latest 8.37.4.2_47323 drivers are supposed to implement a new intelligent algorithm that increases FPS while applying similar image quality when running Adaptive Anti-Aliasing. In Oblivion, performance several times faster than previous drivers using the new adaptive AA algorithm was claimed to have been achieved. New optimizations for HDR applications in general resulted in a 5-30% increase in performance.

http://www.vr-zone.com/print.php?i=4946

I don't feel like looking for others. One I remember ran benches with two drivers; the newest one showed a large difference, much better frames. It's pretty silly to think that the 2900XT won't get faster with newer drivers.

Edit: another increase, in Stalker: http://forum.beyond3d.com/showpost.php?p=985906&postcount=5437
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: sandorski
In ATI's defense: This GPU just might pwn in the DX10 Realworld.


you obviously have not seen the call of juarez dx10 benchmark
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
I agree with sandorski that, in AMD's defense, this is ATI's legacy, but I believe AMD liked what ATI was doing with the R600 and supported it. It's not like they were going to cancel it after 3 years of development, anyway; they'd strap the thing to an A/C unit before scrapping it.

The best thing I can say about its launch is that it's out. AMD finally released something new. NEW!!! NEW!!! To me that's exciting. You can't make a profit with an old product when the competitor already has something newer and stronger out. It just won't happen. Unless you're Intel and can sell a 3.4GHz EE Prescott to a tribal leader in the Congo. Though at least Intel branded the Prescott as something newer compared to the Northwood.

Even if you don't win marketshare with a 2nd-place performer, at least you can sell something new to the customer base you have. That's what is really killing AMD right now. Add the price war into the mix, then the added capital expenditure, and you've got yourself a ticking time bomb.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Matt2
Originally posted by: Extelleron
There's no need for a "miracle driver." The 8800 series cards have seen huge improvements over original launch performance, and the HD 2900XT drivers, from a performance standpoint at least, are just not as well developed yet.

Do you have a link to some benchmarks that show these "huge" improvements?

I'm not talking stability, I want to see exactly how big the performance increases were.

buy a GTX and load up the drivers that came with the card, and you tell us :p
--the release drivers at launch were beta ...


then you can sell that PoS XTX .. and have a *real* card in your rig
... not a weak midrange product by a 2nd rate company

and you can really say how you feel about the HD-XT





 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
http://www.rage3d.com/board/showthread.php?t=33890940

Once again, it's pretty clear drivers helped out. And future drivers will help out even more, just as they do for all cards. I'm sure NV will squeeze some more performance out as well. The 2900XT takes a huge hit on frames with AF, much more than NV's cards or the X1900XT. Obviously something is pretty wrong.

SS2 1280x1024 4xAAA/16xAF, from 20.9 to 31.5.
Oblivion 1280x1024 4xAAA/16xAF, from 15.3 to 41.6.
SC3 1280x1024 4xAAA/16xAF, from 91.0 to 129.4.
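Worked out as percentages, those jumps are roughly +51%, +172%, and +42%. A quick sanity-check script (the FPS pairs are just the numbers quoted above):

```python
# Percent FPS gain between the two driver revisions, using the
# (old, new) numbers from the Rage3D thread quoted above.
gains = {
    "SS2": (20.9, 31.5),
    "Oblivion": (15.3, 41.6),
    "SC3": (91.0, 129.4),
}

for game, (old, new) in gains.items():
    pct = (new / old - 1) * 100
    print(f"{game}: {old} -> {new} fps (+{pct:.0f}%)")
```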

Once again, I don't think it will catch the 8800GTX, but it should get much closer. I guess we'll know in a few months; as of now, I wouldn't get a 2900XT. A few months down the road, when the price drops and performance hopefully picks up? We'll see.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: swtethan
actually in the review pics, I saw the 8-pin cord say "CPU ONLY"

so is it really different?

i guess i have 3 x 8 pin connectors ... :p

... 2 - CPU, 1 - 'main' ... which i believe can be interchanged

and then 2 - 6 pin PCIe

no worry
... beef curry
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: apoppin

i guess i have 3 x 8 pin connectors ... :p

... 2 - CPU, 1 - 'main' ... which i believe can be interchanged

and then 2 - 6 pin PCIe

no worry
... beef curry

so.... I only see 1x 8-pin on your PSU, or am I wrong? Usually the 8-pin CPU power is like a 20+4-pin motherboard plug: it's attached together and kind of hangs off the side if you don't use it.

http://www.ocztechnology.com/products/p...xstream_power_supply-nvidia_sli_ready_

You cannot support a factory-OC'ed 2900XT (if that IS your power supply). Since you don't overclock, you can still use your 2x PCI-E connectors to power the card. SO............ OC'ed GTS beats XT, yes? Or am I wrong?


 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Lots of nice stuff, but too loud and too much power. The mid range should give us an idea of whats next.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: swtethan

so.... I only see 1x8 pin on your psu, or am I wrong? Usually 8 pin cpu power is like a 20+4 pin motherboard plug, its attached together and kinda hangs off the side if you dont use it.

http://www.ocztechnology.com/products/p...xstream_power_supply-nvidia_sli_ready_

You cannot support a factory OC'ed 2900XT (if that IS your power supply). Since you dont overclock, you can still use your 2x pci-e connectors to power the card. SO............ OC'ed GTS = beats XT yes? or am I wrong?



what are you trying to show me? .. i know my PS ...

letsee ... from the manual ...

2 Channel PCIe connector
Dual CPU support ... 2 x 8-pin 12v

of course the 4 + 4 pin

and - i think - 1 more 12v connector ...

at the very least why is it again i cannot use one of the 2 CPU 8 pin plugs?

at ANY rate, i like to OC my GPUs only for testing ... i would "miss" that

... but i have never run with a permanently OC'd GPU ... my CPUs are what i like to keep OC'd as they usually are cheap as hell compared to my GPU ... look in sig ... the $110 e4300 is well-oc'd BUT my sapphire was tested and then returned immediately to stock speeds.

 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: apoppin

what are you trying to show me? .. i know my PS ...

letsee ... from the manual ...

2 Channel PCIe connector
Dual CPU support ... 2 x 8-pin 12v

of course the 4 + 4 pin

and - i think - 1 more 12v connector ...

at the very least why is it again i cannot use one of the 2 CPU 8 pin plugs?

at ANY rate, i like to OC my GPUs only for testing ... i would "miss" that

... but i have never run with a permanently OC'd GPU ... my CPUs are what i like to keep OC'd as they usually are cheap as hell compared to my GPU ... look in sig ... the $110 e4300 is well-oc'd BUT my sapphire was tested and then returned immediately to stock speeds.



on their own website:

1 x 4-pin/8-pin CPU
(supports double CPUs/supplies stable voltage)
2 x PCI-E**
**600W, 700W, 850W models

is this the wrong page?